I am an Orion user, but I have been talking to a net admin bud of mine who is considering SW products. I think that ipMonitor is a better fit for him due to network size and the fact that he does not need to keep historical data.
That said, I don't know enough about ipMonitor from a theory standpoint to be able to talk about server requirements (CPU speed and number, amount of RAM needed, drive space and configuration options, etc.) and other planning aspects that one should consider prior to purchasing and installing ipMonitor. For instance, should one be concerned about the effects of the number of configured monitors on server resources and network bandwidth?
Does SW have a guidance document that details the essential questions and issues one should consider before installing and configuring ipMonitor?
Any help would be appreciated.
FYI - I'm not sure if this helps answer the original question, but I have 1200 monitors with most at 30 second polling (though some at 60 second) and I get a new 22MB PLOG file in dbstore each day.
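For anyone sizing disk space, the numbers above work out to a surprisingly small footprint. Here's a quick back-of-envelope calculation (the retention window is just an example, not official sizing guidance):

```python
# Storage estimate based on the figures reported above:
# ~22 MB of new PLOG data per day in dbstore.

PLOG_MB_PER_DAY = 22    # observed daily dbstore growth
DAYS_RETAINED = 365     # hypothetical one-year retention window

yearly_gb = PLOG_MB_PER_DAY * DAYS_RETAINED / 1024
print(f"~{yearly_gb:.1f} GB per year of history")  # ~7.8 GB
```

So even several years of history stays well under typical drive capacities.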
ipMonitor handles the deletion of a PLOG file really well. If you delete a PLOG file, the history for that day just disappears :). I have not noticed any glitches in monitoring, reporting, or anything else as a result of deleting the 30+ files that ipMonitor collected before it became one of our live production systems.
Thanks for the great question, freeman!
We do have system requirements available in the Admin Guide and on our website. Here is a link to our web page that describes these things.
As for bandwidth consumption, this is not a simple metric to calculate because every client uses the software differently in their own environment. For example, someone using ipMonitor to test simple availability (ping) of 5000 nodes on their network will see significantly less network overhead than someone who is monitoring 300 SQL servers. The amount of network traffic generated depends on the monitor types and on how frequently each monitor is set to perform its test. The default is 300 seconds between tests, but you can set it as aggressively as 1 second (however, this is not recommended).
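To make that concrete, you can roughly estimate the aggregate check rate your server and network will see from your own monitor mix. The counts and intervals below are hypothetical examples, not measurements from ipMonitor:

```python
# Rough poll-rate estimate for reasoning about network and server load.
# Each entry maps a monitor type to (count, polling interval in seconds).
monitors = {
    "ping": (5000, 300),  # lightweight availability checks, default 300 s interval
    "sql":  (300, 60),    # heavier checks, polled more frequently
}

total_checks_per_sec = sum(count / interval for count, interval in monitors.values())
print(f"~{total_checks_per_sec:.1f} checks/sec across all monitors")  # ~21.7
```

Multiplying the per-check payload size (a ping is tiny; a SQL test is not) by that rate gives a first-order bandwidth estimate, which is why shortening intervals or adding heavy monitor types raises overhead quickly.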
I see some data files in these folders with TLOG and PLOG extensions. What, if anything, can open and read these files?
As far as data retention and deletion are concerned, does one simply need to delete these files at some arbitrary size limit? Or, is there a more formal procedure for maintaining these data files?
Every file within the \dbstore\ directory contains the statistics for all monitors for a specific day. To keep these files remarkably small, they are written in a proprietary binary format that cannot be read or understood by anything other than ipMonitor. Since you can store many years of statistics without requiring any significant drive space (unless you are running this on a computer made in 1999), the need for maintenance is simply not there.
Hope this answers your question.
Here is a link to the FAQ:
This should address most of your questions. There are also links for system requirements, as well as features, and an overview.