Originally posted by dperdue:
Looks like the deletes are not able to keep up with the inserts for your environment. We are looking into problems like this one.
Can you provide some details on your environment:
1) What version of SQL Server are you using?
2) Is SQL Server running on the Orion server or on a separate server?
3) What is the hardware configuration of the SQL Server?
As for manually deleting older data, you can try the following (but know that this is a one-time shrinking; the growing database is likely to return):
1) In the settings app, change the port you are listening on to some other port. This will stop new data from being added to the database while the deletion of old data continues.
2) Let the service continue to run for a while to shrink the database.
3) Set the listening port back to the original port.
David Perdue
SolarWinds Development Team
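As an alternative to the port-swap approach for one-time cleanup, old rows can also be purged directly in small batches so the transaction log doesn't balloon. This is only a hedged sketch: the table name (NetFlowDetail) and date column here are assumptions, so check your actual NTA schema before running anything like it.

```sql
-- Sketch only: NetFlowDetail and [DateTime] are assumed names,
-- not confirmed NTA schema. Verify against your database first.
-- Deleting in batches of 10,000 keeps each transaction small.
WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM NetFlowDetail
    WHERE [DateTime] < DATEADD(day, -2, GETDATE());

    IF @@ROWCOUNT = 0 BREAK;
END
```

The `DELETE TOP (n)` form works on SQL Server 2005 and later; the two-day cutoff matches the compressed-data retention setting discussed later in this thread, but adjust it to your own settings.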
I'm having the same problem. I'm evaluating NTA and am currently collecting NetFlow data from only 4 interfaces, none over 10 Mbps, and two of them typically under 300 Kbps. SQL Server 2005 Express runs on the same host as Orion and NPM, on a 2 GHz 64-bit AMD processor with 2 GB of RAM. Average disk queue length is usually less than 1, so it doesn't seem to be a disk utilization issue. It doesn't appear that NetFlow has EVER flushed data out of the database. I started my eval on 6/27, and today (7/5) I had to manually delete several million rows from the database to free up space. The data in the NetFlow tables was dated all the way back to the 27th.
What are your retention settings? By default, NetFlow keeps compressed data for 90 days. It begins compressing data after 15 minutes and deletes the uncompressed copies after 60 minutes. If that's not happening, your system probably can't keep up with the data.
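The default retention pipeline described above can be sketched as a simple age check. The thresholds below mirror the defaults quoted in this thread; the function itself is just an illustration, not NTA code.

```python
def row_state(age_minutes,
              compress_after=15,            # compression begins after 15 minutes
              drop_uncompressed=60,         # uncompressed copy deleted after 60 minutes
              drop_compressed=90 * 24 * 60):  # compressed data kept for 90 days
    """Classify a flow row by age under the default NTA retention settings."""
    if age_minutes < compress_after:
        return "uncompressed"
    if age_minutes < drop_uncompressed:
        return "uncompressed + compressed copy"
    if age_minutes < drop_compressed:
        return "compressed only"
    return "deleted"

print(row_state(10))    # uncompressed
print(row_state(30))    # uncompressed + compressed copy
print(row_state(120))   # compressed only
```

If rows older than `drop_uncompressed` are still sitting uncompressed in the database, as in the post above, the deletion/compression jobs are falling behind the insert rate.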
The system you are running on does not meet the minimum requirements for NTA:
http://www.solarwinds.com/products/orion/NetFlowSysReq.aspx
*We do not support SQL Express. It's a desktop database, and it won't scale to the needs of a NetFlow management solution.
*You need SQL Server Standard or Enterprise on a separate server with 4-8 GB of RAM
*The Orion/NTA server needs 2GB of RAM
My settings are:
Keep uncompressed data for 1 hour
Keep compressed data for 2 days
Are there any other settings?
I was going to switch to an existing enterprise SQL Server (hosted on another machine) when I go into production. I was hoping that with those light retention settings and only 4 NetFlow interfaces set up, I could get a good demo.
The lowest uncompressed setting is 16 minutes. You could try that. Uncompressed data is really the big space hog. Cutting it down from an hour should help.
However, I'm guessing it's more about CPU and memory on the server than retention settings. If the NetFlow receiver doesn't have the resources it needs to compress data, the database will grow very quickly. The NetFlow protocol produces a lot of data.
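To get a feel for why uncompressed retention dominates database size, here is a rough back-of-the-envelope estimator. The flow rate and per-row size are illustrative assumptions, not measured NTA figures.

```python
# Rough steady-state size of the uncompressed flow table.
# flows_per_sec and bytes_per_row below are illustrative guesses.

def uncompressed_bytes(flows_per_sec, bytes_per_row, retention_minutes):
    """Approximate bytes held at steady state for a given retention window."""
    return flows_per_sec * bytes_per_row * retention_minutes * 60

# Example: 500 flows/sec at ~100 bytes per stored row.
hour = uncompressed_bytes(500, 100, 60)   # 60-minute retention (default)
short = uncompressed_bytes(500, 100, 16)  # 16-minute retention (the minimum)

print(f"60 min retention: {hour / 1e6:.0f} MB")
print(f"16 min retention: {short / 1e6:.0f} MB")
```

Under these assumptions, dropping from 60 to 16 minutes cuts the uncompressed table to roughly a quarter of its size, which is why shrinking that window helps, but a receiver that can't keep up with compression will outgrow any retention setting.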