7 Replies Latest reply on Jul 5, 2007 5:33 PM by denny.lecompte

    Database Retention

      I’m currently evaluating NetFlow Analyzer with Orion v8.  I’ve set the retention to 2 days, but it seems older data is not being deleted.  The database is still growing rapidly – my (remote) DB server is almost out of disk space.  Can I delete part of the database manually?

      Any suggestions would be greatly appreciated.

      Regards,
      CP
        • Re: Database Retention
          Looks like the deletes are not able to keep up with the inserts for your environment.  We are looking into problems like this one. 

          Can you provide some details on your environment:
          1) What version of SQL Server are you using?
          2) Is SQL Server running on the Orion server or on a separate server?
          3) What is the hardware configuration of the SQL Server?

          As for manually deleting older data, you can try the following (but note that this is a one-time shrink; the database growth is likely to return):
          1) In the settings app, change the listening port to some other port. This stops new data from being added to the database while the deletion of old data continues.
          2) Let the service run for a little while so the database can shrink.
          3) Set the listening port back to the original port.
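
          The idea behind the retention job is a rolling purge of rows older than the cutoff. As a general illustration only (NTA's real schema is not shown in this thread, so the "flows" table and "observed_at" column below are hypothetical, and SQLite stands in for SQL Server), a batched retention delete might look like:

```python
import sqlite3
import time

# Hypothetical stand-in schema: real NTA table/column names differ.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE flows (id INTEGER PRIMARY KEY, observed_at REAL, bytes INTEGER)"
)

now = time.time()
# Insert 3 days of sample rows, one per hour (72 rows).
conn.executemany(
    "INSERT INTO flows (observed_at, bytes) VALUES (?, ?)",
    [(now - h * 3600, 1000) for h in range(72)],
)

retention_days = 2
cutoff = now - retention_days * 86400

# Delete in small batches so the purge does not hold long locks
# while new flow inserts are still arriving.
while True:
    cur = conn.execute(
        "DELETE FROM flows WHERE id IN "
        "(SELECT id FROM flows WHERE observed_at < ? LIMIT 10)",
        (cutoff,),
    )
    conn.commit()
    if cur.rowcount == 0:
        break

remaining = conn.execute("SELECT COUNT(*) FROM flows").fetchone()[0]
print(remaining)  # only rows at or newer than the 2-day cutoff survive
```

          If inserts arrive faster than batches like these can delete, the table grows without bound, which matches the symptom described above.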



          David Perdue
          SolarWinds Development Team
          • Re: Database Retention
            quote:Originally posted by dperdue

            Looks like the deletes are not able to keep up with the inserts for your environment.  We are looking into problems like this one. 

            Can you provide some details on your environment:
            1) What version of SQL Server are you using?
            2) Is SQL Server running on the Orion server or on a separate server?
            3) What is the hardware configuration of the SQL Server?

            As for manually deleting older data, you can try the following (but note that this is a one-time shrink; the database growth is likely to return):
            1) In the settings app, change the listening port to some other port. This stops new data from being added to the database while the deletion of old data continues.
            2) Let the service run for a little while so the database can shrink.
            3) Set the listening port back to the original port.



            David Perdue
            SolarWinds Development Team


            Thanks David, here are the answers to the above questions.

            1. SQL 2005
            2. On a separate server
            3. Because it's a test box, it does not quite meet the requirements recommended by SolarWinds - 2 GHz CPU, 3 GB RAM


            CP

            • Re: Database Retention
              Thanks Cherrypicker.  Some more questions...

              On your SQL Server, can you run perfmon (Start->Run->perfmon) and watch the Avg. Disk Queue Length counter?  Let it run for a minute.  What is it averaging?

              You have retention set at 2 days.  How many days' worth of data are you seeing in the DB?

              You say you are running out of disk space.  How big is the DB?  How much space do you have free on the DB server?
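
              For anyone wanting to answer those last two questions quickly, a small script can report the data-file size and the free space on its volume. This is only a sketch: the .mdf path below is a made-up placeholder (here a temp file is created so the example runs), so substitute the actual location of your NetFlow database files.

```python
import os
import shutil
import tempfile

# Hypothetical path: replace with your real .mdf/.ldf location.
# A 1 MiB temp file stands in for a real data file so this runs as-is.
db_path = os.path.join(tempfile.gettempdir(), "netflow_sample.mdf")
with open(db_path, "wb") as f:
    f.write(b"\0" * 1024 * 1024)

db_size = os.path.getsize(db_path)               # size of the data file
usage = shutil.disk_usage(os.path.dirname(db_path))  # volume totals

print(f"DB file size: {db_size / 1e6:.1f} MB")
print(f"Free on volume: {usage.free / 1e9:.1f} GB")
```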

              Thanks,

              David Perdue
              SolarWinds Development Team
                • Re: Database Retention

                   I'm having the same problem. I'm evaluating NTA and am currently collecting NetFlow data from only 4 interfaces, none over 10 Mbps, and two of them typically under 300 Kbps. SQL Server 2005 Express is on the same host as Orion and NPM, running on a 64-bit 2 GHz AMD processor with 2 GB of RAM. Avg. Disk Queue Length is usually less than 1, so it doesn't seem like a disk utilization issue.

                   It doesn't appear that NetFlow has EVER flushed data out of the database. I started my eval on 6/27, and today (7/5) I had to manually delete several million rows in the database to free up some space. The data in the NetFlow tables was dated all the way back to the 27th.

                    • Re: Database Retention
                      denny.lecompte

                       What are your retention settings?   NetFlow will keep compressed data for 90 days by default.  It will keep uncompressed data for 60 minutes by default, and will begin compressing data after 15 minutes.  It will delete the uncompressed data after 60 minutes.  If that's not happening, then your system probably can't keep up with the data. 

                      The system you are running on does not meet the minimum requirements for NTA.

                      http://www.solarwinds.com/products/orion/NetFlowSysReq.aspx

                       *We do not support SQL Express.  It's a desktop database, and it won't scale to the needs of a NetFlow management solution.

                      *You need SQL Server Standard or Enterprise on a separate server with 4-8 GB of RAM

                      *The Orion/NTA server needs 2GB of RAM

                        • Re: Database Retention

                          My settings are: 

                          Keep uncompressed data for 1 hour

                          Keep compressed data for 2 days

                          Are there any other settings? 

                          I was going to switch to an existing enterprise SQL Server (hosted on another machine) when I go into production. I was hoping that with those light retention settings and only 4 NetFlow interfaces set up, I could get a good demo. 

                            • Re: Database Retention
                              denny.lecompte

                              The lowest uncompressed setting is 16 minutes.  You could try that.  Uncompressed data is really the big space hog, so cutting it down from an hour should help.

                              However, I'm guessing that it's more about CPU and memory on the server than retention settings.  If the NetFlow receiver doesn't have the resources it needs to execute compression, then the database will grow very quickly.  The NetFlow protocol produces a lot of data.
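
                              To see why uncompressed flow data is the space hog, here is a toy roll-up. NTA's actual compression scheme is not described in this thread; the aggregation key below (per-interface totals) and the record fields are assumptions chosen only to show the general idea of trading per-flow detail for summarized rows.

```python
from collections import defaultdict

# Toy raw flow records: (interface, endpoint, bytes). Real NetFlow
# records carry many more fields (src/dst IPs, ports, protocol,
# timestamps), which is why uncompressed tables grow so fast.
raw_flows = [("Fa0/1", f"10.0.0.{i % 20}", 1500) for i in range(10_000)]

# "Compression" here means rolling raw flows up into per-interface
# totals, discarding per-flow detail; this stands in for whatever
# aggregation the real product performs.
summary = defaultdict(lambda: {"flows": 0, "bytes": 0})
for iface, _endpoint, nbytes in raw_flows:
    summary[iface]["flows"] += 1
    summary[iface]["bytes"] += nbytes

print(len(raw_flows))  # 10000 raw rows before roll-up
print(len(summary))    # 1 summarized row after roll-up
```

                              If the receiver never has the CPU headroom to run a step like this, the raw rows simply accumulate, which matches the runaway growth reported above.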