24 Replies Latest reply on Oct 12, 2010 1:31 AM by ET

    NetFlow Log file Size.

      I have had to expand our database log file from 10 GB to 38 GB in the last three weeks. From what I can see, NetFlow seems to be the cause. I had to stop our NetFlow process to get Orion running stably. When I started the NetFlow process back up with the additional log file space, the app burned through the 38 GB of log file space in 30 minutes. Currently NetFlow accounts for 80 GB of our 101 GB database. I am in the process of expanding the log file space again, but I am wondering if anyone has seen this issue before. My guess is that NetFlow is wrapping up data all at once and is not committing it until it is done.
        • Re: NetFlow Log file Size.

          I have the same issue.  I have capped the log file to 30GB so we can run the Netflow service until we hit the log file limit.  Typically this gives us 30-45 mins.  I have a case open and am awaiting help.  Let me know if you find out anything and I will do the same.

          • Re: NetFlow Log file Size.

            Did you get a resolution to your problem yet? I am having the same issue and have had no luck getting it resolved.

            • Re: NetFlow Log file Size.

              Is there any current solution to this? We have this issue as well. My log file is growing out of control, and we just had a 4-hour outage on both NPM and NTA as a result. We are currently running NPM 10 and NTA 3.6.


              We have NetFlow enabled on 24 devices, with about 3 interfaces per device.

                • Re: NetFlow Log file Size.

                  Log file growth is most often caused by a very long-running transaction. To find the culprit, open SQL Server Management Studio and run this command:

                  DBCC OPENTRAN

                  If some transaction has been open for a very long time, you will see the log file grow very rapidly. Once you have the output, you can post it here and we can do some tuning.

                  Also, log file growth behavior depends on your [recovery model]; are you using the recommended Simple mode?
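
                  As a rough sketch of how to check all of this in one pass (I'm assuming your Orion database uses the default name NetPerfMon; substitute your own database name if it differs):

```sql
-- Check which recovery model the Orion database is using
-- (NetPerfMon is the default Orion database name; adjust if yours differs)
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'NetPerfMon';

-- Show transaction log size and percent used for every database
DBCC SQLPERF(LOGSPACE);

-- Show the oldest active transaction in the Orion database
USE NetPerfMon;
DBCC OPENTRAN;
```

                  If DBCC OPENTRAN reports an old active transaction while LOGSPACE shows the log filling up, that transaction is almost certainly what's pinning the log.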

                    • Re: NetFlow Log file Size.

                      We did have it set to Simple, but in order to try to keep the system running we have changed it to Full and are running tlog backups every 15 minutes just to keep the log file empty. We have also capped the log at 100 GB to prevent the corruption issue we had.


                      I will look for long open transactions.
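
                      For anyone else in the same spot, a minimal sketch of the kind of 15-minute log backup we're running (the backup path, database name NetPerfMon, and logical log file name NetPerfMon_log are assumptions; check yours with sp_helpfile):

```sql
-- Back up the transaction log so the committed portion can be reused
-- (only valid under the Full or Bulk-Logged recovery model)
BACKUP LOG NetPerfMon
TO DISK = 'D:\Backups\NetPerfMon_log.trn';

-- If the log file has already ballooned, shrink it back down
-- after the log backup has freed internal space
USE NetPerfMon;
DBCC SHRINKFILE (NetPerfMon_log, 1024);  -- target size in MB
```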

                        • Re: NetFlow Log file Size.

                          No open transactions for more than a few seconds that I have seen so far.

                            • Re: NetFlow Log file Size.

                              If you are using the Full recovery model, it's no wonder your transaction log is growing so much. NTA performs very large bulk inserts of NetFlow data, and under the Full recovery model all of those operations are fully logged, which can make the log grow.

                              You can open a support ticket, but I really think this is caused by the Full recovery model. If you want to use it, you have to take care of it (periodic backups, shrinking, and so on); it's not free, and it requires admin attention.

                              NTA does not switch to the Bulk-Logged recovery model, so if you use Full recovery, every insert really is logged, and that means a lot of records in the tlog.
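
                              If you decide Simple is the better fit for NTA's bulk-insert workload, the switch is a one-liner (NetPerfMon is assumed as the database name; keep in mind that under Simple you can only restore to your last full or differential backup, not to a point in time):

```sql
-- Switch to the Simple recovery model so the log truncates
-- automatically at each checkpoint instead of growing
ALTER DATABASE NetPerfMon SET RECOVERY SIMPLE;

-- Verify the change took effect
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'NetPerfMon';
```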