
How do I trim logs?

LEM has been running for a couple of years; the DB is getting full and slow.

How do I manually or automatically trim the logs to 6 months?

  • LEM has no option to define a retention period; it holds everything until the disk is full, then starts deleting the oldest data to free space as needed.  With the way the DB is structured, having a lot of old data in there shouldn't really slow it down unless you start running reports and nDepth searches that actually look back that far.

  • So the LEM violates every data retention policy ever written?  Interesting.  That puts anyone who uses it out of compliance with several regs/standards, doesn't it?

  • You can find information about retention by reporting in the LEM:

    Success Center

    There's no hard-coded number of days that LEM keeps data; it's based on disk size.

    As the disk fills, LEM will purge the older data, basically as mesverrum stated. If you need to retain more, increase the disk space allocated to the LEM. Old data will automatically be removed over time.

  • I don't understand your point here. Did you look into the LEM backup utilities? We back up daily from LEM to disk (LEM offers full and incremental backups), then NetBackup picks up the LEM data for longer-term storage on SAN disks and tapes. I wrote a PowerShell script to clean up.

    If I have to restore, I pull the data for the requested time period out of NetBackup into a secondary LEM in our test lab to perform the forensics. This is a requirement here, and I had to demonstrate the solution to move forward. We make sure we retain the backups for two years. It's actually pretty simple.

    Splunk's method is actually better: it lets you build "freezers" or something like that, so you can restore/search based on time frame. I think it also allows imports from old files. The issue for us is cost and the time it takes to search through all the old data. Splunk is also slow when searching through old data, just like LEM.  Storing all that data and bringing it in costs too much; Splunk charges by data volume, while LEM charges by number of servers, and LEM was a third of the cost. I also liked segmenting the data into a smaller LEM to pull out the data: longer to set up, but the searches were fast because of the smaller data set. It seemed cleaner, if that makes sense.

    Hope this helps.
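  • The cleanup step in the workflow above (the poster used PowerShell; that script isn't shown) amounts to deleting backup files older than the retention window once NetBackup has picked them up. Here's a minimal sketch of that idea in Python; the directory layout, file naming, and `purge_old_backups` helper are hypothetical, not part of LEM or NetBackup:

    ```python
    import os
    import time
    from pathlib import Path

    RETENTION_DAYS = 730  # two-year retention, per the workflow described above

    def purge_old_backups(backup_dir, retention_days=RETENTION_DAYS):
        """Delete files in backup_dir older than the retention window.

        Returns the list of paths that were removed.
        """
        cutoff = time.time() - retention_days * 86400
        removed = []
        for path in Path(backup_dir).iterdir():
            # Compare the file's last-modified time against the cutoff.
            if path.is_file() and path.stat().st_mtime < cutoff:
                path.unlink()
                removed.append(path)
        return removed
    ```

    You'd run something like `purge_old_backups("/backups/lem")` on a schedule after the nightly NetBackup job, so only copies past the retention period are removed from the staging disk.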