Are you interested in improved support for log file monitoring?

Hi All -

The SAM team here at SolarWinds has been thinking about what we can do to provide our users more support for monitoring log files.  If this is something you are interested in, we here on the UX team would love to spend 30-60 minutes talking with you.  We want to understand your needs around monitoring log files and how you are attempting to accomplish this today.

Beyond having the opportunity to contribute to the direction we head in with log file monitoring, we are also giving participants 2000 thwack points.

If you are interested in participating in a session, please email me directly at

We're looking forward to your input!

28 Replies

What kind of log files are you talking about? I'd be interested in "syslog"-oriented discussions, but not Windows event logs and such. Guessing since it's the SAM team, it's the latter?


This is actually unrelated to both syslog and Windows Event Logs. This is in reference to text log files typically generated by applications and the operating system.

Count me in !!!

This is one of our big choke points for application specific logfiles.

Challenges we have are:

1) dates in file name or directory path

2) PIDs in file name

3) watching for multiple strings in same log file

4) duplicating all of the above for 3-8 other log file instances on the same server, and then again across 3-6 other servers

5) matching multiple lines (think java logs) for a single event

6) x number of specific events in a sliding window of time

7) correlating log events across 2 or more servers
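Item 6 above, "x events in a sliding window," is worth making concrete. A minimal sketch of one way it could work (the names and threshold here are illustrative assumptions, not an existing SAM feature):

```python
# Hypothetical sketch: alert when at least THRESHOLD matching events
# arrive within a sliding WINDOW of time.
from collections import deque
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)   # assumed window size
THRESHOLD = 3                   # assumed event count

def make_window_checker(window=WINDOW, threshold=THRESHOLD):
    timestamps = deque()
    def seen(event_time):
        timestamps.append(event_time)
        # Evict events that have aged out of the window
        while timestamps and event_time - timestamps[0] > window:
            timestamps.popleft()
        return len(timestamps) >= threshold
    return seen

check = make_window_checker()
t0 = datetime(2014, 1, 1, 12, 0, 0)
print(check(t0))                          # False: 1 event in window
print(check(t0 + timedelta(minutes=1)))   # False: 2 events in window
print(check(t0 + timedelta(minutes=2)))   # True: 3 events within 5 minutes
```

A deque keeps the eviction step cheap, since only the oldest timestamps ever need to be dropped.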


need regex in filename/path definition

need regex in search string definition

Then we get to talk about UNIX (Linux, AIX, HP-UX, Solaris, etc.)
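To illustrate the "regex in filename/path definition" need: date- and PID-stamped file names (challenges 1 and 2) can't be picked up with a fixed path, but a pattern match handles them. A quick sketch, with an assumed naming convention:

```python
# Hypothetical sketch: match log files whose names embed a date and an
# optional PID, e.g. app-2014-03-01.log or app-2014-03-01-4812.log.
# The pattern is illustrative, not taken from any real application.
import re
from pathlib import Path

LOG_NAME = re.compile(r"^app-\d{4}-\d{2}-\d{2}(-\d+)?\.log$")

def matching_logs(root):
    """Walk a directory tree and return files whose names fit the pattern."""
    return [p for p in Path(root).rglob("*.log") if LOG_NAME.match(p.name)]

# Demo against plain names (no filesystem needed):
names = ["app-2014-03-01.log", "app-2014-03-01-4812.log", "other.log"]
matched = [n for n in names if LOG_NAME.match(n)]
print(matched)  # ['app-2014-03-01.log', 'app-2014-03-01-4812.log']
```

The same idea covers regex in the directory path: filter `p.as_posix()` instead of `p.name`.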


- Find the record with string a, but ignore it if string b is also in the record

- Be able to define an identifier string in the records that can be used for grouping multiple records together. That way, if we find multiple search strings that have the same identifier, we only generate a single alert instead of one for each string that was detected.
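Those two rules could be sketched together. This is only an illustration of the logic, with an assumed bracketed-token identifier format; the real identifier "column" would be user-defined:

```python
# Hypothetical sketch of the two rules above: flag records containing
# string_a unless string_b is also present, and collapse hits sharing an
# identifier (assumed here to be the first [bracketed] token) into one alert.
import re

ID_RE = re.compile(r"\[(\w+)\]")  # assumed identifier format

def alerts(records, string_a="ERROR", string_b="retrying"):
    seen_ids = set()
    out = []
    for rec in records:
        # Rule 1: match string_a, but ignore the record if string_b is present
        if string_a in rec and string_b not in rec:
            m = ID_RE.search(rec)
            ident = m.group(1) if m else rec
            # Rule 2: only one alert per identifier
            if ident not in seen_ids:
                seen_ids.add(ident)
                out.append(rec)
    return out

records = [
    "[tx42] ERROR connect failed",
    "[tx42] ERROR connect failed again",  # same id -> suppressed
    "[tx43] ERROR timeout, retrying",     # string_b present -> ignored
]
print(alerts(records))  # ['[tx42] ERROR connect failed']
```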

There are more.


I'm in!!!! and excited about it!!!!

We have a lot of systems with custom log files that need to be watched. Today we use Perl to manipulate these, but it is not the desired procedure. Tivoli handled this fairly well, and it's a big item we are now missing. The ability to "pull" log files is a large need for us, especially since I am a pathetic Perl script writer.

A static log file should be fairly easy to monitor, but I think the hard one would be a log file that is named based on the date/time it was created.  These are fairly common and I will look for some examples between now and when we get a chance to talk.
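For the date-named case mentioned above, resolving "today's" file is one common approach. A minimal sketch, assuming a name template with an embedded date (the template is an illustration, not from any real app):

```python
# Hypothetical sketch: build the current log file name when the name
# embeds the creation date, e.g. app-20140301.log.
from datetime import date

def todays_log(template="app-{d:%Y%m%d}.log", d=None):
    """Fill the assumed name template with today's date (or a given one)."""
    return template.format(d=d or date.today())

print(todays_log(d=date(2014, 3, 1)))  # app-20140301.log
```

A monitor would re-resolve the name on each poll so it rolls over to the new file at midnight.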

LOVE how you guys are always pushing forward!!!

Also, thanks aLTeReGo for calling me out; I get busy and miss a lot of threads out there....


We don't use SAM - it's too expensive - but one thing I imagine would be useful (if it doesn't already have the ability) is being able to monitor any files, including new files with unknown names matching a file mask like *mask*.* (or a regex?) in a folder and its subfolders. Monitoring log files over a UNC path would be extremely inefficient, so this kind of thing would need a remote agent. The criteria for matching lines should also be extremely flexible: you should be able to specify variables (current day/time), and perhaps a specific "column" within the log file that contains an ID for a "group" of events - some apps log a single event across multiple lines that share a unique identifier, with the bits of interest spread over those lines.

... and since a remote agent is involved, such a system would probably need to be extensible to accommodate future "plugins" using a standard (secure) deployment and update system. Basically a better version of Nagios's NSClient++, but perhaps designed to be usable by many different SolarWinds products in the future.

I just responded to the email address Susan provided; we are absolutely interested in this!
