My director wants a report that gives us the amount of time each branch location spent above 80% bandwidth utilization for the previous month. Honestly, right now I'd settle for being able to give him the duration of an individual event. I know how to get the min/max/average utilization for an interface, but I can't figure out the duration. As far as I can see, there's absolutely NO way to get this out of Report Writer or the Web Reporting Interface. I tried creating a report in Report Writer that pulls every 5000 or 5001 event within my chosen time span, and I can even get the report ordered so that each Alert Trigger is followed by its associated Alert Reset.
Here's where I lose it. I'm not a programmer at all. I know it should be possible to calculate the difference between the DATETIME stamps for each alert pair, and then add up those differences across all of a given node's pairs to get a total for that time period. It seems so simple on the surface; I just can't figure out how to do it. At this point, I'd be pretty happy if I could get a report that, for a single specified node, gives me the duration between each high-bandwidth-utilization trigger and its associated reset. Actually, I'd be happy if I could get that for a single event. If anyone can lead me in the right direction on this, I'd appreciate it.
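For what it's worth, here is a minimal sketch of the pairing-and-summing logic described above, written in Python rather than Report Writer. It assumes the report has been exported as rows of (node, event type, timestamp), where 5000 marks an Alert Trigger and 5001 the matching Alert Reset; the node names and timestamps below are made up for illustration. It walks the events in time order, remembers each node's open trigger, and adds the trigger-to-reset gap to that node's running total.

```python
from datetime import datetime

# Hypothetical rows exported from the Report Writer report:
# (node, event_type, timestamp); 5000 = Alert Trigger, 5001 = Alert Reset.
rows = [
    ("Branch-A", 5000, "2023-06-01 09:00:00"),
    ("Branch-A", 5001, "2023-06-01 09:45:00"),
    ("Branch-B", 5000, "2023-06-01 10:15:00"),
    ("Branch-B", 5001, "2023-06-01 10:20:00"),
    ("Branch-A", 5000, "2023-06-02 14:00:00"),
    ("Branch-A", 5001, "2023-06-02 14:30:00"),
]

def total_duration_per_node(rows):
    """Sum the trigger-to-reset duration, in seconds, for each node."""
    open_trigger = {}  # node -> timestamp of its unmatched 5000 event
    totals = {}        # node -> accumulated seconds above threshold
    for node, event_type, stamp in sorted(rows, key=lambda r: r[2]):
        when = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
        if event_type == 5000:
            # Remember the trigger until its reset shows up.
            open_trigger[node] = when
        elif event_type == 5001 and node in open_trigger:
            # Pair the reset with the open trigger and accumulate the gap.
            gap = (when - open_trigger.pop(node)).total_seconds()
            totals[node] = totals.get(node, 0.0) + gap
    return totals

print(total_duration_per_node(rows))
# Branch-A: 45 min + 30 min = 4500 s; Branch-B: 5 min = 300 s
```

A trigger with no reset inside the chosen time span is simply left out of the total here; a real report would need a policy for that (e.g. clip it to the end of the month). The same pairing could also be done in SQL directly against the events table, but the logic is the same either way.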
I've attached the report from Report Writer just to show the information I'm dealing with.