Can some expert out there explain this to me? When I pull up an Average Response Time & Packet Loss graph to show an outage and compare the 5-minute and 1-hour sample rates, the 1-hour graph "looks" like the outage occurred at the beginning of the hour.
I am not saying the graph is broken; I just need to be able to explain it to the folks who are asking. See samples:
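For what it's worth, here is a minimal sketch of what I think is going on, assuming the tool averages the raw 5-minute samples into hourly buckets and plots each bucket at its start time (the timestamps, values, and bucket convention below are my own illustration, not the tool's actual code):

```python
from datetime import datetime, timedelta

# Hypothetical 5-minute response-time samples (ms): a baseline of
# 20 ms, with an outage spike between 10:40 and 10:50, i.e. mid-hour.
samples = []
t = datetime(2024, 1, 1, 9, 0)
while t < datetime(2024, 1, 1, 12, 0):
    in_outage = datetime(2024, 1, 1, 10, 40) <= t < datetime(2024, 1, 1, 10, 50)
    samples.append((t, 2000 if in_outage else 20))
    t += timedelta(minutes=5)

# Roll the samples up into 1-hour buckets keyed by the bucket's
# *start* time -- the convention many graphing tools use when you
# switch to a coarser sample rate.
buckets = {}
for ts, value in samples:
    bucket_start = ts.replace(minute=0)
    buckets.setdefault(bucket_start, []).append(value)

for bucket_start, values in sorted(buckets.items()):
    # The entire hour's samples collapse into one point plotted at
    # the top of the hour, so the 10:00 point carries the spike even
    # though the outage actually started at 10:40.
    print(bucket_start.strftime("%H:%M"), sum(values) / len(values))
```

If that assumption is right, the outage didn't move; the hourly view just has only one data point per hour to hang it on, and that point is drawn at the beginning of the hour.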

