A colocation prospect asked me how long a given burst has to last before it stops showing as a finger pointing upward on the bandwidth monitor graph and starts showing as the solid line that represents continuous throughput.
I imagine this depends on how often your system polls the switch; ours is set to whatever the default is, which seems to be every couple of minutes, but I still think it's an interesting question. Say the system polls the switch and sees that your interface is doing four megs. If it polls again a couple of minutes later and sees only half a meg, that four megs shows up on the graph as a burst. But say it still sees the four megs of traffic outbound from the customer (inbound to the switch) on the next poll, and maybe the one after that. How many consecutive polls have to show that four megs before the graph converts the burst into a solid line representing continuous throughput?
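To make the question concrete, here's a minimal sketch of how I understand an MRTG/RRD-style poller turning interface counters into graph points. The function name, the 120-second interval, and the numbers are just assumptions for illustration, not how your actual monitoring system works.

```python
# A minimal sketch (not any particular monitoring system's code) of how an
# MRTG/RRD-style poller turns SNMP interface octet counters into graph points.
# The 120-second interval is an assumption ("every couple of minutes").

POLL_INTERVAL = 120  # seconds between polls

def rates_from_polls(octet_counts, interval=POLL_INTERVAL):
    """Each poll yields ONE data point: the average bits/sec since the
    previous poll. Anything shorter than the interval gets averaged out."""
    rates = []
    for prev, curr in zip(octet_counts, octet_counts[1:]):
        delta = curr - prev                  # counter growth over the interval
        rates.append(delta * 8 / interval)   # octets -> bits per second
    return rates

# Example: a customer pushing ~4 Mb/s for three consecutive intervals,
# then dropping back to ~0.5 Mb/s.
four_megs = 4_000_000 // 8 * POLL_INTERVAL   # octets moved per interval at 4 Mb/s
half_meg  =   500_000 // 8 * POLL_INTERVAL
counters = [0]
for step in (four_megs, four_megs, four_megs, half_meg, half_meg):
    counters.append(counters[-1] + step)

for i, bps in enumerate(rates_from_polls(counters), start=1):
    print(f"poll {i}: {bps / 1e6:.1f} Mb/s")
# A single ~4 Mb/s point between lower points draws as a spike; two or more
# consecutive ~4 Mb/s points already draw as a plateau (a "solid line").
```

If that model is roughly right, the answer would seem to be that each poll is one data point, so the burst turns into a plateau as soon as two or more polls in a row see the same rate, but I'd like to hear how your system actually handles it.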
Thanks to whoever tackles this one!