How is the response time threshold determined? I'm reworking our alerting to be more dynamic by using per-node threshold values for things like packet loss and response time, so we can edit the node and change its threshold rather than creating unique alerts and excluding nodes from certain alerts.
I'm testing my response time alert, which triggers when "warning (response time threshold)" is true. I'm testing against a Linux node and using "tc qdisc ... netem delay 600ms" to simulate latency. When I ping the Linux node, it behaves as expected:
Packets: Sent = 766, Received = 766, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
Minimum = 600ms, Maximum = 604ms, Average = 600ms
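For reference, this is roughly how I set up the simulated latency on the Linux node (the interface name eth0 is an assumption; substitute your own):

```shell
# Add a fixed 600 ms delay to all egress traffic on eth0 (requires root).
tc qdisc add dev eth0 root netem delay 600ms

# Confirm the netem qdisc is active on the interface.
tc qdisc show dev eth0

# Remove the delay when testing is finished.
tc qdisc del dev eth0 root
```

Note that netem only delays egress traffic on the node, so a ping round trip picks up the delay on the reply leg, which matches the ~600 ms averages above.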
Orion is reporting the current response time as 600ms, but the threshold isn't breached. It seems as though the threshold is evaluated against the average response time rather than the current value. Is that the case?