I would like to know the specific data points Orion uses, and the mathematical formula it applies to them, to calculate bandwidth usage, specifically bytes transmitted and bytes received.
We have an in-house MRTG system that does this now, and when I compare the numbers, Orion is consistently off by the same amount, which suggests a difference in the formula the two systems use.
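For context, my understanding is that MRTG-style pollers read the SNMP ifInOctets/ifOutOctets counters and divide the delta by the polling interval; I assume Orion does something similar. A minimal sketch of that standard calculation (the function name and example numbers are my own, not from either product):

```python
# Sketch of the standard SNMP counter-delta rate calculation used by
# MRTG-style pollers (assumption: both tools use some variant of this).
COUNTER32_MAX = 2**32  # 32-bit ifInOctets/ifOutOctets wrap at this value

def rate_bytes_per_sec(prev_octets, curr_octets, interval_sec):
    """Average bytes/sec over one polling interval, handling 32-bit wrap."""
    delta = curr_octets - prev_octets
    if delta < 0:  # counter wrapped between polls
        delta += COUNTER32_MAX
    return delta / interval_sec

# Example: 6,000,000 octets transferred over a 300-second (5-minute) poll
print(rate_bytes_per_sec(1_000_000, 7_000_000, 300))  # 20000.0 bytes/sec
# Multiplying by 8 gives bits/sec -- a units mismatch like bytes vs. bits
# would produce exactly the kind of constant offset I am seeing.
```

A consistent offset could also come from different polling intervals or from one system averaging across samples, which is why I am asking for the exact formula.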
Thanks in advance for any help with this!