
How Does Orion Calculate Bandwidth?

I would like to know the specific data points Orion uses and then the mathematical formula used on them to calculate bandwidth usage, specifically bytes transferred and bytes received.

We have an MRTG system in house that does this now, and when comparing the numbers, Orion is consistently off by the same amount, suggesting a difference in the mathematical formula used between the two systems.

Thanks in advance for any help with this!

  • I would like to give an overview of how bandwidth usage is calculated.

    Most of the data stored in the NPM database comes in via SNMP Get requests for the OIDs a given device makes available. Among these are OIDs for inbound and outbound traffic, which NPM collects each time it polls the device via SNMP Get; the collected data is then stored in the database and made available in the GUI. (A sketch of what such a poll looks like follows below.)

    So I would like more details about the device, and specifically which table or graph you are referring to, so that I can explain in more detail.
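    As a rough sketch of the kind of poll involved (not Orion's actual implementation; the host, community string, and interface index below are placeholders), the standard counters live at IF-MIB::ifInOctets and IF-MIB::ifOutOctets, with 64-bit variants at ifHCInOctets and ifHCOutOctets for high-speed interfaces:

    ```python
    # Minimal sketch: read the standard IF-MIB octet counters with net-snmp's
    # snmpget CLI. Host, community, and ifIndex are placeholders, not Orion
    # settings.
    import subprocess

    HOST = "192.0.2.1"      # hypothetical device
    COMMUNITY = "public"    # hypothetical community string
    IF_INDEX = 1            # interface to poll

    OIDS = {
        "ifInOctets":  f".1.3.6.1.2.1.2.2.1.10.{IF_INDEX}",  # 32-bit in counter
        "ifOutOctets": f".1.3.6.1.2.1.2.2.1.16.{IF_INDEX}",  # 32-bit out counter
    }

    for name, oid in OIDS.items():
        result = subprocess.run(
            ["snmpget", "-v2c", "-c", COMMUNITY, "-Oqv", HOST, oid],
            capture_output=True, text=True, check=True,
        )
        print(name, result.stdout.strip())
    ```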

  • I am fairly sure this isn't accurate, and it also lacks the details that I need.  I understand how the SNMP data collection process works; I need the specifics...

    Which specific OIDs are being collected?

    Which specific mathematical calculations are being performed on those data points to get bytes transferred?

    Please let me know if this is something that I should open a support ticket on.

  • Hi Byron,

    It looks like you are interested in how we sum ifInOctets and ifOutOctets to total bytes transferred by time period.

    I'll chase that down. 
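    For reference, the textbook way to turn two successive counter samples into a byte count for the interval (a generic sketch, not a statement of Orion's internal code) is to difference them, allowing for 32-bit counter wrap:

    ```python
    # Generic sketch: bytes moved between two polls of a 32-bit octet counter.
    COUNTER32_MAX = 2**32

    def octet_delta(previous: int, current: int) -> int:
        """Bytes counted between two samples, handling a single counter wrap."""
        if current >= previous:
            return current - previous
        return (COUNTER32_MAX - previous) + current  # counter rolled over

    # Example: the counter wrapped between polls
    print(octet_delta(4_294_900_000, 100_000))  # 167,296 bytes
    ```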



  • Exactly what I am looking for!

    I assume from this that the OIDs you start with are ifInOctets and ifOutOctets?  I had a pretty good idea that is what you would use, but I wanted to make sure and then get the mathematical process that follows.  I am trying to clear up a discrepancy between our MRTG system and Orion so I can eventually consider using Orion to generate bandwidth billing reports.

    Thanks for looking into this for me, it's very much appreciated!

  • There was a typo in your original post. Of course what you meant to type is that MRTG is consistently off. ;)

    For total bytes per time period we simply sum the octet counter data. That is good data for billing, assuming the interface is segregated to the billed party.
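    Concretely, "sum the octet counter data" for a period would look something like this sketch (the sample values are invented, and wrap handling is omitted for brevity):

    ```python
    # Sketch: total bytes for a period as the sum of per-poll counter deltas
    # of ifInOctets plus ifOutOctets. Sample values are invented.
    in_samples  = [1_000, 51_000, 120_000, 200_000]   # ifInOctets at each poll
    out_samples = [5_000, 15_000,  40_000,  90_000]   # ifOutOctets at each poll

    def summed_deltas(samples):
        return sum(b - a for a, b in zip(samples, samples[1:]))

    total = summed_deltas(in_samples) + summed_deltas(out_samples)
    print(total)  # 199,000 + 85,000 = 284,000 bytes for the period
    ```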



  • In my defense, I said "off" not "wrong".  = )

    You sum the octet counter data, but there has to be some math performed on it to get bytes, kbytes, mbytes, gbytes, etc.  What does that math look like?

  • Ah - the 1000 v 1024 factor. I'll find out.
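    For anyone following along, the two conventions diverge by about 2.4% per unit step, which is the kind of consistent offset described above; a quick illustration (generic arithmetic, not confirmed Orion behavior):

    ```python
    # The same raw byte count under decimal (1000) and binary (1024) units.
    raw_bytes = 1_234_567_890

    print(raw_bytes / 1000**3)  # ~1.2346 "GB"  (decimal kilo = 1000)
    print(raw_bytes / 1024**3)  # ~1.1498 "GB"  (binary kilo = 1024)
    # The gap compounds: ~2.4% at KB, ~4.9% at MB, ~7.4% at GB.
    ```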



  • It's like you can read my mind!  Or you have just done this before.  = )

    Yeah, that's what I suspect is the culprit for the difference I am seeing.  Having the exact math calculations Orion is using will clear it all up for me.

    Thanks again Andy!

  • Is the difference divisible by 48576?
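    (That figure looks like the binary-versus-decimal megabyte gap: 2^20 - 10^6 = 1,048,576 - 1,000,000 = 48,576.)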

  • I guess what I would like to see is the actual formula Orion is using, starting with the actual data point that is collected, all the way to kbytes of transfer for a given period.
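    To make that concrete, the requested formula would have roughly this shape (a sketch only; the wrap handling and the 1024 divisor are assumptions to be confirmed, not Orion's verified behavior):

    ```python
    # End-to-end sketch of the requested pipeline: two counter samples in,
    # kbytes out. The 1024 divisor is an assumption to verify against Orion.
    COUNTER32_MAX = 2**32

    def kbytes_transferred(in_prev, in_now, out_prev, out_now):
        def delta(prev, now):
            return now - prev if now >= prev else (COUNTER32_MAX - prev) + now
        total_octets = delta(in_prev, in_now) + delta(out_prev, out_now)
        return total_octets / 1024  # or / 1000 if decimal units are in play

    print(kbytes_transferred(10_000, 522_400, 4_294_960_000, 40_000))
    # in: 512,400 octets; out: 47,296 octets -> 559,696 / 1024 = 546.578125
    ```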