3 Replies Latest reply on Jan 5, 2009 9:02 PM by r0berth1

    Generating Reports

      Is it my install, or do others have similar problems? 


      I have attempted to generate many of the reports on the "Network Performance Monitor Reports" page.


      MOST reports NEVER appear. When I select a report to generate ("Historical NetFlow Reports", then "Top 50 Receivers - Last 24 Hours"), the website chugs along for about 5 minutes, then the "NPM Monitor Reports" page just refreshes with the report selection options again.


      ??


      Another symptom: MOST reports that do get generated take an extremely long time to generate.


      I realize what stats Orion is collecting and the amount of data being processed. Orion NPM is slow out of the box anyway, right? Under every circumstance, whether the box has just been rebooted or has been running for some time, it's "naturally" slow to respond.


      Trust me, the host that runs our install of Orion is quite an impressive box by today's standards (multiple processors, 4 GB RAM, 10K RPM drives), and SQL is installed locally, so the stats are generated from localhost (not gathered across the network).


      Any recommendations to maybe help speed this product up?

        • Re: Generating Reports
          vhcato

          Bill,

          There are many elements that play a role in performance. Here are some questions I would ask of your installation:

          1. Are you running SQL Express, Standard, or Enterprise?
          2. How many of each type of elements are you monitoring?
          3. How big is your database?
          4. What are your polling and statistics collection rates?
          5. What are your data retention values?
          6. What are your exact CPU specs (speed, number, hyper-threading)?
          7. How many drives do you have and how are they configured?
          8. What are your average disk queue lengths (normally, and during report generation)?

          Just based on the info you provided, here is what I would recommend:

          1. For databases of any real size, SQL should always be hosted on its own server. In general, the volume of data transferred from the SQL server to the server hosting the website (across the network) is very small, but the benefits of offloading the queries to a dedicated server are huge.
          2. The database server should be as beefy as possible, relative to the size of the database. The server specs you mentioned don't really sound too out of line, but again it depends on the size of the database. Memory and CPU speed are HUGE for SQL. The more/faster, the better.
          3. Perhaps you could split your reports into smaller chunks.
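
          The "smaller chunks" idea in point 3 can be sketched in code. Below is a minimal Python example using an in-memory SQLite table as a stand-in; the table and column names are purely illustrative assumptions, not Orion's actual schema. The point is that several small time-windowed queries can replace one giant 24-hour query that times out, and the per-window results can be combined afterward:

          ```python
          import sqlite3
          from datetime import datetime, timedelta

          # Tiny in-memory stand-in for a traffic-stats table.
          # (The real Orion schema differs; these names are illustrative only.)
          conn = sqlite3.connect(":memory:")
          conn.execute("CREATE TABLE traffic (ts TEXT, receiver TEXT, bytes INTEGER)")
          start = datetime(2009, 1, 1)
          rows = [((start + timedelta(minutes=30 * i)).isoformat(), f"host{i % 3}", 100 * i)
                  for i in range(48)]  # 24 hours of half-hourly samples
          conn.executemany("INSERT INTO traffic VALUES (?, ?, ?)", rows)

          def report_in_chunks(conn, start, end, chunk=timedelta(hours=6)):
              """Run one small query per time window instead of a single huge one."""
              totals = {}
              t = start
              while t < end:
                  t_next = min(t + chunk, end)
                  for receiver, total in conn.execute(
                      "SELECT receiver, SUM(bytes) FROM traffic "
                      "WHERE ts >= ? AND ts < ? GROUP BY receiver",
                      (t.isoformat(), t_next.isoformat()),
                  ):
                      totals[receiver] = totals.get(receiver, 0) + total
                  t = t_next
              return totals

          print(report_in_chunks(conn, start, start + timedelta(hours=24)))
          ```

          Each chunked query touches far fewer rows, so it is much less likely to hit the web server's query timeout; the aggregation across chunks happens cheaply in application code.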

          We too have problems with certain very large reports that don't display, but fortunately we can segregate ours sufficiently to provide the info we need. We will soon be making some significant upgrades in our environment that should help with this as well.

          As for general web performance, there has been quite a bit of discussion about it on the forum. We are still running v8.5.1 SP3 in our environment, and really have no complaints about web performance. In the past, we have seen slower performance, but this improved noticeably when we moved to our current NPM host server and SQL server hardware. I anticipate this will get even better when we make our upcoming changes.

          • Re: Generating Reports
            casonline

            I'm having the same problem on v9.2 SP2... any answer to this? I'm trying to run traffic reports on a large number of interfaces (> 900).

            • Re: Generating Reports
              r0berth1

              That usually happens when the report is too large and it times out waiting on the SQL database. You should pull only the necessary fields, then filter out the data you don't need.
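
              To illustrate the "only pull the necessary fields" point: let the database do the filtering rather than dragging every column and row back to the report engine. This is a hedged Python sketch with a hypothetical SQLite table (the real NetFlow tables have many more columns and different names):

              ```python
              import sqlite3

              # Stand-in table; names are hypothetical, not the real Orion schema.
              conn = sqlite3.connect(":memory:")
              conn.execute(
                  "CREATE TABLE flows (receiver TEXT, bytes INTEGER, "
                  "src_port INTEGER, dst_port INTEGER, protocol TEXT)"
              )
              conn.executemany(
                  "INSERT INTO flows VALUES (?, ?, ?, ?, ?)",
                  [("hostA", 500, 1024, 80, "tcp"),
                   ("hostB", 9000, 2048, 443, "tcp"),
                   ("hostA", 7000, 3000, 53, "udp")],
              )

              # Wasteful: fetch every column and row, then filter in the application.
              everything = conn.execute("SELECT * FROM flows").fetchall()
              app_filtered = [(r[0], r[1]) for r in everything if r[1] > 1000]

              # Better: ask only for the fields and rows the report actually needs,
              # so the database returns a fraction of the data.
              big_flows = conn.execute(
                  "SELECT receiver, bytes FROM flows WHERE bytes > 1000"
              ).fetchall()
              ```

              Both produce the same report rows, but the second approach moves far less data and gives the query optimizer a chance to use indexes, which is what keeps a large report from timing out.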

              And to answer one of your questions: I have also found Orion to be slow out of the box. It takes a lot of tweaking to get it running at even a slightly manageable rate. The problem is knowing how to tune the network connection for maximum throughput and the SQL server for maximum speed, and finding all of that information in one place. You just have to play with it, and stop when it's close to what you want.

              With that many elements, you will want to move your SQL install to a separate server and follow (or exceed) the recommended requirements for that version of SQL. Also, if both servers are connected to, say, a gigabit Ethernet switch (especially a Cisco, which likes to default to 10/half if both sides are not hardcoded to the same speed and duplex settings) and you have gigabit cards in both servers, hardcode the switch ports for those servers to 1000/full and match those settings on the servers' network cards. Be sure to turn off power management on the network cards as well.
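
              As a rough sketch of the switch-side hardcoding, the commands on a Cisco Catalyst port look something like the fragment below. The interface name and description are hypothetical, and the exact syntax varies by IOS version and switch model, so treat this as a starting point and match the same speed/duplex on the server NICs:

              ```
              ! Hypothetical Catalyst port config for the Orion server uplink.
              ! Hardcode both ends to 1000/full so neither side falls back to 10/half.
              interface GigabitEthernet0/1
               description Orion NPM server
               speed 1000
               duplex full
              ```

              If either end is left on auto while the other is forced, a duplex mismatch is likely, and the resulting late collisions and retransmits can masquerade as "slow SQL" during report generation.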