I'm just trying to figure out how the data is stored in the dbo.ApplicationStatistics table for an Application.
For example, I'm wondering why there are nine entries for each time period, five of which have NULL values for CPULoad and MemoryUsed?
NodeID    AppName            AppID  DateTime                 CPULoad  MemoryUsed
BUFDISP2  Internet Explorer  291    2007-10-01 00:17:23.000  NULL     NULL
BUFDISP2  Internet Explorer  291    2007-10-01 00:18:24.000  -2       2.809037E+07
BUFDISP2  Internet Explorer  291    2007-10-01 00:18:24.000  -2       5.694259E+07
BUFDISP2  Internet Explorer  291    2007-10-01 00:18:24.000  NULL     NULL
BUFDISP2  Internet Explorer  291    2007-10-01 00:18:24.000  NULL     NULL
BUFDISP2  Internet Explorer  291    2007-10-01 00:18:24.000  -2       3.60407E+07
BUFDISP2  Internet Explorer  291    2007-10-01 00:18:24.000  -2       5.599232E+07
BUFDISP2  Internet Explorer  291    2007-10-01 00:18:24.000  NULL     NULL
BUFDISP2  Internet Explorer  291    2007-10-01 00:18:24.000  NULL     NULL
BUFDISP2  Internet Explorer  291    2007-10-01 00:18:24.000  NULL     NULL
BUFDISP2  Internet Explorer  291    2007-10-01 00:19:38.997  0        2.821734E+07
Also, I understand that the MemoryUsed values are raw, but again: are the nine values (four of which are not NULL) added together for a total?
And finally (sorry, lots of questions): the CPULoad values are really weird. Why are there -2s in there?

Any help would be greatly appreciated. I'm trying to generate reports that show all applications from a specific node over a specific time period, and I would much rather pull straight from the table than go through every single chart, import the data into Excel, and so on.
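For what it's worth, here's the kind of roll-up query I had in mind, sketched against an in-memory SQLite copy of the sample rows above (since I can't share the live Orion database here). It assumes the non-NULL MemoryUsed rows per poll time are meant to be summed, and it guesses that -2 in CPULoad is a sentinel value to be excluded; please correct me if either assumption is wrong.

```python
import sqlite3

# In-memory copy of the sample rows from my post above.
rows = [
    ("BUFDISP2", "Internet Explorer", 291, "2007-10-01 00:17:23", None, None),
    ("BUFDISP2", "Internet Explorer", 291, "2007-10-01 00:18:24", -2, 2.809037e07),
    ("BUFDISP2", "Internet Explorer", 291, "2007-10-01 00:18:24", -2, 5.694259e07),
    ("BUFDISP2", "Internet Explorer", 291, "2007-10-01 00:18:24", None, None),
    ("BUFDISP2", "Internet Explorer", 291, "2007-10-01 00:18:24", None, None),
    ("BUFDISP2", "Internet Explorer", 291, "2007-10-01 00:18:24", -2, 3.60407e07),
    ("BUFDISP2", "Internet Explorer", 291, "2007-10-01 00:18:24", -2, 5.599232e07),
    ("BUFDISP2", "Internet Explorer", 291, "2007-10-01 00:18:24", None, None),
    ("BUFDISP2", "Internet Explorer", 291, "2007-10-01 00:18:24", None, None),
    ("BUFDISP2", "Internet Explorer", 291, "2007-10-01 00:18:24", None, None),
    ("BUFDISP2", "Internet Explorer", 291, "2007-10-01 00:19:38", 0, 2.821734e07),
]

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE ApplicationStatistics
               (NodeID TEXT, AppName TEXT, AppID INT,
                DateTime TEXT, CPULoad INT, MemoryUsed REAL)""")
con.executemany("INSERT INTO ApplicationStatistics VALUES (?,?,?,?,?,?)", rows)

# One row per node/app/poll time. SUM() skips NULLs automatically;
# the CASE keeps the assumed -2 sentinel out of the CPU average.
query = """
SELECT NodeID, AppName, DateTime,
       SUM(MemoryUsed)                              AS TotalMemory,
       AVG(CASE WHEN CPULoad >= 0 THEN CPULoad END) AS AvgCPULoad
FROM ApplicationStatistics
GROUP BY NodeID, AppName, DateTime
ORDER BY DateTime
"""
results = con.execute(query).fetchall()
for row in results:
    print(row)
```

Running it collapses the nine 00:18:24 rows into one, with the four non-NULL memory values summed, which is what I'm hoping the right T-SQL against dbo.ApplicationStatistics would also do.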