In conjunction with SANS, SolarWinds conducted a survey of IT professionals on the impact of security threats and the use of security analytics and intelligence to resolve those threats.  We isolated the 120 government responses to get a sense of how analytics and intelligence are helping with the ever-increasing security challenges in the federal space.


Across the responses, the common thread was uncertainty. From the budget for “information security management, compliance and response” (44 percent answered “unknown”), to the number of attacks experienced, to what constitutes normal system behavior, to the roles needed in the organization, what respondents most agreed on was how much they did not know.


What they do know is that security events happen. About 43 percent reported that in the past two years, their organizations experienced one or more attacks that were difficult to detect. Another 28 percent answered “unknown” to this question, continuing our theme of uncertainty.


Documented attacks take, on average, one week to detect. The three greatest barriers to discovering these attacks fall into the “we don’t know what we don’t know” category:

  • Lack of system and vulnerability awareness
  • Not collecting appropriate operational and security data
  • Lack of context to observe “normal behavior”


So, how is this problem overcome? With data, of course! The data used most frequently in the federal space to investigate security issues are:

  • Log data from networks and servers
  • Network monitoring data
  • Access data from applications and access control systems


In the next 12 months, respondents say they plan to begin using the following reporting data to enhance their security monitoring:

  • Monitoring and exception data pertaining to internal virtual and cloud environments
  • Access data from applications and access control systems
  • Security assessment data from endpoint, application and server monitoring tools


But as we all know, the more data you collect, the harder it becomes to manage and make sense of it all. For that data to be effective, some level of analytics is needed. Respondents are nearly evenly split between those who correlate threat data using internally developed methods and those who do not correlate log data with external threat intelligence tools at all (43 percent and 42 percent, respectively). Among those using analytics and analytic tools, the majority reported that their biggest weakness was determining a baseline and measuring against it.
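The kind of log/threat-intelligence correlation respondents describe can be pictured in a few lines. Everything below is illustrative: the log format, the indicator set and the `correlate` helper are assumptions for this sketch, not any particular vendor's feed or API.

```python
# A minimal sketch of correlating log data with a threat-intelligence
# indicator list. Real deployments would consume a structured feed
# (e.g., STIX/TAXII) and parse real log formats; these are invented.
import re

# Hypothetical indicator set, as a threat feed might supply
known_bad_ips = {"203.0.113.7", "198.51.100.23"}

# Invented firewall-style log lines for illustration
log_lines = [
    "2024-05-01T10:02:11 ACCEPT src=203.0.113.7 dst=10.0.0.5 port=443",
    "2024-05-01T10:02:12 ACCEPT src=10.0.0.8 dst=10.0.0.5 port=22",
]

IP_RE = re.compile(r"src=(\d{1,3}(?:\.\d{1,3}){3})")

def correlate(lines, indicators):
    """Return log lines whose source IP matches a known-bad indicator."""
    hits = []
    for line in lines:
        match = IP_RE.search(line)
        if match and match.group(1) in indicators:
            hits.append(line)
    return hits

matches = correlate(log_lines, known_bad_ips)
```

Even this toy version shows the shape of the problem: the correlation itself is trivial, while keeping the indicator set current and the log parsing accurate is where the real work lies.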


What does this all mean? To get a handle on security threats, organizations must focus not only on analyzing outliers, but on defining what the normal range should look like. Establishing that baseline with monitoring tools, and putting effort into correlating historical data with threat information, will reduce uncertainty and pay dividends in spotting security events more quickly.
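As one way to picture what “determining a baseline” means in practice, here is a minimal sketch that derives a three-sigma threshold from historical event counts. The hourly counts and the `is_anomalous` helper are invented for illustration; real analytics tools use far richer features and models.

```python
# A minimal baseline sketch: model "normal" as mean +/- spread of
# historical hourly event counts, then flag hours that exceed it.
import statistics

# Invented hourly event counts standing in for historical monitoring data
historical_counts = [120, 115, 130, 125, 118, 122, 127, 119]

mean = statistics.mean(historical_counts)
stdev = statistics.stdev(historical_counts)
threshold = mean + 3 * stdev  # simple three-sigma baseline

def is_anomalous(count):
    """Flag an hourly count that exceeds the three-sigma baseline."""
    return count > threshold

spike_flagged = is_anomalous(400)    # a sudden spike stands out
typical_flagged = is_anomalous(124)  # a typical hour does not
```

The point of the sketch is the order of operations: normal behavior is characterized first, from historical data, and only then do outliers become meaningful.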


Full public sector survey results are available by request.