
Crowdsourced Incident Monitoring?

Level 11

In previous discussions about increasing the effectiveness of monitoring, it has been pointed out that having more eyes on the data will yield more insight into the problem. While this is true, part of the point of SIEM is to have more automated processes correlating the data so that the expense of additional human observation can be avoided. Still, no automated process quite measures up to the more flexible and intuitive observations of human beings. What if we looked at a hybrid approach that didn't require hiring additional security analysts?

In addition to the SIEM’s analytics, what if we took the access particulars of each user in a group and published a daily summary of what (generally) was accessed, where from, when and for how long to their immediate team and management? Such a summary could have a mechanism to anonymously flag access anomalies. In addition, the flagging system could allow an optional comment explaining why the event is seen as abnormal, e.g. "John was with me at lunch at the time and couldn’t have accessed anything from his workstation."
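As a rough sketch only (not a design from the original proposal), a daily summary like this could be assembled from normalized access records exported from the SIEM; the record fields, team list and flag_anomaly helper below are hypothetical.

from collections import defaultdict
from datetime import date

# Hypothetical normalized access records, e.g. exported from a SIEM.
# Each record: user, resource, source_ip, start (a datetime), duration_minutes.
def build_daily_summary(records, team_members, day: date):
    """Summarize what each team member accessed, from where, when and for how long."""
    summary = defaultdict(list)
    for r in records:
        if r["user"] in team_members and r["start"].date() == day:
            summary[r["user"]].append({
                "resource": r["resource"],          # generalized, e.g. "HR file share"
                "from": r["source_ip"],
                "when": r["start"].strftime("%H:%M"),
                "minutes": r["duration_minutes"],
            })
    return dict(summary)

def flag_anomaly(summary_entry, comment=None):
    """Placeholder for the anonymous flagging mechanism; the optional comment
    explains why the flagger sees the event as abnormal. No reporter identity is kept."""
    return {"flagged": summary_entry, "comment": comment}

The summary would then be published to the immediate team and management, with the flagging mechanism attached to each entry.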

Would something like this make the security analysis easier by having eyes with a vested interest in the security of their work examining the summaries? Would we be revealing too much about the target systems and data? Are we assuming that there is sufficient interest on the part of the team to even bother reading such a summary?

Thinking more darkly, is this a step onto a slippery slope of creating an Orwellian work environment? Or… is this just one more metre down a slope we’ve been sliding down for a long time?

6 Comments
jswan
Level 13

I think you'd have to be extremely selective about the kinds of anomalies reported. For most kinds of indicators, people who aren't security analysts probably can't tell a false positive from a true positive except in circumstances where the indicator is of extremely high fidelity, and it's hard to tune for fidelity without analysts.

If you restricted it to things like privilege elevation events, administrative group changes, authentication failures, or use of unusual executables (psexec, pwdump, etc.), then it might be helpful.
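A minimal sketch of that kind of restriction (the event type names and executable list are assumptions, not tied to any particular SIEM):

# High-fidelity indicator filter: only event types that non-analysts can
# reasonably judge are passed through to the team summary.
HIGH_FIDELITY_EVENTS = {
    "privilege_elevation",
    "admin_group_change",
    "authentication_failure",
}

SUSPICIOUS_EXECUTABLES = {"psexec.exe", "pwdump.exe"}

def include_in_summary(event):
    """Return True if the event is worth surfacing to non-analyst reviewers."""
    if event.get("type") in HIGH_FIDELITY_EVENTS:
        return True
    if event.get("type") == "process_start" and \
       event.get("image", "").lower() in SUSPICIOUS_EXECUTABLES:
        return True
    return False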

bluefunelemental
Level 15

I am, as we speak, trying to kick off a "crowd-sourced" monitoring workshop within my company and, as jswan points out, there are just too many topics and intricacies to assume everyone walks away a data geek. Rather than Orwellian, as ghostinthenet ruminates, data is just another tool to describe our surroundings. Drawing an allusion to the micro and macro sciences, most everyone understands the layers of systems and to some extent their interactions, such as the image below or its peer in the astro sciences.

[images: 0-7645-5422-0_0101.jpg, computerworks.gif — diagrams of layered systems]


However, the bigger the view, resolution and/or scale (big data), the more complexity there is in getting a handle on it, with some people better at the interactions of systems and others focused on specializations. I would posit that while the scientists who focus on these systems are quite nuanced and varied, from physicists and molecular biologists to general physicians and even virologists, we don't have the same range in the computer sciences, which generally comes down to developer, engineer, or operations at the root of their functions. But good toolmakers can still make the jumps needed to follow patterns and dependencies, and they know how to collaborate with those who can fill in the blanks. If data is not insightful, then it's just taking up valuable space in your toolbox.

mharvey
Level 17

I think the problem that anyone runs into, especially when we look at human involvement, is ensuring those vested parties are taking the time to evaluate the reports and add in what they feel is needed. If not, then this becomes a point of contention and can lead to a lack of understanding because those vested parties aren't putting in the effort needed. Of course, that can be said about any facet of monitoring/administration, not just this avenue. There has to be a level of trust, but at the same time, there has to be a baseline of behavior showing that the reliability is there.

jkump
Level 15

No matter how we develop AI technologies, ultimately we as humans still need to complete the last-mile analysis. The amount of correlation that automated systems can perform is extremely useful, but the final analysis relies on making a particular distinction about what is really needed as the end game from the system. Otherwise, we are merely collecting data so that we can claim we are getting the greatest benefit for the cost of the system.

ghostinthenet
Level 11

Agreed. We need to present a tightly-defined set of data in order to keep the field of observation very narrow. If we're using peer analysis to mine insight from within a team, we can potentially get better context for our observations. The downside is that we may be risking confidence in the work environment when we start asking people to anonymously report on each other's activities.

ghostinthenet
Level 11

Absolutely. The trick is to keep everyone participating in the process on the same page. If everyone continues to see it as "just data" then we're good.

About the Author
Network Greasemonkey, Packet Macrame Specialist, Virtual Pneumatic Tube Transport Designer and Connectivity Nerfherder. The possible titles are too many to count, but they don’t really mean much when I’m essentially a hired gun in the wild west that is modern networking. I’m based in the Niagara region of Ontario, Canada and operate tishco networks, a consulting firm specializing in the wholesale provisioning of networking services to IT firms for resale to their respective clientele. Over my career, I have developed a track record designing and deploying a wide variety of successful networking solutions in areas of routing, switching, data security, unified communications and wireless networking. These range from simple networks for small-to-medium business clients with limited budgets to large infrastructure VPN deployments with over 450 endpoints. My broad experience with converged networks throughout Canada and the world has helped answer many complex requirements with elegant, sustainable and scalable solutions. In addition, I maintain current Cisco CCDP and CCIE R&S (41436) certifications. I tweet at @ghostinthenet, am a Tech Field Day delegate, render occasional pro-bono assistance on sites like the Cisco Support Community and Experts' Exchange, and occasionally rant publicly on my experiences by "limpet blogging" on various sites. Outside of the realm of IT, I am both a husband and father. In what meagre time remains, I contribute to my community by serving as an RCAF Reserve Officer, supporting my local squadron of the Royal Canadian Air Cadets as their Commanding Officer.