Given the current state of networking and security, with the prevalence of DDoS attacks such as the NTP monlist attack and SNMP and DNS amplification, with highly targeted techniques like doxing, and, most importantly to many enterprises, with the exfiltration of sensitive data, network and security professionals are forced to find creative and often innovative means of learning about their networks and traffic patterns. This can mean collecting data from many sources and correlating it, or, in extreme cases, obtaining access to otherwise secure protocols.
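One creative means of spotting the amplification attacks mentioned above is to watch flow data for responses that vastly outweigh the requests that triggered them. The following is a minimal, illustrative sketch of that idea; the flow-record field names, the service-port list, and the ratio threshold are all hypothetical, not drawn from any particular monitoring product.

```python
# Hypothetical flow records: flag UDP flows on commonly abused
# reflection ports (DNS, NTP, SNMP) where the response volume
# dwarfs the request volume -- a rough amplification signature.
AMPLIFICATION_SERVICES = {53: "DNS", 123: "NTP", 161: "SNMP"}

def flag_amplification(flows, ratio_threshold=10.0):
    """Return (service, client_ip, ratio) for flows whose
    response/request byte ratio exceeds the threshold."""
    suspicious = []
    for flow in flows:
        if flow["proto"] != "udp":
            continue
        service = AMPLIFICATION_SERVICES.get(flow["service_port"])
        if service is None or flow["request_bytes"] == 0:
            continue
        ratio = flow["response_bytes"] / flow["request_bytes"]
        if ratio >= ratio_threshold:
            suspicious.append((service, flow["client_ip"], round(ratio, 1)))
    return suspicious

# Example: a tiny NTP monlist-style request eliciting a huge reply.
flows = [
    {"proto": "udp", "service_port": 123, "client_ip": "203.0.113.9",
     "request_bytes": 234, "response_bytes": 48000},
    {"proto": "udp", "service_port": 53, "client_ip": "198.51.100.4",
     "request_bytes": 64, "response_bytes": 120},
]
print(flag_amplification(flows))  # only the NTP flow is flagged
```

A real deployment would of course work from NetFlow/IPFIX or packet-capture data rather than hand-built dictionaries, but the byte-ratio heuristic itself carries over.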

Knowing your network and computational environment is critical to classifying and detecting anomalies and potential security incidents. Today's environments are hostile and have often had to grow organically over time, and obtaining, analyzing, and storing this information can be expensive. So what creative ways are being used to accomplish these tasks? How is the correlation being done? Are enterprises and other large networks using techniques like full packet capture at their borders? Are you performing SSL interception and decryption? How is correlation and cross-referencing of security and log data accomplished in your environment? Is it tied into any performance or outage data sources?
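To make the cross-referencing question above concrete, one simple form of correlation is a time-windowed join between two event sources keyed on source IP. This is an illustrative sketch only; the event fields (`src_ip`, `ts`, `signature`, `action`) and the five-minute window are assumptions, not a description of any specific SIEM.

```python
# Toy correlation: pair each IDS alert with firewall events from the
# same source IP that occurred within a +/- time window of the alert.
from datetime import datetime, timedelta

def correlate(ids_alerts, firewall_events, window=timedelta(minutes=5)):
    """Return (signature, firewall action, src_ip) tuples for every
    alert/event pair sharing a source IP inside the window."""
    matches = []
    for alert in ids_alerts:
        for event in firewall_events:
            if (event["src_ip"] == alert["src_ip"]
                    and abs(event["ts"] - alert["ts"]) <= window):
                matches.append((alert["signature"], event["action"],
                                event["src_ip"]))
    return matches

# Example data: one scanning host seen by both the IDS and the firewall.
t0 = datetime(2014, 3, 1, 12, 0)
ids_alerts = [{"src_ip": "203.0.113.9", "ts": t0, "signature": "portscan"}]
firewall_events = [
    {"src_ip": "203.0.113.9", "ts": t0 + timedelta(minutes=2), "action": "deny"},
    {"src_ip": "192.0.2.7", "ts": t0, "action": "deny"},
]
print(correlate(ids_alerts, firewall_events))
```

At scale this naive nested loop gives way to indexed stores or streaming joins, but the shape of the operation, matching on an entity key within a time window, is the same one SIEM correlation rules express.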