By Troy Townsend,
SEI Innovation Center
The majority of research in cyber security focuses on incident response or network defense, either trying to keep the bad guys out or facilitating the isolation and clean-up when a computer is compromised. It’s hard to find a technology website that’s not touting articles on fielding better firewalls, patching operating systems, updating anti-virus signatures, and a slew of other technologies to help detect or block malicious actors from getting on your network. What’s missing from this picture is a proactive understanding of who the threats are and how they intend to use the cyber domain to get what they want. Our team of researchers—which included Andrew Mellinger, Melissa Ludwick, Jay McAllister, and Kate Ambrose Sereno—sought to help organizations bolster their cyber security posture by leveraging best practices in methodologies and technologies that provide a greater understanding of potential risks and threats in the cyber domain. This blog posting describes how we are approaching this challenge and what we have discovered thus far.
Earlier this year, representatives from the government approached the SEI Innovation Center about conducting research to assess the state of the practice of cyber intelligence. Specifically, we were asked to accomplish three core tasks:
- capture the state of the practice of cyber intelligence, specifically how cyber intelligence is being performed across private industry and government
- create an implementation framework that captures best practices and advances the state of the art
- prototype technology solutions that advance the state of the science
The overall intent is to expose industry to the best practices in capabilities and methodologies developed by the government, and for the government to learn from the process efficiencies and tools used in industry. In areas where both the government and industry are experiencing challenges, the SEI can leverage its expertise to develop and prototype innovative technologies and processes that can benefit all participants in the program.
We identified 25 organizations to participate in our research, including
- federal agencies
- international law firms
- commercial intelligence providers
- financial sector organizations
- energy sector organizations
- retail sector organizations
Our intent was not to rate participating organizations as good or bad, but rather to capture their processes, tools, and understanding of cyber intelligence as a means of enhancing cyber security. To accomplish this, we created a cyber intelligence framework that captured the core, fundamental components of a cyber intelligence process. Based on this framework, we devised interview questions researchers used to learn how organizations accomplished those core components, which we identified as:
- defining the cyber environment. Questions in this area—by far the broadest category—focused on the organization’s cyber footprint, their identified risks, threats, and overall cyber intelligence organization. We asked organizations about everything from the composition of their cyber intelligence component to the methods and techniques analysts use for identifying emerging cyber threats.
- data gathering. Data gathering is largely derived from requirements identified by defining the organization’s environment. If the environment is too broadly defined (“everything is a threat!”), then data gathering becomes inefficient, and analysts are burdened with more data than they can possibly use. If the environment is too narrowly defined, chances are the right data is not being collected, and the organization may be missing indicators of adversary activity. Questions in this area focused on data sources, tools, and processes of collecting and correlating data.
- functional analysis. Functional analysis is the technical assessment or niche analysis of data, which includes functions such as malware analysis, insider threat analysis, reverse engineering, supply chain risk analysis, intrusion analysis, and forensics. This type of analysis tends to answer the "what is happening" and "how is this happening" questions. The primary utility of this analysis is to contribute directly to cyber security efforts and assist network defenders in remediating security gaps.
- strategic analysis. If functional analysis looks at the “what” and the “how,” strategic analysis attempts to answer the “who is doing this” and “why are they doing this” questions. Strategic analysis often incorporates functional analysis and conveys the complexity of the technical details to leadership in a way that they can understand and appreciate, such as how a cyber event impacts the organization’s strategic goals.
The Cyber Intelligence Framework
Mind the Gaps
For each participating organization, we applied the organization’s workflows and processes to this framework. One challenge that we identified early on in both government and industry is that a language gap exists between functional analysts and decision makers. Often, leadership and decision makers don’t understand the technical nature of the functional analysis, such as what malware is and how and why it works.
In the more effective cyber intelligence programs we observed, strategic analysts are able to translate that functional data in such a manner that decision makers can understand it and use it to make smarter security and business decisions. This translation helps the organization’s leadership better understand events such as distributed denial-of-service (DDoS) attacks. In a DDoS attack, the functional analysis provides technical details of the attack, including its provenance and effects on the server. Strategic analysis then applies those functional details to a broader view of how the DDoS attack impacts an organization’s business, how much money was lost as a result, and what could have been done to prevent it.
In December, we plan to present the results of our findings to our customer. We will then begin working with organizations to address the challenges that we identified and incorporate best practices into their operations. Some of the initial challenges that we identified include
- a lack of consistent training for the strategic analysis role. Cyber intelligence is a relatively new area of expertise, and there is a dearth of senior mentors who can provide guidance to new analysts. Moreover, a consistent method of training has not been developed, and the skills necessary to perform this function are not yet well defined. There are ongoing attempts to address these inconsistencies, and we plan to share our data to help professionalize the tradecraft of cyber intelligence.
- reliance on traditional intelligence methodologies. Intelligence methodologies were developed in an era when governments were looking at the inventories of the tanks, missiles, and airplanes held by hostile countries and predicting what the leaders of those countries planned to do with them. Applying these same processes, workflows, and tradecraft to the cyber domain is not always feasible. Technology changes so fast in the cyber domain that by the time a strategic-level product on an emerging threat makes it through the publication process, it’s already out of date.
- data gluttony. When we looked at the data gathering phase in particular, we realized that organizations were inundated with data. Some organizations collected so much data that they simply discarded it without looking at it. Others saved the data but did not use it effectively, letting it accumulate uselessly on their servers. Still others collected data but failed to correlate it. For example, an organization that requires employees to badge in at work may not be cross-checking those badge logs against employees who are logging in to the network remotely through a virtual private network (VPN). It stands to reason that an employee who has badged in to work but also appears to be opening a VPN session from overseas may warrant some investigation.
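The badge-log/VPN cross-check described above can be sketched in a few lines of code. This is a minimal illustration, not any participating organization's actual tooling; the log formats, employee names, country codes, and the 10-hour correlation window are all assumptions made for the example.

```python
from datetime import datetime, timedelta

# Hypothetical logs for illustration only.
# Badge log: (employee_id, badge-in timestamp)
badge_log = [
    ("alice", datetime(2012, 11, 5, 8, 55)),
    ("bob", datetime(2012, 11, 5, 9, 10)),
]
# VPN log: (employee_id, session start, source country of the connecting IP)
vpn_log = [
    ("alice", datetime(2012, 11, 5, 10, 30), "RO"),  # overseas session
    ("bob", datetime(2012, 11, 6, 22, 0), "US"),     # domestic, different day
]

def flag_anomalies(badge_log, vpn_log,
                   window=timedelta(hours=10), home_country="US"):
    """Flag employees who start a foreign VPN session while apparently on site.

    An anomaly is a VPN session from outside home_country that begins
    within `window` after the same employee badged in to the building.
    """
    anomalies = []
    for emp, vpn_time, country in vpn_log:
        if country == home_country:
            continue  # domestic remote access is not flagged here
        for badge_emp, badge_time in badge_log:
            if badge_emp == emp and timedelta(0) <= vpn_time - badge_time <= window:
                anomalies.append((emp, badge_time, vpn_time, country))
    return anomalies

for anomaly in flag_anomalies(badge_log, vpn_log):
    print("investigate:", anomaly)
```

The point is not the specific threshold but the correlation itself: either data source alone looks benign, and only joining them surfaces the contradiction worth investigating.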
Our work thus far has focused on helping government leaders make smarter investments of the resources they use to secure the cyber infrastructure. In the coming months, after presenting our data to our sponsor, we will work with participating organizations to apply the best practices that we identified across their organizations.
We’ve received permission from our sponsor to publish our results, so we intend to publish an SEI state of the practice report on cyber intelligence. In addition to that report, we aim to present our findings to a broader audience through presentations and panel discussions hosted by professional associations and at information security conferences.
For the coming year, we have received further sponsorship to develop prototype solutions to address some of the challenges we identified in this phase of our research. In January, we will begin working with engineers and participants from the study to develop and pilot these prototypes. Our work on the Cyber Intelligence Tradecraft Project won’t be the silver bullet solution to everyone’s cyber security problems. Instead, we hope that our research is a significant voice in an ongoing conversation about how cyber intelligence analysis benefits risk mitigation and resource allocation in the cyber environment. We welcome your input to this conversation too! Please add your comments in the section below.
To read the SEI report on our research findings, please visit
For information on the SEI Innovation Center, please visit
For information on the Atlantic Council History of Cyber Intelligence, please visit
For another perspective on the value of cyber intelligence from RSA, please see