Looking Ahead: The SEI Technical Strategic Plan, Part 2


By Bill Scherlis
SEI Principal Researcher and Director, Institute for Software Research

The Department of Defense (DoD) has become deeply reliant on software. As a federally funded research and development center (FFRDC), the SEI is chartered to work with the DoD to meet the challenges of designing, producing, assuring, and evolving software-reliant systems in an affordable and dependable manner. This blog post is the second in a multi-part series that describes key elements of our forthcoming Strategic Research Plan that address these challenges through research, acquisition support, and collaboration with the DoD, other federal agencies, industry, and academia. The first post in this series focused on Architecture-Led Incremental Iterative Development. This part focuses on the remaining three elements of our strategic plan: (1) designed-in security and quality (evidence-based software assurance), (2) a set of DoD critical component capabilities relating to cyber-physical systems (CPS), autonomous systems, and big data analytics, and (3) cybersecurity tradecraft and analytics.

Evidence-based Software Assurance and Certification

The goal of the second element in the SEI Strategic Research Plan is a dramatic reduction in the cost and difficulty of making assurance judgments related to quality and security attributes. Achieving this goal is particularly important as systems become more complex and evolve more rapidly. Current approaches for certification and accreditation are largely based on an after-the-fact evaluation of a snapshot of a system.

While after-the-fact approaches are effective for certain well-defined categories of components and systems, they tend to break down as systems increase in complexity, scale, and dynamism. They also tend to hinder ongoing evolution, rapid reconfiguration, dynamic loading of components, autonomy, and composition and interlinking of systems-of-systems. Put simply, these established techniques do not scale up, and they do not work well for the emerging software framework-based systems now prevalent in commercial and infrastructural applications.

The industry folklore has long asserted that quality-related activities, including security-related assurance, can consume half of total development costs for larger systems. For example, the IBM Systems Journal states that, in a typical commercial development organization, “the cost of providing [the assurance that the program will perform satisfactorily in terms of its functional and nonfunctional specifications within the expected deployment environments] via appropriate debugging, testing, and verification activities can easily range from 50 to 75 percent of the total development cost.” Additionally, after-the-fact evaluation practices can add a year or more to the elapsed time required to develop and deploy software-reliant systems.

Commercial systems, including products, software as a service (SaaS), and cloud-based systems, tend to undergo relatively rapid and continual evolution. For many of our DoD and infrastructural systems, we similarly need to support continuous evolution.

Some areas of particular technical emphasis include

  • Architecture and composition principles that enable separate evaluation of individual components, with the possibility of combining results to achieve aggregate assurance judgments. These principles are motivated by the reality of modern software supply chains, which are rich and diverse in sourcing and geography.
  • Modeling and analytics to support the diversity of quality attributes significant to DoD and infrastructural systems. These include modeling, simulation, and design techniques that support critical security attributes.
  • Exploration of development approaches that incorporate creation of evidence in support of assurance claims into the process of development. This evidence-based assurance can harmonize incentives to create designs and implementations that can more readily support evaluation.
  • Evaluation and other techniques to support the use of more opaque components in systems, including binary components and potentially dangerous components from unknown sources.

Part of the SEI Strategic Research Plan addresses incentives for developers to adapt their architectural structures, development process, and tooling to better accommodate the idea of amassing evidence—during the development process—that can support an eventual assurance claim with respect to quality and security attributes critical to the operation of the particular system. Our work benefits from the fact that the development process can be naturally incremental through the composition of components and the incremental validation of assurance claims. This development process is supported by the partial accumulation of engineered artifacts and evidence.
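As a sketch only (the data model and the intersection rule here are hypothetical illustrations, not an SEI tool or method), the idea of combining per-component evidence into an aggregate assurance judgment might be represented like this:

```python
from dataclasses import dataclass, field

# Hypothetical model: each component carries evidence produced during
# development (e.g., analysis results, test coverage, provenance records)
# that supports particular assurance claims.
@dataclass
class ComponentEvidence:
    name: str
    claims_met: set = field(default_factory=set)  # e.g. {"memory-safe"}

# Aggregate judgment for a composed system: under this simplified rule,
# a claim holds for the composition only if every component supplies
# evidence for it.
def aggregate_claims(components):
    if not components:
        return set()
    result = set(components[0].claims_met)
    for c in components[1:]:
        result &= c.claims_met
    return result

parts = [
    ComponentEvidence("parser", {"memory-safe", "no-network"}),
    ComponentEvidence("scheduler", {"memory-safe"}),
]
# Only "memory-safe" is backed by evidence from every component.
print(aggregate_claims(parts))
```

Real composition principles are much richer than set intersection—some attributes (timing, for instance) do not compose component-by-component—but the sketch shows why evidence gathered during development makes incremental, compositional evaluation possible.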

These ideas of “designed-in security” build on the fourth theme of the Networking and Information Technology Research and Development (NITRD) Program plan, “Trustworthy Cyberspace: Strategic Plan for the Federal Cybersecurity Research and Development Program.”

Critical Component Capabilities

The goal of the third element in the SEI Strategic Research Plan is to enhance DoD software capability in several areas that have critical and pervasive roles in DoD software-reliant systems. These areas include composable, cyber-physical systems (CPS), autonomous and distributed systems, and high-performance, data-intensive computing.

Each of the areas presents challenges:

  • Composable cyber-physical systems (CPS). “Cyber-physical” refers to the fact that embedded software operates in the context of physical sensors and effectors. CPS thus include control systems, real-time systems, and many other categories of systems pervasive in the DoD and critical infrastructure. These systems tend to have greater complexity due, for example, to the need to model and manage associated physical system components, and they often must assure that real-time deadlines are met. As a consequence, they typically manifest greater internal coupling in their design, which thwarts higher levels of capability, composition, and flexibility.

    There are diverse technical challenges related to design and evaluation, which include improving modeling and analysis for cyber-physical and control systems, modeling and managing hardware reliability issues, making safe use of concurrency and effective scheduling to assure that performance goals can be met, providing support for analysis and testing, and advancing the overall stack architectures beyond legacy concepts. Many mobile systems fall in the category of cyber-physical, including ad hoc networks connecting with large numbers of sensors and enhanced mobile devices deployed to the field. Several recent blog posts outline efforts by SEI researchers to address these challenges. 
  • Autonomous systems. Autonomous systems are cyber-physical systems that can accept sensor data and mission guidance and, with very limited (or no) human interaction, arrive at mission decisions and enact outcomes. These systems are increasingly critical to the mission, and yet they pose particular challenges for verification and validation, since they rely so much less on ongoing human interaction. Indeed, the capability and complexity of these systems are often limited for this reason. Systems must both behave as expected and, additionally, not manifest unwanted behaviors that can be dangerous or threaten mission success. From a technical perspective, autonomy can be particularly challenging because of the vast state space, the number of possible input combinations, the challenges of error tolerance, and the difficulty of fully modeling the environmental circumstances of operation.
  • High-performance computing (HPC) for analytics and for modeling and simulation. Advances in sensor fidelity, rapid growth in network capacity, increasing convergence of data-center and high-performance computing architectures, advances in large-scale data storage, and emerging frameworks for scalable distributed computing (such as MapReduce and GraphLab) have all resulted in the growing phenomenon of “big data.” There are many significant applications of big-data techniques in DoD and infrastructural systems—and in the development of those systems as well. Indeed, many of the other features of the SEI Strategic Research Plan build on progress in big data.
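To make the MapReduce pattern mentioned above concrete, here is a minimal single-process sketch in Python (the data and function names are illustrative; real frameworks distribute the map, shuffle, and reduce phases across a cluster):

```python
from collections import defaultdict
from itertools import chain

# Map step: emit (key, value) pairs from each input record.
def map_words(line):
    return [(word.lower(), 1) for word in line.split()]

# Shuffle step: group all emitted values by key.
def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

# Reduce step: combine the values for each key.
def reduce_counts(groups):
    return {key: sum(values) for key, values in groups.items()}

records = ["sensor data network data", "data analysis"]
pairs = chain.from_iterable(map_words(r) for r in records)
counts = reduce_counts(shuffle(pairs))
print(counts)  # {'sensor': 1, 'data': 3, 'network': 1, 'analysis': 1}
```

The value of the pattern for big-data workloads is that the map and reduce steps are independent per key, so a framework can run them in parallel over very large record sets without changing the program's logic.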

Cybersecurity Tradecraft and Analytics

The goal of the fourth strategic element is to advance analytic capability in support of diverse aspects of the cybersecurity mission. These aspects include analytics and situational awareness for malware, vulnerability categorization and assessment, vulnerability information management, network activity analysis, threat characterization and assessment, organizational security, and many other dimensions of operational response, remediation, and recovery.

This capability builds on a range of data assets related to adversarial tradecraft, malware, vulnerabilities, insider threats, and other results of experience with large numbers of cybersecurity-related incidents. There are diverse purposes of this strategic element, including:

  • Improvement in our understanding and communication of threats and risks
  • Development of better preventive approaches in the engineering of systems and in managing secure operations, including considerations for security and assurance “at scale,” improved indications and warning, and near-real-time data analysis
  • Support for forensic and corpus analysis
  • Support for hypothesis generation using machine learning, near-real-time analysis, and other advanced capabilities
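As one hedged illustration of the near-real-time analysis item above (the detector, its thresholds, and the data are hypothetical, not an SEI capability), a streaming analytic might flag network-activity counts that depart sharply from a running baseline:

```python
from collections import deque

# Hypothetical detector: flag an observation as anomalous when it
# exceeds the mean of a sliding window of recent values by a fixed
# multiplier. Real analytics use far richer statistical models.
def flag_anomalies(stream, window=5, factor=3.0):
    recent = deque(maxlen=window)
    flags = []
    for value in stream:
        mean = sum(recent) / len(recent) if recent else value
        flags.append(len(recent) > 0 and value > factor * mean)
        recent.append(value)
    return flags

# Per-interval connection counts, with one suspicious spike.
counts = [10, 12, 11, 13, 120, 12]
print(flag_anomalies(counts))  # [False, False, False, False, True, False]
```

Even this toy version shows the operational shape of the problem: decisions must be made as data arrives, using only a bounded window of history, which is what distinguishes near-real-time analysis from after-the-fact forensic or corpus analysis.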

What’s Next

The next blog post in this series focuses on our approach for evaluating and validating SEI research projects. The SEI Strategic Research Plan is designed to ensure that the SEI conducts high-quality and high-impact work that benefits the DoD by identifying and solving key technical challenges facing current and future DoD software-reliant systems. This strategy itself undergoes continual evolution and improvement; the broad range of SEI engagements enables us to continually refine the strategy as the technology advances, the mission evolves, and our understanding improves. We welcome engagement from our partners and stakeholders in the improvement and refinement of this strategy. Please leave comments below or contact us directly at info@sei.cmu.edu.

Additional Resources

To download the report Critical Code: Software Producibility for Defense, please see www.nap.edu/catalog.php?record_id=12979

To download the Report of the Defense Science Board Task Force on Defense Software (2000), please visit http://oai.dtic.mil/oai/oai?verb=getRecord&metadataPrefix=html&identifier=ADA385923

To view on-demand presentations from the SEI Agile Research Forum, please visit www.sei.cmu.edu/go/agile-research-forum

For more information about the Networking and Information Technology Research and Development (NITRD) Program, please visit www.nitrd.gov/about/about_nitrd.aspx
