The Value of Systems Engineering

Systems Engineering

By Joseph Elm
Senior Member of the Technical Staff

Building a complex weapon system in today’s environment may involve many subsystems—propulsion, hydraulics, power, controls, radar, structures, navigation, computers, and communications. Designing these systems requires the expertise of engineers in particular disciplines, including mechanical engineering, electrical engineering, software engineering, metallurgical engineering, and many others. But some system-development activities are interdisciplinary, including requirements development, trade studies, and architecture design, to name a few. These tasks do not fit neatly into the traditional engineering disciplines and require the attention of engineering staff with broader skills and backgrounds. This need for breadth and experience is often met by systems engineers. Unfortunately, systems engineering is often not valued by all stakeholders in the Department of Defense (DoD) and is often the first group of activities to be eliminated when a program faces budget constraints. This blog post highlights recent research aimed at demonstrating the value of systems engineering to program managers in the DoD and elsewhere.

In 2004, the Director for Systems Engineering in the Office of the Under Secretary of Defense for Acquisition, Technology and Logistics (OUSD(AT&L)) came to the National Defense Industrial Association (NDIA) and voiced concerns that DoD acquisition programs were not capitalizing on the value of systems engineering (SE). He knew the value of SE and knew that it could help DoD programs, but he also knew that not all DoD program managers shared his convictions. Consequently, program managers were taking shortcuts and eliminating SE capabilities from their programs. He came to NDIA seeking quantitative evidence of the value of SE.

Subsequently, others have recognized the same problem. A recent Government Accountability Office (GAO) report indicates that acquisition programs typically run 26 percent over budget and that development costs typically exceed initial estimates by 40 percent. These programs also routinely fail to deliver capabilities when promised, experiencing an average delay of 21 months. The report finds that “optimistic assumptions about system requirements, technology, and design maturity play a large part in these failures, and that these optimistic assumptions are largely the result of a lack of disciplined SE analysis early in the program.”

Despite findings such as these, many programs still fail to deploy good SE. Why? It may be because there is relatively little quantitative evidence of the impact and value of SE. Everyone can see the costs of SE, such as the labor applied and the time allocated in the schedule. The benefits of SE, however, are less visible. They often manifest themselves as

  • risks that did not materialize
  • rework that did not need to be done
  • customer complaints that did not occur, and
  • product deficiencies that were circumvented

Because these benefits are hard to quantify, the return on investment in SE often goes unrecognized. To get a true picture of the value of SE, we need to measure its impact on acquisition program performance quantitatively.

The remainder of this post describes a research effort that the SEI undertook in partnership with the NDIA and the IEEE Aerospace and Electronic Systems Society (IEEE AESS). This effort provided quantitative evidence of the value of SE in terms of its impact on program cost, program schedule, and program technical performance – impacts that are crucially important to program managers and executives.

Building on Previous Research in Systems Engineering

While “software engineering” is etched in stone at the SEI’s Pittsburgh headquarters, it is sometimes hard to draw a clear line between software and the systems supported by software.  For that reason, SEI staff have often conducted research in the SE realm. For example, the Capability Maturity Model Integration Framework addresses processes that apply equally to software development and system development. Architecture development and evaluation methods developed for software are routinely adapted and applied to systems.

Through the SEI’s affiliation with NDIA, my fellow researcher Dennis Goldenson and I became involved in developing its response to the 2004 inquiry from OUSD(AT&L) mentioned earlier. In 2005, I suggested conducting a survey of acquisition programs to gather information about their SE-related activities and about how those programs performed. We could then identify relationships between these factors. We conducted the survey in 2006 and published our results in 2007. This initial research demonstrated that programs that deployed more systems engineering performed better against measures of cost, schedule, and technical performance.

In 2010, the DoD approached NDIA and the SEI about strengthening the business case for SE by expanding the survey population beyond the NDIA to include other professional organizations, including the IEEE AESS and the International Council on Systems Engineering (INCOSE). For this latest study, we surveyed individual programs in participating organizations to obtain answers to the following questions:

  1. What systems engineering activities do you perform on your program? 
  2. How well does your program perform?

We surveyed 148 different programs.  Although most programs were supplying systems for the U.S. defense sector, we also received a few responses from organizations serving other market sectors and operating in different countries.

An Even Stronger Link Between SE and Performance

Our latest results, published in the SEI technical report, The Business Case for Systems Engineering Study:  Results of the Systems Engineering Effectiveness Study, identified strong links between the performance of systems engineering tasks and overall program performance. These results provide a convincing case for the value of systems engineering. This latest study collected information from participating programs along three dimensions:

  • systems engineering deployment. We assessed SE deployment by examining both the presence and the quality of work products resulting from SE activities. These work products were selected from those listed in the CMMI Framework by a panel of SE experts. Based on this assessment, SE deployment for each program was categorized as low, medium, or high.
  • program performance. We assessed program performance as a combination of cost performance (satisfaction of budget), schedule performance, and technical performance (satisfaction of requirements). Based on this assessment, program performance for each program was categorized as low, medium, or high.
  • program challenge. Some programs are inherently more challenging than others due to factors such as size, duration, technology maturity, and interoperability requirements. Based on the combination of these factors, program challenge was categorized as low, medium, or high.

We then looked for relationships between these metrics. We found a very strong relationship between SE deployment and program performance.  In particular, as programs deployed more SE, they delivered better performance.  For example, among those programs deploying the least SE, only 15 percent delivered the highest level of program performance.  Among those deploying the most SE, however, 56 percent delivered the highest level of program performance.
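The tabulation behind figures like these is straightforward to sketch. The snippet below is purely illustrative: the program records, category labels, and resulting percentages are invented stand-ins, not data from the actual survey.

```python
# Hypothetical survey records: (SE deployment tier, program performance tier).
# These tuples are invented for illustration; the real data set lives in the
# SEI technical report and is not reproduced here.
programs = [
    ("low", "high"), ("low", "low"), ("low", "medium"), ("low", "low"),
    ("high", "high"), ("high", "high"), ("high", "medium"), ("high", "high"),
]

def pct_high_performers(records, deployment_tier):
    """Percentage of programs in one SE-deployment tier that delivered
    the highest level of program performance."""
    tier = [perf for dep, perf in records if dep == deployment_tier]
    if not tier:
        return 0.0
    return 100.0 * sum(1 for perf in tier if perf == "high") / len(tier)

print(pct_high_performers(programs, "low"))   # 25.0 with the toy data above
print(pct_high_performers(programs, "high"))  # 75.0
```

With real survey records in place of the toy list, the same comparison yields the 15 percent versus 56 percent contrast described above.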

As one would expect, our research showed an inverse relationship between program challenge and program performance. We also learned, however, that SE practices become even more valuable on challenging programs. We already noted that the proportion of programs delivering high performance increased from 15 percent to 56 percent as SE deployment increased. For the most challenging programs, that proportion increased from 8 percent to 62 percent with increased SE deployment. This result clearly shows the growing need for SE as programs become more challenging.

As mentioned above, we measured SE deployment by assessing SE-related work products for each program. We then grouped these artifacts into process areas such as

  • requirements development and management
  • program planning
  • product architecture
  • trade studies
  • product integration
  • verification
  • validation
  • program monitoring and control
  • risk management
  • configuration management
  • integrated product teams

Grouping artifacts into process areas enabled us to probe more deeply into the relationships between SE and program performance, identifying not only the overall benefit of SE but also the benefit of specific SE processes. For each program, we assessed SE deployment in each of the 11 process areas above and analyzed its relationship to program performance. Here again, we found supporting relationships for all SE process areas – increased SE deployment in any of them contributed to better program performance. The relationships were stronger in some areas than in others, however. Particularly strong relationships to program performance were found for

  • program planning. The proportion of programs delivering the highest performance increased from 13 percent to 50 percent as SE activities related to program planning increased.
  • requirements development and management. The proportion of programs delivering the highest performance increased from 21 percent to 58 percent as SE activities related to requirements development and management increased.
  • verification. The proportion of programs delivering the highest performance increased from 16 percent to 54 percent as SE activities related to verification increased.
  • product architecture. The proportion of programs delivering the highest performance increased from 16 percent to 49 percent as SE activities related to product architecture increased.

As strong as these relationships were, we found that they grew even stronger for the more challenging programs.
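For readers who want to explore this style of analysis themselves, a common statistic for measuring how strongly two ordered ratings (such as low/medium/high SE deployment and low/medium/high program performance) move together is Goodman and Kruskal's gamma. The sketch below is an illustrative assumption, not the study's actual methodology – the report describes its own statistical methods, and the sample ratings here are hypothetical.

```python
from itertools import combinations

# Map ordinal category labels to ranks.
ORDER = {"low": 0, "medium": 1, "high": 2}

def goodman_kruskal_gamma(pairs):
    """Gamma = (C - D) / (C + D), where C counts concordant pairs
    (both ratings move in the same direction between two programs) and
    D counts discordant pairs. Pairs tied on either rating are ignored."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(pairs, 2):
        dx = ORDER[x1] - ORDER[x2]
        dy = ORDER[y1] - ORDER[y2]
        if dx * dy > 0:
            concordant += 1
        elif dx * dy < 0:
            discordant += 1
    if concordant + discordant == 0:
        return 0.0
    return (concordant - discordant) / (concordant + discordant)

# Hypothetical (SE deployment, program performance) ratings:
sample = [("low", "low"), ("low", "medium"), ("medium", "medium"),
          ("high", "high"), ("high", "medium")]
print(goodman_kruskal_gamma(sample))  # 1.0 for this toy sample
```

A gamma near +1 indicates that higher SE deployment consistently accompanies higher performance; values near 0 indicate no ordinal association.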

Transitioning to the Public

The results of our research can be used in a number of ways by system developers, system acquirers, and academia. For example, our findings constitute a baseline of SE deployment across the industries surveyed. System developers can use our methods to assess their own SE capabilities, compare them to this baseline, and identify their strengths and weaknesses. They can then develop process-improvement plans to address their weaknesses and strategies to leverage their strengths. We continue to work with defense contractors who are applying this process to improve their SE capabilities.

System acquirers can also benefit from these findings. A Program Management Office (PMO) acquiring a system needs to deploy good SE practices in planning the program, defining system requirements, developing system architectures, and so on. The PMO also needs to ensure that the system supplier deploys good SE practices. For example, the PMO must include in the solicitation a definition of the SE activities it expects from the supplier. It should evaluate each supplier’s response to these expectations as a factor in source selection, ensure that the SE expectations are included in the contract, and monitor the supplier’s performance throughout execution to ensure that those expectations are being met.

The academic community can also use the results of our study. For example, several universities offering master’s-level systems engineering programs are using this information in their curricula to show students the value of systems engineering and to direct some of their courses to capitalize on the knowledge gathered here. INCOSE is also incorporating the results of the survey into its systems engineering handbook.

Additional Resources

The technical report The Business Case for Systems Engineering may be downloaded at
http://www.sei.cmu.edu/library/abstracts/reports/12sr009.cfm

Organizations interested in the continuing effort to demonstrate the value of systems engineering may join INCOSE’s Systems Engineering Effectiveness Working Group at
http://www.incose.org/practice/techactivities/wg/seewg/

Organizations interested in joining NDIA’s Systems Engineering Effectiveness Committee may do so at
http://www.ndia.org/Divisions/Divisions/SystemsEngineering/Pages/Systems_Engineering_Effectiveness_Committee.aspx


7 responses to “The Value of Systems Engineering”

  1. Vincenzo Arrichiello Says:
    Dear Prof. Elm,

    your pointing out the "invisibility" of the benefits of SE reminded me of a quotation of W. E. Deming cited by Joseph E. Kasser (*): "Heard in a Seminar: One gets a good rating for fighting a fire. The result is visible; can be quantified. If you do it right the first time, you are invisible. You satisfied the requirements. That is your job. Mess it up, and correct it later, you become a hero"
    (*)W. E. Deming, "Out of the Crisis", MIT Center for Advanced Engineering Study, 1986, as cited by Joseph E. Kasser in "Eight deadly defects in systems engineering and how to fix them", Proceedings of the 17th Annual International Symposium of INCOSE, 2007
  2. Joseph Elm Says:
    Mr. Arrichiello
    I completely agree with your comment.
    Ever since Benjamin Franklin said “An ounce of prevention is worth a pound of cure,” mankind has been diligent in ignoring this lesson. Fortunately, over the past several decades, we have seen the discipline of systems engineering gain traction in more and more domains. I remain hopeful that this incremental shift from correction to prevention will continue.
  3. Major Jaime Villa, USAF, Cyberspace Engineer Says:
    Dear Prof. Elm,
    I’m fascinated by your keen insight into the challenges of the DoD systems development and deployment process. Placing a value on an intangible activity such as systems engineering is a challenge we live with every day here at the Air Force Life Cycle Management Center, Business Sustainment Systems.
    As you noted, while there seems to be plenty of evidence indicating its value, more often than not it continues to be overlooked by many PMOs during the Materiel Solution Analysis and Technology Development phases due to cost and schedule pressures. The problem is even greater for systems that have entered the Operations and Support phase. As an engineer, often a minority among military members in my field, I have found myself too often trying to sell the value of systems engineering and architecture to decision makers.
    I would like to share an epiphany I once had as a young Lieutenant – like the characters from the TV sitcom “The Big Bang Theory,” we technical personnel (engineers) often get lost in translation because of how we communicate. I attribute my success in delivering operational capabilities on time and on schedule to that epiphany.
    While research that quantifies system engineering value in terms of bottom-line results is critical, I also believe that more research could be used to show academia that we need to educate our young engineers to better communicate in a non-technical fashion to decision makers.
  4. Joseph Elm Says:
    Major Villa
    Thanks for your comments and insights.

    Communication is always an obstacle when dealing with collections of people, but it is an obstacle that can be overcome given sufficient diligence and training. For many years SEs have faced the challenge of communicating the value of SE to other stakeholders. But in addition to this communication challenge, we were also troubled by a lack of hard evidence of that value. My hope is that research such as this provides that needed evidence.

    To help overcome the communications barriers, I am presently developing training and documentation focused on presenting the value of SE to non-SEs. Additionally, I have recently met with officials at OSD who are interested in this work. Over time, perhaps we can include this knowledge in the training that DoD PMs and others receive.
  5. Major Jaime Villa, USAF, Cyberspace Engineer Says:
    Your work sounds fascinating. Thanks for your feedback.
  6. Ray Harris Says:
    Dear Mr. Elm,
    Thank you for the extensive work to quantify the effects of SE. Within the data do you have any cases where a program started with low SE then switched to high SE and showed the same improvement trend? Likewise, are there cases that show the opposite? This would help understand the effect, as opposed to a commonly argued position that just hiring the right people will improve performance.
  7. Joe Elm Says:
    Ray,

    Thank you for your comment to my blog post.

    You asked if I had any data regarding cases where a program started with low SE then switched to high SE.

    The data we collected for the SE Effectiveness survey was a snapshot of the projects at the time of the survey. As such, it contains no trend data.

    Bear in mind that the survey did not study the capabilities of the people - only the outcomes of the SE activities. Thus, it doesn't speak to staffing with the "right people".

    Best Regards

    Joe Elm
