Assessing SE Effectiveness In Major Defense Acquisition Programs


Principal Investigator:  Dr. Barry Boehm, University of Southern California (boehm@usc.edu)

Timeframe:  September 2008 to September 2009

Category:  Systems Engineering and Systems Management Transformation


Description

Objectives: DoD Major Defense Acquisition Programs (MDAPs) frequently and significantly overrun their budgets and schedules and deliver incomplete systems. The SE Effectiveness Measure (EM) framework, operational concepts, and tools developed in this project will help MDAP sponsors and performers collaboratively identify their early SE shortfalls and develop successful systems within their resource constraints.

Approach: Reviewed over two dozen sources of candidate SE EMs and converged on the strongest of these to identify the candidate EMs. Developed a coverage matrix to determine the envelope of candidate EMs and the strength of consensus on each. Fed the results back to the source originators to validate the coverage matrix. This produced further insights and additional candidate EMs, which were incorporated into an SE Performance Assessment Framework organized as a hierarchy of 4 Goals, 18 Critical Success Factors, and 74 Questions. Concurrently, the project was extended to also assess SE personnel competency as a determinant of program success. The project analyzed six additional personnel competency assessment frameworks and question sets; their Goals and Critical Success Factors were very similar to those in the SE Performance Assessment Framework. The resulting SE Competency Assessment Framework added one further Goal, Professional and Interpersonal Skills, with five Critical Success Factors, yielding a framework of 5 Goals, 23 Critical Success Factors, and 81 Questions.
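The nested Goal → Critical Success Factor → Question hierarchy and the coverage-matrix consensus tally described above can be illustrated with a minimal sketch. The goal, factor, and question text below, the source names, and the coverage values are hypothetical placeholders, not the project's actual framework content.

```python
from dataclasses import dataclass, field


@dataclass
class Question:
    text: str


@dataclass
class CriticalSuccessFactor:
    name: str
    questions: list = field(default_factory=list)


@dataclass
class Goal:
    name: str
    csfs: list = field(default_factory=list)


def coverage(sources, questions):
    """Count how many candidate-EM sources address each question (strength of consensus)."""
    return {q: sum(q in covered for covered in sources.values()) for q in questions}


# Illustrative content: one hypothetical Goal with one CSF and one Question.
goal = Goal(
    name="Concurrent definition of system requirements and solutions",
    csfs=[CriticalSuccessFactor(
        name="Understanding of stakeholder needs",
        questions=[Question("Are operational concepts validated with key stakeholders?")],
    )],
)

question_texts = [q.text for csf in goal.csfs for q in csf.questions]

# Hypothetical coverage matrix: which candidate-EM sources address which questions.
sources = {
    "Source A": {question_texts[0]},
    "Source B": {question_texts[0]},
    "Source C": set(),
}
print(coverage(sources, question_texts))
# -> {'Are operational concepts validated with key stakeholders?': 2}
```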

Significant Research Findings & Products: The results of the SE Program Assessment Tool and SE Competency Assessment Tool pilot assessments, the DAPS and SADB comparative analysis, and the quantitative business case analysis for using the SE EM framework, tools, and operational concepts are sufficiently positive to conclude that implementation of the approach is worth pursuing. Presentations at workshops have generated considerable interest in refining, using, and extending the capabilities and in co-funding the follow-on research. To date, however, the framework and prototype tools have been shown to be efficacious only for pilot projects performed by familiar experts in a relatively short time. The greater a project's size, criticality, and stability, the greater the need for validated architecture feasibility evidence. For very small, low-criticality projects with high volatility, by contrast, the evidence generation effort would make little difference, would need to be continually redone, and would produce a negative return on investment. In such cases, agile methods such as rapid prototyping, Scrum, and eXtreme Programming are more effective.
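A purely illustrative sketch of the return-on-investment reasoning above: the value of generating validated architecture feasibility evidence rises with project size and criticality but falls with volatility, since volatile projects must continually redo the evidence. The function name, equal weights, and zero cutoff below are invented placeholders, not calibrated values from the project's business case analysis.

```python
def evidence_investment_advisable(size, criticality, volatility):
    """Return True when up-front feasibility-evidence generation is likely worth its cost.

    All inputs are normalized to [0, 1]. The weights and cutoff are hypothetical.
    """
    benefit = 0.5 * size + 0.5 * criticality   # value of evidence grows with size and criticality
    rework_penalty = volatility                # volatile projects must repeatedly redo the evidence
    return benefit - rework_penalty > 0.0


# Large, high-criticality, stable program: invest in feasibility evidence.
print(evidence_investment_advisable(size=0.9, criticality=0.8, volatility=0.2))  # True
# Small, low-criticality, volatile project: favor agile methods (prototyping, Scrum, XP) instead.
print(evidence_investment_advisable(size=0.1, criticality=0.2, volatility=0.8))  # False
```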

Deliverables

Publications

  • Boehm, B., Ingold, D., Dangle, K., Turner, R., and Componation, P., "Early Identification of SE-Related Program Risks," Proceedings of the 8th Annual Conference on Systems Engineering Research, March 15, 2010.

Research Team

Researchers

  • Barry Boehm, University of Southern California
  • Kathleen Dangle, Fraunhofer Center at the University of Maryland
  • Linda Esker, Fraunhofer Center at the University of Maryland
  • Forrest Shull, Fraunhofer Center at the University of Maryland
  • Rich Turner, Stevens Institute of Technology
  • Jon Wade, Stevens Institute of Technology
  • Mark Weitekamp, Stevens Institute of Technology
  • Paul Componation, University of Alabama in Huntsville
  • Julie Fortune, University of Alabama in Huntsville
  • Sue O’Brien, University of Alabama in Huntsville
  • Dawn Sabados, University of Alabama in Huntsville
  • JoAnn Lane, University of Southern California
  • George Friedman, University of Southern California
  • Dan Ingold, University of Southern California
  • Windsor Brown, University of Southern California

Collaborating Institutions

  • University of Southern California
  • Fraunhofer Center at the University of Maryland
  • Stevens Institute of Technology
  • University of Alabama in Huntsville