Entries Tagged as 'Measurement & Analysis'

Improving Data Quality Through Anomaly Detection

Data Quality, Measurement & Analysis

By Mark Kasunic
Senior Member of the Technical Staff,
Software Engineering Process Management Program

Organizations run on data. They use it to manage programs, select products to fund or develop, make decisions, and guide improvement. Data comes in many forms, both structured (tables of numbers and text) and unstructured (emails, images, sound, etc.). Data is generally considered high quality if it is fit for its intended uses in operations, decision making, and planning. This definition implies that data quality is both a subjective perception held by the individuals who work with the data and an objective property of the measurements derived from the data set in question. This post describes the work we're doing with the Office of Acquisition, Technology and Logistics (AT&L), a division of the Department of Defense (DoD) that oversees acquisition programs and is charged with, among other things, ensuring that the data reported to Congress is reliable.
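The full post describes the anomaly-detection work in detail. As a rough illustration only, the sketch below flags suspicious values in a numeric data column using the common interquartile-range (IQR) rule; the sample values and threshold are hypothetical and are not drawn from the AT&L work.

```python
# Minimal sketch (not the SEI method): flag numeric outliers with the
# interquartile-range (IQR) rule, one common anomaly-detection technique.
import statistics

def iqr_outliers(values, k=1.5):
    """Return values falling outside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

# Hypothetical example: reported monthly effort hours with one suspect entry.
effort_hours = [160, 152, 168, 158, 1610, 162, 155, 164]
print(iqr_outliers(effort_hours))  # -> [1610], a likely data-entry error
```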

Read more...

A New Approach for Developing Cost Estimates in Software Reliant Systems, Second in a Two-Part Series

Measurement & Analysis, Software Cost Estimates

By Robert Ferguson
Senior Member of the Technical Staff
Software Engineering Process Management Program


The Government Accountability Office (GAO) has frequently cited poor cost estimation as one of the reasons for cost overruns in acquisition programs, and software is often a major culprit. One study on cost estimation by the Naval Postgraduate School found a median increase of 34 percent in software size over the original estimate. Cost overruns lead to painful Congressional scrutiny, and an overrun in one program often cascades, depleting funds from others. The challenges encountered in estimating software cost were described in the first post of this two-part series on improving the accuracy of early cost estimates. This post describes new tools and methods we are developing at the SEI to help cost estimation experts get the information they need, in a familiar and usable form, for producing high-quality cost estimates early in the life cycle.
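As a rough, hypothetical illustration of why size growth hits cost so hard, the sketch below propagates the 34 percent median size growth through a COCOMO-style nonlinear effort model (Effort = A * Size^B). The coefficients are assumed nominal values and this is not the SEI tooling described in the full post.

```python
# Illustrative sketch only: under a nonlinear effort model, a 34% size
# increase produces a more-than-34% effort (and cost) increase.
A, B = 2.94, 1.10          # assumed nominal COCOMO II-style coefficients

def effort_person_months(ksloc, a=A, b=B):
    return a * ksloc ** b

estimated_size = 100.0                 # KSLOC in the original estimate (hypothetical)
actual_size = estimated_size * 1.34    # 34% median size growth (NPS study figure)

est_effort = effort_person_months(estimated_size)
act_effort = effort_person_months(actual_size)
print(f"Effort growth: {act_effort / est_effort - 1:.0%}")  # ~38%, more than 34%
```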

Read more...

Improving the Accuracy of Early Cost Estimates for Software-Reliant Systems, First in a Two-Part Series

Measurement & Analysis, Software Cost Estimates

By Robert Ferguson
Senior Member of the Technical Staff
Software Engineering Process Management Program

The Government Accountability Office (GAO) has frequently cited poor cost estimation as one of the reasons for cost overruns in acquisition programs, and software is often a major culprit. One study on cost estimation by the Naval Postgraduate School found a median increase of 34 percent in software size over the original estimate. Cost overruns lead to painful Congressional scrutiny, and an overrun in one program often depletes funds from another. This post, the first in a two-part series on improving the accuracy of early cost estimates, describes the challenges we have observed in accurately estimating software effort and cost in Department of Defense (DoD) acquisition programs, as well as in other product development organizations.

Read more...