Entries by 'Donald Firesmith'

Using V Models for Testing

Testing

By Donald Firesmith
Senior Member of the Technical Staff
Software Solutions Division

The verification and validation of requirements are a critical part of systems and software engineering. The importance of verification and validation (especially testing) is a major reason that the traditional waterfall development cycle underwent a minor modification to create the V model, which links early development activities to their corresponding later testing activities. This blog post introduces three variants on the V model of system or software development that make it more useful to testers, quality engineers, and other stakeholders interested in the use of testing as a verification and validation method.

Read more...

Common Testing Problems: Pitfalls to Prevent and Mitigate

Testing

Second of a Two-Part Series
By Donald Firesmith
Senior Member of the Technical Staff
Acquisition Support Program

In the first posting of this two-part series on common testing problems, I addressed the fact that testing is less effective, less efficient, and more expensive than it should be. This second posting highlights the results of an analysis documenting problems that commonly occur during testing. Specifically, the series identifies and describes 77 testing problems organized into 14 categories; lists the potential symptoms by which each can be recognized, its potential negative consequences, and its potential causes; and makes recommendations for preventing these problems or mitigating their effects.

Read more...

Common Testing Problems: Pitfalls to Prevent and Mitigate

Testing

First of a Two-Part Series
By Donald Firesmith
Senior Member of the Technical Staff
Acquisition Support Program

A widely cited study for the National Institute of Standards & Technology (NIST) reports that inadequate testing methods and tools annually cost the U.S. economy between $22.2 and $59.5 billion, with roughly half of these costs borne by software developers in the form of extra testing and half by software users in the form of failure avoidance and mitigation efforts. The same study notes that between 25 and 90 percent of software development budgets are often spent on testing. This posting, the first in a two-part series, highlights the results of an analysis documenting problems that commonly occur during testing. Specifically, the series identifies and describes 77 testing problems organized into 14 categories; lists the potential symptoms by which each can be recognized, its potential negative consequences, and its potential causes; and makes recommendations for preventing these problems or mitigating their effects.

Read more...

A Deeper Dive into the Method Framework for Engineering System Architectures

Acquisition

By Don Firesmith
Senior Member of the Technical Staff
Acquisition Support Program

Engineering the architecture for a large and complex system is a hard, lengthy, and complex undertaking. System architects must perform many tasks and use many techniques if they are to create a sufficient set of architectural models and related documents that are complete, consistent, correct, unambiguous, verifiable, and both usable by and useful to the architecture's many stakeholders. This blog posting, the second in a two-part series, takes a deeper dive into the Method Framework for Engineering System Architectures (MFESA), a situational process engineering framework for developing system-specific methods to engineer system architectures.

Read more...

The Method Framework for Engineering System Architectures

Acquisition

By Don Firesmith
Researcher
Acquisition Support Program

Engineering the architecture for a large and complex system is a hard, lengthy, and complex undertaking. System architects must perform many tasks and use many techniques if they are to create a sufficient set of architectural models and related documents that are complete, consistent, correct, unambiguous, verifiable, and both usable by and useful to the architecture's many stakeholders. This blog posting, the first in a two-part series, presents the Method Framework for Engineering System Architectures (MFESA), a situational process engineering framework for developing system-specific methods to engineer system architectures. The posting provides a brief history of situational method engineering, explains why no single system architecture engineering method is adequate, and introduces MFESA with a top-level overview of its components, a description of its applicability, and an explanation of how it simultaneously provides the benefits of standardization and flexibility.

Read more...