Entries by 'Donald Firesmith'

Four Types of Shift Left Testing

By Donald Firesmith
Principal Engineer
Software Solutions Division

One of the most important and widely discussed trends within the software testing community is shift left testing, which simply means beginning testing as early as practical in the lifecycle. What is less widely known, both inside and outside the testing community, is that testers can employ four fundamentally different approaches to shift testing to the left. Unfortunately, different people commonly use the generic term shift left to mean different approaches, which can lead to serious misunderstandings. This blog post explains the importance of shift left testing and defines each of these four approaches, using variants of the classic V model to illustrate them.

Using V Models for Testing

By Donald Firesmith
Senior Member of the Technical Staff
Software Solutions Division

The verification and validation of requirements is a critical part of systems and software engineering. The importance of verification and validation (especially testing) is a major reason that the traditional waterfall development cycle underwent a minor modification to create the V model, which links early development activities to their corresponding later testing activities. This blog post introduces three variants on the V model of system or software development that make it more useful to testers, quality engineers, and other stakeholders interested in the use of testing as a verification and validation method.

Common Testing Problems: Pitfalls to Prevent and Mitigate

Second of a Two-Part Series
By Donald Firesmith
Senior Member of the Technical Staff
Acquisition Support Program

In the first post of this two-part series on common testing problems, I addressed the fact that testing is less effective, less efficient, and more expensive than it should be. This second post highlights further results of an analysis that documents problems that commonly occur during testing. Specifically, this series identifies and describes 77 testing problems organized into 14 categories; lists the potential symptoms by which each can be recognized, its potential negative consequences, and its potential causes; and makes recommendations for preventing these problems or mitigating their effects.

Common Testing Problems: Pitfalls to Prevent and Mitigate

First of a Two-Part Series
By Donald Firesmith
Senior Member of the Technical Staff
Acquisition Support Program

A widely cited study for the National Institute of Standards and Technology (NIST) reports that inadequate testing methods and tools annually cost the U.S. economy between $22.2 and $59.5 billion, with roughly half of these costs borne by software developers in the form of extra testing and half by software users in the form of failure avoidance and mitigation efforts. The same study notes that between 25 and 90 percent of software development budgets are often spent on testing. This post, the first in a two-part series, highlights results of an analysis that documents problems that commonly occur during testing. Specifically, this series identifies and describes 77 testing problems organized into 14 categories; lists the potential symptoms by which each can be recognized, its potential negative consequences, and its potential causes; and makes recommendations for preventing these problems or mitigating their effects.

A Deeper Dive into the Method Framework for Engineering System Architectures

By Don Firesmith
Senior Member of the Technical Staff
Acquisition Support Program

Engineering the architecture for a large and complex system is a hard, lengthy, and complex undertaking. System architects must perform many tasks and use many techniques if they are to create a sufficient set of architectural models and related documents that are complete, consistent, correct, unambiguous, verifiable, usable, and useful to the architecture's many stakeholders. This blog post, the second in a two-part series, takes a deeper dive into the Method Framework for Engineering System Architectures (MFESA), a situational process engineering framework for developing system-specific methods to engineer system architectures.
