Finding the Solution to Crappy Software

Maturity levels are relative to each organization’s size and goals. By following this framework, you can identify and characterize the capabilities of your software quality practices. Then, the framework can suggest steps that you can take to begin improving. By realizing where your organization fits within the five stages, you are empowered to adopt a more proactive and structured approach to software quality throughout the application lifecycle.

Let’s take a look at the stages and see what each one means. I have also included a few real-world examples from my experience with various organizations.

Traditional Testing: A mostly reactive and ad hoc approach with very few formal test processes and unpredictable outcomes. Manual testing prevails and quality assurance (QA) teams are small with minimal budgets for tools.

Example: The Aerospace Manufacturing Division of Lear Siegler was clearly in this stage. Results were only as good as the individual developers’ testing expertise. As a rule, we experienced more failures than successes. Confidence in IT-produced software was almost nonexistent and users relied on IT as a last resort!

The solution was to implement simple, standard application lifecycle management (ALM) procedures that included quality checks at critical points in the development process. All developers used this standard, reporting progress and sharing results against the same baseline. As a result, individual program quality improved dramatically, and within two years we reached Level 4 operational lifecycle quality management (LQM) and successfully installed an integrated manufacturing resource planning system.

Test Automation: Testing is managed with some automated procedures. Tests are repeatable and predictable, and regression and load/performance testing are routinely performed. However, there is little focus on process, and any improvement initiatives are handled on a task-by-task basis.
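To make "repeatable and predictable" concrete: at this stage, checks that a Level 1 organization would perform manually are captured as scripts that run identically on every build. The sketch below illustrates the idea with a hypothetical `calculate_discount` function and made-up business rules; it is not drawn from any of the organizations described here.

```python
# Minimal sketch of a repeatable automated regression test.
# calculate_discount and its tiered rules are hypothetical,
# used only to illustrate the technique.

def calculate_discount(order_total: float) -> float:
    """Return the discount for an order total (hypothetical rule set)."""
    if order_total >= 1000:
        return order_total * 0.10
    if order_total >= 500:
        return order_total * 0.05
    return 0.0

def test_no_discount_below_threshold():
    assert calculate_discount(499.99) == 0.0

def test_five_percent_tier():
    assert calculate_discount(500) == 25.0

def test_ten_percent_tier():
    assert calculate_discount(1000) == 100.0

if __name__ == "__main__":
    # A test runner such as pytest would discover these functions
    # automatically; they also run as a plain script on every build.
    test_no_discount_below_threshold()
    test_five_percent_tier()
    test_ten_percent_tier()
    print("all regression checks passed")
```

Because the same assertions run against every release candidate, a regression in the discount logic surfaces immediately instead of depending on whether an individual developer remembers to retest it.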

Quality Management: Test process automation is utilized to drive improved testing efficiency. Requirements are better defined and there is clearer visibility of results. The QA function has a project or team focus, and is able to respond effectively to change. Often there is a drive to consolidate test tools, and to achieve a targeted return on investment (ROI) on any new tool purchase.

Example 1: The development organizations I found in health care and retail food distribution fell into this level of testing maturity. For their critical business systems (in health care, claim processing systems and, in retail, food scheduling/distribution systems), they had established a separate quality control team with rigorous testing processes to ensure system integrity. However, for all other development efforts it was back to Level 1, maybe Level 2 at best. The goal was to bring the integrity of the critical business systems to a satisfactory level.

The challenge was that the existing process was very unresponsive. It was sometimes six months before a new release of the software could be placed into production—regardless of the magnitude of changes. Additionally, all other development quality efforts were unreliable and dependent on the individual developer, resulting in very frustrated users.

The solution was to first implement a standard ALM that integrated and streamlined the process used by the existing QA management function. The release turnaround improved to bimonthly implementations. Then all developers were trained to use the same standard ALM with built-in quality processes and performance checkpoints. This caused an almost immediate improvement in software quality.