Voices on Project Management


Improving the Testing Process

I work with an IT software development organization, so most of my posts are specific to software development projects. During the testing phase, we typically experience the following problems:

1. We expend approximately 15 to 20 percent of the development effort in the bug-fixing phase.
2. Our team discovers a lot of missing functionality during the testing phase.
3. Quality assurance (QA) and development teams have different mindsets, so they understand the same feature in different ways.
4. Test cases written by QA are little more than a conversion of the software requirements specification document into an Excel spreadsheet.
5. During and after the coding phase, the developer often doesn't test the application himself and instead leaves everything to QA. He tends to believe bug identification is QA's task and that the developer should only be responsible for fixing bugs.
I think the software testing cycle works on the 90:10 rule: Ninety percent of the project takes 10 percent of the allocated time, while the remaining 10 percent takes 90 percent of the time. After giving this process a lot of thought, we came up with some solutions that may reduce the testing and bug-fixing cycle:

1. Let the QA and development teams both review the requirements and get necessary clarifications from the client.
2. Ask the developer to give a presentation of his project understanding to QA and his module lead.
3. Have QA prepare the test cases.
4. Ensure test cases cover the functionality as well as the use cases and scenarios.
5. Have the developer review and log defects, too.
6. After the completion of the coding phase, ask the developer to run the high-priority test cases prepared by QA.
7. The developer should submit the test log to QA.
8. QA then starts the testing.
9. Each discovered bug should have a corresponding test case ID. If a test case doesn't exist for the bug, QA should add a new one. This ensures the test cases cover all the use cases.
10. In the test log, enter the bug ID for each failed test case. This ensures all bugs are raised and tracked to closure.
11. Perform a root cause analysis (RCA) for each bug and improve the coding process.
12. Track the bugs raised by QA versus the bugs raised in user acceptance testing or in production.
13. Test cases should be data-oriented, and QA should be trained to write simple SQL (Structured Query Language) queries.
14. The test log should show the number of rounds executed, with the number of test cases that failed, passed or were not executed in each round.
15. Track the actual effort spent in the testing and bug-fixing phases to better plan the next module or project.
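Items 9, 10 and 14 above amount to a simple traceability data structure. The sketch below is one minimal way to model it; the test case IDs, bug IDs and field names are invented for illustration and not tied to any particular test-management tool:

```python
from collections import Counter

# Hypothetical test log: one entry per test case per round of testing.
# status is "passed", "failed" or "not_executed"; every failed case
# must carry a bug ID so each bug is tracked to closure (items 9-10).
test_log = [
    {"round": 1, "case_id": "TC-01", "status": "passed",       "bug_id": None},
    {"round": 1, "case_id": "TC-02", "status": "failed",       "bug_id": "BUG-101"},
    {"round": 1, "case_id": "TC-03", "status": "not_executed", "bug_id": None},
    {"round": 2, "case_id": "TC-01", "status": "passed",       "bug_id": None},
    {"round": 2, "case_id": "TC-02", "status": "passed",       "bug_id": None},
    {"round": 2, "case_id": "TC-03", "status": "passed",       "bug_id": None},
]

def untracked_failures(log):
    """Failed test cases with no bug ID -- these break traceability."""
    return [e["case_id"] for e in log if e["status"] == "failed" and not e["bug_id"]]

def summary_by_round(log):
    """Per-round counts of passed / failed / not-executed cases (item 14)."""
    rounds = {}
    for entry in log:
        rounds.setdefault(entry["round"], Counter())[entry["status"]] += 1
    return rounds

print(untracked_failures(test_log))  # an empty list means every failure is tracked
for rnd, counts in sorted(summary_by_round(test_log).items()):
    print(rnd, dict(counts))
```

A report like this, regenerated after each round, also gives the raw numbers needed for item 15's effort tracking.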
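Item 13's data-oriented checks can be as simple as a SQL query run against the application's database and compared with an expected value. This sketch uses an in-memory SQLite table, with a table name, columns and expected total invented purely for illustration:

```python
import sqlite3

# In-memory stand-in for the application database; the "orders" table
# and its columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO orders (amount, status) VALUES (?, ?)",
    [(100.0, "shipped"), (250.0, "shipped"), (75.0, "cancelled")],
)

# A data-oriented test case: the kind of simple query QA can write to
# verify that a figure the application reports matches the raw data.
expected_shipped_total = 350.0
(actual,) = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE status = 'shipped'"
).fetchone()

print("PASS" if actual == expected_shipped_total else f"FAIL: got {actual}")
```

The point is not the query itself but the habit: each such check gets a test case ID and an expected result, so a mismatch raises a trackable bug rather than a vague suspicion.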




The views expressed within the PMI Voices on Project Management blog are contributed from external sources and do not necessarily reflect the views and opinions of PMI.




Hello Sanjay,

I work in an IT software development and consulting company, and I agree with your observations about the testing phase. Thanks for sharing your solutions with us.

Best Regards,

Leticia Molina
FYC Group

While I agree with Mr. Saini's observations concerning the problems that occur all too frequently during the testing phase of a project, I disagree with one aspect of his recommendations, namely, that developers should run test scripts prepared by QA.

Developers are accountable for delivering fully functioning, bug-free, thoroughly tested code. They should develop their own tests that validate every piece of code, including all branching logic, case structures, unexpected result processing, etc., in addition to business design and functionality. By doing so, they demonstrate their understanding of the business, the business need and the business processes involved.

Developers are not prima donnas, nor are they incompetent when it comes to testing. Like most of us, they would prefer to code rather than document or undertake the more tedious parts of their job. Make no mistake about it: they are professionals who must not be coddled, but held to professional standards. They are the primary testers, and it is their job to find and fix any problems before they reach QA, let alone production.

QA is the second line of defense and brings a different perspective to the testing process. If QA develops the test scripts that coders use to test, then instead of applying a different set of tests to the system, you get redundancy and a greater chance that errors will get through to production.

As a rule, I do not insist that my coders create fully documented test scripts. They can if they choose. I may insist that they develop a comprehensive list of test conditions to ensure they are systematic in testing their code and show some proof they have tested all of them.

As every project participant knows, the earlier a problem is found, the less costly it is to fix. This applies to testing as well.

I recognize that developers won't always deliver perfect, bug-free code; that is rare indeed. But it is also true that the higher we set our standards and expectations, the better the results. Aim high, ladies and gentlemen!

