How much time should your project spend on development vs. time on quality?
I've received a bunch of email over the past year asking me how much time a project should allocate to development and how much to quality. To me, that's a funny question, because I think of quality as integrated with development.
So, let's reframe the question: “How much time should we spend on making sure our development works during development and at the end of development?” I can answer that question better.
Software projects spend anywhere from 10% to 80% of their time on testing activities. Since that's such a wide range, I wrote a paper about the ratio of developers to testers.
Cem Kaner disagrees with my paper, and published something different at PNSQC 2001. I can't seem to find that paper anywhere online.
Kathy Iberle attempts to describe the dynamics; click on the ratio article in whatever form you like.
Here's what I think: If you're not spending at least 20% of the project's activities on preventing, detecting, and fixing problems, you're not spending enough time on quality, no matter what your market says. Now, there are several ways to do that:
- you can inspect designs and code
- you can walk through designs and code
- you can develop tests for the product
- you can measure a whole bunch of things, such as the fault feedback ratio and the cost to fix a defect throughout the project, and take action based on where you are

The problem is that the best time and resources spent on software quality are *developer* time and resources. Testing time and resources are a good way to manage release risk, but they are not a proactive approach. System testing is a reactive approach, and reactive approaches always take longer and cost more.
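As a sketch of one of those measurements: the fault feedback ratio is the fraction of fixes that bounce back, either reopened or causing new defects. The function below is a minimal illustration, assuming you can count total fixes and bad fixes from your defect tracker; the names are mine, not from any particular tool.

```python
def fault_feedback_ratio(total_fixes: int, bad_fixes: int) -> float:
    """Fraction of fixes that bounced back: defects reopened after a
    fix, or fixes that introduced new defects. A rising ratio over
    the project suggests the fixes themselves are churning."""
    if total_fixes == 0:
        return 0.0
    return bad_fixes / total_fixes

# Example: 40 fixes attempted this iteration, 6 of them came back.
ffr = fault_feedback_ratio(40, 6)
print(f"FFR: {ffr:.0%}")  # prints "FFR: 15%"
```

Tracking this per iteration, rather than once at the end, is what makes it useful for taking action early.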