Imagine you're a fly on the wall in a readiness review meeting—a meeting of the project and senior managers to see if the product is ready to release.
Senior manager: “Where are we with the testing?”
Test manager: “Oh, here's the defect data and the test data and…”
Senior manager: “No, tell me where we are with the testing!”
Later on, the senior manager says to a colleague: “That test group is a black hole. The product goes in, but nothing comes out.”
If this sounds familiar, you've encountered the data-but-not-information problem. The test group has defect and test data out the wazoo, and may even have some data about the product's performance. But these reports inundate people with data without providing information about the product under test – and that's what your customers want to know.
Who Needs What Kind of Information?
Customers inside the organization, such as developers, project managers, and senior managers, look to the test group for data translated into meaningful information. They may also look to testers for explanations of risks and evaluations of whether to release the product. When you think about what to measure and how to report the results of testing, consider all of your customers.
Developers want feedback about their development efforts, so the location and frequency of defect occurrence are important to them. But if you only provide defect data, you haven't provided the big picture: the test group's overall assessment of the product.
Senior managers and project managers want to know how much progress you've made on testing the system, the kinds of problems the testing effort has uncovered, the work remaining, and the risks of releasing.
How Can You Present the Information Effectively?
Just as a car's dashboard supplies the information you need while driving, a testing dashboard presents an integrated set of information about the testing effort. If you haven't yet thought about what your testing dashboard should contain, consider a one-page summary of progress that looks something like this:
| Area/Module/Feature | Last Test Date | State | Next Planned Test |
| --- | --- | --- | --- |
| Module A | 1/12, Build 37 | Blocked. See Ann for details. | Build 42 |
| Module B | 1/13, Build 40 | Passed all regression tests. | Recheck with Build 42 |
| Module C | 1/12, Build 38 | Passed all regression tests. Exploratory testing ongoing. | Ongoing checking |
| Module D | 1/13, Build 39 | Vijay and Dan working together on regression tests. Major problems. | Build 41 |
| Module E | 1/13, Build 40 | Passed all regression tests. All fixes verified. No more effort until final cycle. | Final build |
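If you track this status in a spreadsheet or tool, generating the one-page summary can be automated. The sketch below is a minimal illustration, not a prescribed tool; the module names, field names, and `render_dashboard` function are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ModuleStatus:
    """One row of the testing dashboard (fields are illustrative)."""
    area: str        # module or feature under test
    last_test: str   # date and build, e.g. "1/12, Build 37"
    state: str       # current status in plain language
    next_test: str   # next planned test effort

def render_dashboard(rows):
    """Return the dashboard as a one-page plain-text summary."""
    header = ("Area", "Last Test", "State", "Next Planned Test")
    table = [header] + [(r.area, r.last_test, r.state, r.next_test) for r in rows]
    # Pad each column to the width of its widest cell so columns line up.
    widths = [max(len(row[i]) for row in table) for i in range(4)]
    return "\n".join(
        "  ".join(cell.ljust(w) for cell, w in zip(row, widths)) for row in table
    )

rows = [
    ModuleStatus("Module A", "1/12, Build 37", "Blocked. See Ann for details.", "Build 42"),
    ModuleStatus("Module B", "1/13, Build 40", "Passed all regression tests.", "Recheck with Build 42"),
]
print(render_dashboard(rows))
```

The point is not the code but the discipline: every row carries both the current state and the next planned step, which is what keeps the dashboard readable at a glance.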
With a dashboard, all the information is presented in one place. Highlighting the overall status draws the reader's eye, so managers and developers can easily zero in on the summary data they care about. Every line in the dashboard shows the current status and the next planned step. Module A has severe problems, but the dashboard reports when (Build 42) the next test effort will start. For Module D, managers can read that a tester and a developer are working together on the “major problems,” reducing the likelihood that they'll micromanage you.
Strengthen the dashboard by adding a defect tracking chart (defects found, closed, and remaining open by week), a test progress chart (tests planned, run, and passed by week), and progress toward meeting the release criteria. Now you have data that supports your release-risk assessment and shows how you're using the test staff and other resources to complete the testing work.
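The “remaining open” series in a defect tracking chart is simply a running balance: cumulative defects found minus cumulative defects closed. A minimal sketch, assuming you have simple weekly counts (the numbers below are illustrative, not from any real project):

```python
from itertools import accumulate

# Weekly defect counts (illustrative numbers only)
found_per_week = [12, 18, 15, 9, 4]
closed_per_week = [5, 10, 14, 12, 8]

# Remaining open at the end of each week:
# cumulative found minus cumulative closed
cumulative_found = list(accumulate(found_per_week))
cumulative_closed = list(accumulate(closed_per_week))
remaining_open = [f - c for f, c in zip(cumulative_found, cumulative_closed)]

print(remaining_open)  # [7, 15, 16, 13, 9]
```

Plotted week by week, a remaining-open curve that keeps climbing near the planned release date is exactly the kind of information, rather than raw data, that belongs in a readiness review.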
You're in the Information Business
Providing data is important. You'll need the data to create a one-page status report for your customers across the organization, but furnishing data alone is not enough. Consider what other people need to know about the product or the status of testing. Remember: You are in the information gathering and dissemination business.
I thank Esther Derby, Dwayne Phillips, Keith Ray, and James Tierney for their helpful review of this column.
View other types of testing dashboards by James Bach at www.satisfice.com/presentations/dashboard.pdf, or read more in Rex Black's Critical Testing Processes (see Chapter 15).