I presented a webinar today, Becoming a More Agile Tester. Here's the PDF. (It's a talk, so if you read it and think you've missed something, you have. Send me email with your question.) I've been thinking a lot about test assets these days, and here's a highlight from the presentation: a comparison of how nimble your tests are, depending on what kinds of tests you have. The reason I'm thinking about the nimbleness of tests is simple. Organizations that have a large investment in the upper left corner of this table (and little or no ability to develop more tests in the lower right part of the table) can't easily move to agile lifecycles.
| Types of tests | Requirements-based (including use cases) | Architecture-based | Design-based | Code-based |
|---|---|---|---|---|
| Manual, GUI-based | much less nimble | much less nimble | not sure these exist | not sure these exist |
| Automated, GUI-based | much less nimble | much less nimble | not sure these exist | not sure these exist |
| Manual, under-the-GUI | somewhat more nimble | somewhat more nimble | much more nimble | much more nimble |
| Automated, under-the-GUI | somewhat more nimble | somewhat more nimble | much more nimble | much more nimble |
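To make the "automated, under-the-GUI" row concrete, here is a minimal sketch. The `discount_price` function and its business rule are hypothetical, invented for illustration; the point is that the check exercises the logic directly, below any screen or form, so it keeps working when the GUI is redesigned.

```python
# Hypothetical example of an automated, under-the-GUI check.
# A GUI-based version of this test would drive a form field and click
# a button; this one calls the underlying function directly, so it
# survives UI redesigns and runs fast enough for every build.

def discount_price(price: float, quantity: int) -> float:
    """Hypothetical business rule: 10% off for orders of 10 or more."""
    if quantity >= 10:
        return round(price * quantity * 0.9, 2)
    return round(price * quantity, 2)

# Under-the-GUI checks: no screen layout, no manual stepping.
assert discount_price(5.00, 1) == 5.00
assert discount_price(5.00, 10) == 45.00
print("under-the-GUI checks passed")
```

A suite built from checks like this is nimble in the table's sense: when the design changes, you revise a function call, not a recorded click sequence.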
There's at least one thing wrong with this table: it doesn't discuss the cost of being nimble. However, I can't make good generalizations about how much each kind of test costs, because the table also doesn't discuss how much risk is associated with not having a particular kind of test. Both cost and risk are particular to each product.
But I do know one thing. The more of your tests that are manual and GUI-based, the harder it is to move to an agile lifecycle. And agile lifecycles are the best at managing technical and schedule risk.