What Should Done Mean?

Josh Kerievsky has an intriguing post about Redefining Done. The idea:

A story isn't done until it is being used by real users in production and has been validated to be a useful part of a product.

I have trouble with this definition:

  • The development team is dependent on other people getting the product out to the users, especially in the case of shrink-wrapped software or a hardware product. The development team cannot deploy the product alone. (Rarely can a development team deploy a product alone; it's possible with SaaS, but not probable. For other products, you need a program team or some other mechanism to make sure the marketing is done, the bill of materials is created, etc. Deployment is almost always separate from development. I'll blog later about whether it should be.) Just as release criteria need to be under the team's control, the definition of done needs to be something the team can accomplish.
  • The development team is dependent on the users actually using the product. To me, this violates the separation of what to build (from the product owner or customer) and the how to build it (from the development team). There is one person, a product owner or customer, talking to the team. Yes, that person needs to know what real users will use. And I don't see how a team could talk to a gaggle of users and know what to build. The organization's strategy is part of what to build.
  • It's possible that you would have an entire release of product waiting for “done.” Features would be some sort of done, but not all the way done, because the users had not used the product and validated it. This violates, for me, the notion of being done for now, but having demonstrable, releasable product at the end of an iteration or when a card is marked done.

I see that the feedback from real users is helpful, maybe even necessary. I've been on projects where I certainly would have liked the feedback earlier than at release. Maybe the real value of this definition is to jiggle us into thinking about how to get feedback from users earlier, in the same way as in How Short Can Your Iterations Be? Thinking and rethinking about what done means (or how long your iterations are) to reduce waste in the project and deliver a useful product is a good thing.

I still have trouble with done not being under control of the development team. If a user or user surrogate has explained what he/she wants, and the development team has implemented it, and that person has seen a demo or used it, I have to believe the feature is done.

I'm not convinced that expanding the definition of done helps a project team. It does help us think of the obstacles that prevent us from obtaining feedback as early as a feature is complete. I'm still going with the idea that done needs to be under the control of the development team.

13 Replies to “What Should Done Mean?”

  1. I agree that this definition is challenging if not impossible to implement in a helpful way in many organizations, especially larger ones. However, I think that if you can’t connect what you’re doing at any stage of development (PO, developer, tester, ops) with what users are actually getting, then you are asking for trouble. (Yes I know that almost no one does this, even so!)

    Joshua is arguing for systems thinking. Using any other definition of done besides his will create a system that can be “gamed” to the detriment of the company. In other words, development can “win” by delivering tons of features quickly that take months to be released. Product owners can “win” by defining hundreds of features that the market doesn’t care about. Ops can “win” by taking months to deploy. Only by tying everyone’s metrics together can you ensure that everyone has their eyes on the same goal.

    On the other hand, if that is the _only_ measure of progress you use, I don’t see how anything could work at all. I’m imagining a board that has columns for each function (ux, dev, test, ops, production, validated, etc) and each story goes through each of the columns. Each group’s responsibility is to get stories out of their column as quickly as possible – but the difference is that no one gets any “points” until the story gets to the end of the board and users have validated it.

  2. I understand your point, Johanna. The whole Agile methodology is developer-centric. This definition tries to expand the scope of Agile. I think that product quality would probably improve if one used the new definition, however. The product's value to the organization would probably also improve. Jeremy’s point about “gaming” the system is a valid one. Getting stories “done” as far as a developer is concerned may not serve the sponsoring organization well. You give them what they ask for, but that may not be what they need. By extending the definition of “done,” the developer has more of an interest in creating something that will be useful to the people who actually use it rather than merely compliant with what the product owner asked for. It makes the team larger and gets everyone working toward the same goal rather than compartmentalizing work.

  3. I have trouble (as a developer) with producing software and systems that are not used or not good to use. *When possible* I’d go for Joshua’s definition of done (can we call it done-done or is that term for something else?).

    But of course it is not always possible.

  4. What Josh suggests is a benefit analysis determining whether the Product Owner’s priority selection was appropriate/valid. He is looking for a definition of success, not a definition of done. Plenty of features have met the measure of done and turned out to be unsuccessful as a result of invalid priority.

    Take, for example, a changing environment. A feature is developed to the full extent of *the team’s* definition of done. The Product Owner (and stakeholders) are happy, but days or weeks before shipping, a competitor releases a competing feature which nullifies the innovation of the feature being produced. This feature may no longer meet the needs of the customer as a result of these circumstances, but it certainly doesn’t mean the story isn’t done.

  5. The comments reveal a serious issue, which is team commitment. A committed team isn’t gaming a system. They are working collaboratively to deliver as much value as possible.

  6. I’m not sure why you wouldn’t just track both metrics. The velocity and/or cycle time for any one part of the value stream is just as important to know as the velocity/cycle time for the entire thing. It’s hard to improve anything without both sets of data.

  7. There are stages of done. Hopefully there’s a stage where the issue is defined – and that’s done (usually ‘for now’ as there will undoubtedly be updates to it in the future). There is a stage where the requirements for the solution are done (or baselined as there will undoubtedly be updates to them as the team moves forward). Then there is the development of the various iterations – all of which will eventually be done. And of course testing of each iteration. Then deployment. etc.

    At each stage it’s possible to be ‘done’ yet the entire project isn’t complete. You can be done with a task and yet have much left to do to finish your share of the work.

    By the time a product is shipped for use by the consumer, the original Business Analyst who recorded and baselined the requirements may have been done for months – maybe even a year. But the project isn’t done until the product is shipped and used and verified in the real world. Even then, from the support aspect it’s not done, but it’s usually done enough to disband the project team.

    I think Josh is indeed defining “done-done.” As in The END! Finis!

  8. I’m 100% with Josh on this issue. You say that his approach “… violates the separation of what to build (from the product owner or customer) and the how to build it (from the development team).” Where did this separation principle come from and just why is it a good idea?

    I have spent many years working on products that happened to involve software, and to me, the idea that the software is separate from the product is a non-starter. Expecting a single person to decide what the software should do violates the best research in agile development (see for example Angela Martin’s work) and also the best research in collaboration (well summarized in “The Wisdom of Crowds” by Surowiecki).

    I have always struggled with the concept of projects, because they tend to have this artificial separation of the project objectives from the overall objectives. I agree with Kent Beck when he said: “There is no such thing as a technical success.” If you think about it, the term “technical success” is a euphemism for failure.

  9. Wow — lots of comments so quickly! Thanks for the thought provoking article!

    I think the question isn’t so much about the definition of done as about what is a reasonable scope for the team. I.e., at what point is there a natural wall for something to be completed and thrown over? Deployment might be a wall, shrink-wrapping might be one, marketing might be one, etc.

    In cases where the team can be expanded to include everything from the initial concept through deployment, marketing, and support … then I’d expand the definition of done to include all of those bits.

    But this is not always practical, so the definition of done needs to be kept closer to the scope the team can control or influence.

    Match the definition of done to the scope of the team’s influence.

  10. It seems to me that the value in having “done” further down the development cycle is that it discourages us from ‘throwing a story over the wall’ when it’s done as far as we’re concerned.

    I wrote about a few of the definitions of ‘done’ that I’ve come across in a blog post last year – http://www.markhneedham.com/blog/2009/01/04/agile-when-is-a-story-done/

    I think it’s still interesting to question why we need to ask when something is done…what do we get from knowing that it’s done?

    In some ways I think it’s about being able to track how much a team is producing so that we can predict how many features we’re likely to be able to add to our product in a given amount of time.

    A lot of people I’ve worked with get the majority of their satisfaction from knowing that they’ve completed something so from that perspective it makes sense to have a definition which is within the team’s control.

  11. As usual, I want to ask about the purpose of the definition of “done” before commenting on its appropriateness. On the one hand, we don’t want to set the bar so high for a project team that the team can never reach it, but on the other hand, if we don’t take “done” seriously, then the organization will never realize the value of agile software development — not even a little.

    I don’t belong to the camp that claims that one needs to demand the impossible to get great results; but at the same time, unrealistic goals tend to be easier to reach than one thinks. I’d like to explore what it would take, even in a nastily complex large-scale corporate environment, to really deliver software to customers. In some cases, bypassing 90% of the system is easier than changing it in increments.

    I strongly recommend against letting the team’s influence determine the definition of done, except as a temporary compromise. If you stop there, then you’ll never get past test-first programming.

  12. For my current teams, Done means, “We strongly believe it is ready for production.” We support that statement with other data, including things like having deployed it in a production-like environment.

    If you’re not getting Done at the end of each iteration, then you’re not Agile. (http://kasperowski.com/2010/06/if-youre-not-done-youre-not-agile.html) If your iteration’s-worth of stuff isn’t ready for production, then you have introduced new debt that you’ll have to repay before you go into production.
