Estimation and the Sunk Cost Fallacy

I’m not a fan of using schedule or cost estimates as a way to value the projects in your project portfolio. If you do, you are likely to miss the potentially transformative projects or programs.

In Manage Your Project Portfolio, I have an entire chapter devoted to ways to evaluate your project portfolio: business value points (not story points), waste, risk, to name just three. I wrote about cost of delay on this blog a while ago.

Last night, I heard Jutta Eckstein give a talk about Beyond Budgeting. That talk was the genesis for our Agile 2014 workshop, Diving for Hidden Treasures: Finding the Real Value in Your Project Portfolio.

While Jutta was speaking about estimation and sunk cost, I realized something. One of the attendees asked about estimation. A paraphrase of that question is, “How can we start if we don’t know how long it will take?”

The projects that are worth doing have risk and are complex. What do you know about these projects? You can’t predict them. You cannot know how long they will take.

Remember the post Why Cost is the Wrong Question for Evaluating the Projects in Your Project Portfolio? If you don’t do the risky projects, if you only do the safe projects, you might maximize short-term gains. You lose the long-term market share, because you don’t do the risky projects.

Many managers find themselves in this position. The Operations Committee or the PMO or “someone in charge” wants an estimate of the budget or of when this project will be done. You provide a 3-point estimate: optimistic, realistic, and pessimistic. Or, you provide an estimate with a percent confidence. (You have read Essays on Estimation.)
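
To make that concrete, here is a minimal sketch of a 3-point estimate, assuming the conventional PERT-style weighting and a normal approximation for the confidence level; the weights and numbers are illustrative, not anything prescribed here.

```python
# A minimal sketch of a three-point estimate. The (1, 4, 1) weighting and the
# normal approximation are PERT conventions; the numbers are invented.

def three_point_estimate(optimistic, realistic, pessimistic):
    """Return an expected duration and a rough standard deviation, in weeks."""
    expected = (optimistic + 4 * realistic + pessimistic) / 6
    spread = (pessimistic - optimistic) / 6
    return expected, spread

expected, spread = three_point_estimate(optimistic=8, realistic=12, pessimistic=20)

# One standard deviation above the mean is roughly 84% confidence under the
# normal approximation.
print(f"Expected: {expected:.1f} weeks")
print(f"~84% confidence: {expected + spread:.1f} weeks or less")
```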

They don’t like that estimate, so they take the optimistic estimate, or they remove the percent confidence. That’s the first date you can’t prove you won’t make.

Never mind that the estimate is a guess. Never mind that you had caveats. Never mind that you had assumptions. Your estimate is now a commitment.

That date arrives. You are not done. You are in a variant of the 90% done schedule game. Why? Because you had an estimate. Maybe you did not re-estimate and update it. Even if you did, the people “in charge” might not have wanted to hear your updated estimate.

Now, you might be in the sunk cost fallacy. The sunk cost fallacy says, “We have spent so much money on this, we may as well finish it. We can’t recover that cost. Our estimate says we only have xx left to go.”

The difference between evaluating the project portfolio on value and estimates is that when you look at the business value, you ask this question first:

Should we do this project at all?

You always ask that question. It’s the zeroth question. Even if the project has been proceeding for a year. Because sunk cost doesn’t matter. You still have to support the system after you release it. Repeat that sentence: you still have to support the system after you release it.

If you evaluate the project portfolio based on estimates, you don’t ask the “Should we” question. You ask about the estimates. You don’t ask about follow-on after the release. You don’t go meta, which is the point of the question.

Sunk cost will catch you every time. Can you avoid the sunk cost fallacy? Of course. Do estimates cause the sunk cost fallacy? Of course not. They contribute to it.

In my experience, you are more likely to be caught by the sunk cost fallacy if you use estimation, because you are less likely to go meta on the value of your projects.

Is it okay to know if this project is bigger than a bread box? Of course. Should you use those estimates as a way to hold a project to its date commitments? No. Not if you want to work in an agile way and update the backlog as you work through iterations or flow the work. See Trust, Agile Program Management, & Being Effective for why you want to vary the backlog.

If you use estimates as a way to evaluate the projects in your project portfolio, beware of the sunk cost fallacy. You would be better off asking, “How much value does this project have for us, compared to the other projects we have to do?”
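
If it helps to see what that comparison might look like in practice, here is a rough sketch that ranks projects by cost of delay divided by duration (CD3), one possible value measure; the project names and numbers are hypothetical.

```python
# A rough sketch of ranking projects by value rather than by estimate, using
# cost of delay divided by duration (CD3) as one possible value measure.
# Project names and numbers are hypothetical.

projects = [
    # (name, cost of delay per week, rough duration in weeks)
    ("Replatform billing", 50_000, 20),
    ("Compliance fix", 15_000, 2),
    ("New analytics module", 30_000, 8),
]

def cd3(cost_of_delay_per_week, duration_weeks):
    """Value lost per week of delay, divided by the weeks of work it takes."""
    return cost_of_delay_per_week / duration_weeks

# Highest CD3 first: the shortest, most delay-sensitive work rises to the top.
for name, cod, weeks in sorted(projects, key=lambda p: cd3(p[1], p[2]), reverse=True):
    print(f"{name}: CD3 = {cd3(cod, weeks):,.0f} per week")
```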

8 Comments

  1. One way to answer the zeroth question is to find a home for the project in the strategy for the business. It may be that the project is a “hygiene” project – we have to have it in order to stay in business. But that can be in the strategy, too.

    Here’s an approach known to be successful in enterprise IT
    http://www.slideshare.net/galleman/notes-on-balanced-scorecard

    Page 88 was our BSC for Enterprise IT on a multi-billion-dollar Department of Energy program. That approach has been syndicated in Health Insurance, Power Companies, Bio-Pharma, and City and State Government.

    For product development, a different “perspective” is needed, since internal IT is usually not revenue generating and products are.

    Here’s a map connecting the strategy with the operations of enterprise IT.
    http://www.slideshare.net/galleman/balanced-scorecard-based-enterprise-it-organization

    “Line of sight visibility to produced value” was the phrase we learned to use from the Harvard BSC team that came to help us define our value to the organization.

    • Hi Glen, nice to see you here! Thanks for your comment.

      I have managed programs for product organizations and IT. I guess I’ve been lucky. I have always found deliverables that we could have every month or more often, so I didn’t have to have the kind of metrics that you describe. Oh, we had to have data, absolutely. We had to manage cost, especially for the hardware/software programs. Those NREs can kill you.

      Maybe we agree more than we disagree. (Again :-) In my project portfolio book, I have a chapter on how to define your mission. If you don’t know where you’re headed, any old place is fine. To me, the vision is project- or program-specific. The portfolio is how you decide your strategy. (You decide your strategy; the portfolio decisions are the implementation.)

      I really liked the examples of the scorecards. Thank you. If people review your presentations and read your content and see those examples, they can understand much more about how this business of a balanced scorecard can work. Especially once they move up from the actual project/program level to what managers need to know in very large organizations.

      Thanks, Glen.

      • Johanna,

        Thanks for the quick response. We spend the majority of our time these days doing “triage” on enterprise IT and software intensive defense and space programs. Over the past several years a trend has emerged, which resulted in the latest book Performance-Based Project Management®.

        A primary root cause of project performance shortcomings is the failure to know what “done” looks like in units of measure meaningful to the decision makers. In the space and defense world, “done” measures are flowed down from the “capabilities based planning” process. These “capabilities” are mission capabilities, developed through the strategy process inside the Pentagon or NASA Headquarters. Setting aside for the moment that some of these strategies are ill-conceived (ya think!), the Measures of Effectiveness, Measures of Performance, Key Performance Parameters, and Technical Performance Measures are “baked into” the portfolio assessment process.

        We take similar – and successful – approaches on the ERP side, both commercial and government. The Feds are one of the largest consumers of ERP on the planet.

        Even with those monthly deliverables, some assessment of overall mission success is needed. BTW, in the formal procurement world of Earned Value Management, there is the 44-day rule. We can’t go more than 44 days (2 calendar months) without tangible evidence (physical percent complete) of progress to plan. This means the budgeted cost for the work is assessed against the Performance and Effectiveness of the deliverables. In manned space flight this is done twice a month.
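
        As a minimal illustration of the progress-to-plan arithmetic behind that assessment, here are the standard earned value indices in a few lines of Python; the figures are invented.

        ```python
        # A minimal illustration of the standard earned value indices behind
        # "progress to plan." All figures are invented.

        planned_value = 440_000  # budgeted cost of work scheduled to date
        earned_value = 380_000   # budgeted cost of work actually done (physical % complete)
        actual_cost = 420_000    # what that completed work actually cost

        spi = earned_value / planned_value  # schedule performance index: < 1.0 means behind plan
        cpi = earned_value / actual_cost    # cost performance index: < 1.0 means over budget

        print(f"SPI = {spi:.2f}, CPI = {cpi:.2f}")
        ```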

        While those fine-grained measures answer the question “how long are we willing to wait before we find out we’re late?” (two weeks), a higher-level assessment of progress to plan is needed. Rolling Wave planning and the Integrated Management Plan (IMP) – a maturity assessment of the capabilities (back to those units meaningful to the decision makers) – must be in place.

        Those incremental and periodic deliverables have a home in the broader system architecture of the solution. For example, the biggest program we’re working on at the moment – http://www.gps.gov/systems/gps/control/ – is nearly all software and COTS hardware. On this program there is a portfolio of offerings to the users, both military and civilian. If you take out your phone, look at Google Maps, and find your location using GPS, you’re using a single-digit percentage of the capabilities of GPS II, and a smaller one for GPS III. The “owners” of GPS have USAF uniforms on and live south of us in Colorado Springs.

        So the Program Manager http://gpsworld.com/col-bernard-gruber-gps-directorate-farewell-perspective-on-gps-program/ is actually a portfolio manager. He has a scorecard which he “beats” our client with monthly.

        Those scorecards in the briefing, and most of that work, came out of the programs we’ve worked on over the years. The approach is known to work if the client is willing to put in the effort to write down what “done” looks like, before we start work, so we’ll recognize it if it ever arrives.

        Love your blog…

  2. Hi Johanna,

    Good read, although this is a topic I struggle with. I support initial estimates on projects to determine resource availability and scheduling; however, I am also constantly bitten by the “management hears what they want to hear” problem with estimates. You touched on this, saying that management will tend to discard the pessimistic or realistic dates from the estimates and count on the optimistic dates as the solid deliverable date.

    This activity gets us in trouble with most projects, but I’m not willing to throw estimates out the window as a defensive tactic against failing to meet dates. Like all of life, there needs to be a balance and a cooperative understanding between the strategic direction teams and the implementers. Maybe one day business will get there.

    I do agree on the sunk cost fallacy; throwing good money after bad is a poor strategy. The real question should be the continued viability of a project as a revenue-generating activity, not how much has already been invested. Maybe the market shifted? Who knows, but make sure the value is still there.

    -Kurt.

    • Hi Kurt,

      I say more about this in Essays on Estimation, where I talk about using confidence levels and optimistic/realistic/pessimistic approaches for providing managers estimates.

      The issue here, however, is one of valuing the project. Estimates might be good for ballparks: is this project bigger than a breadbox? Does that help us decide, “Should we do this project at all?”

      However, is the estimate good enough for understanding the value as we continue to work on the project? That’s where I think not. I think estimates get us into trouble if we use them for valuing projects. It’s way too tempting to think, “Oh, we only have x weeks left, let’s just finish.” But we don’t. Our estimates get us into trouble more often than we think. Instead, let’s not use the estimate, and use some other approach to valuing projects.

  3. Reminds me of a few years ago – I asked the zeroth question (I didn’t know or call it that back then): should the project continue? I’m a developer/lead and agile coach primarily, but I hit up against a hard problem. I asked the question. The PM had really never considered it. He freaked out. Maybe because we were a vendor who’d do anything to get paid, or maybe it was because he had never even considered it at all – sunk cost, and the customer needs it. It turns out that the software is used marginally. But you don’t suggest these things in a vendor organisation with traditional arrangements in place.

    • Nick, good for you, for asking the question. Sometimes, as you saw, managers have a difficult time hearing or answering that question.

      • Yeah, thanks Johanna. I find asking the question gets me into a lot of trouble. But that’s a culture problem that is all too prevalent still. I’ve got into trouble over questioning the detail of requirements in 1998 (now called User Stories), questioning the design and requirements for a large system in 2010, questioning the quality and cost implications of several large products in 2013… questioning the need for a large and expensive inception phase for a very simple application in 2013… there are more… I wonder if this is something to be proud of – I’ve always tried my best to convey the information – something to improve on, though: it’s time to expect some healthy rebuttal and conflict as a good thing from those who need to hear it. I’m sure there is a Dee Hock quote to throw in there :)

