ISE Blog

Estimation Revisited (Part 2)

In Part 1 of Estimation Revisited, I explored some challenges and underlying concepts relating to estimates and projections. In this blog post, I more deeply explore Uncertainty, Complexity, and Stakeholder Questions.

Dealing with Uncertainty in Estimates

An established, stable team that has been doing its own estimates and tracking its velocity can make reasonable forward-looking estimates, assuming that all else stays the same: more of the same kind of work, the same kind of environment, no changes to the team, and so on.
  • Velocity can be used to estimate completion (build-up), and can be used to help the team determine how much work to take on in a sprint. Any other use of velocity decreases its usefulness and introduces dysfunction (e.g. the team is pressured to increase velocity over time, or toward a target; the team is pressured to take on more stories than it thinks it can accomplish; the PO or stakeholders attempt to influence how much work the team takes on; velocity is used to compare teams).
  • The team doing the work must be the one doing the estimates; the whole team must participate in estimation; and there must be no pressure that influences the estimates.
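To make the "build-up" use of velocity concrete, here is a minimal sketch of projecting completion from observed velocity. The function name and the story-point numbers are made up for illustration; real teams would also carry the uncertainty band discussed below, not just a single number.

```python
import math

def sprints_remaining(backlog_points: float, recent_velocities: list[float]) -> int:
    """Estimate how many more sprints the remaining backlog needs,
    using the average of the team's recent sprint velocities."""
    avg_velocity = sum(recent_velocities) / len(recent_velocities)
    return math.ceil(backlog_points / avg_velocity)

# A team that finished 18, 22, and 20 points in its last three sprints,
# with 120 points of backlog left, projects 120 / 20 = 6 more sprints:
print(sprints_remaining(120, [18, 22, 20]))  # → 6
```

Note that this projection only holds under the "all else stays the same" assumption above; the moment the work, team, or environment changes, the recent velocities stop being representative.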
Even those reasonable estimates are subject to uncertainty. You don't know what you don't know. The higher the complexity of the work and the situation, the higher the uncertainty. Analogy: I drove to Colorado one weekend. Along the way I saw detour signs: I-29 (along the Missouri River) was closed north and south of Omaha, with I-35 as the detour. Depending on where I was going, that situation could easily add 4-8 hours to a day's drive. If I had been asked to estimate that journey beforehand, these environmental changes would have made my estimate wrong by a wide margin.
  • Variation increases uncertainty: new kind of work, change in team members, change in technology, change in environment, ...
A new team without an established velocity is a high-uncertainty situation. There are techniques to estimate what the team's velocity might be, but the guidance for using them is to express an uncertainty of x4 to x1/4. So as a new team, it doesn't matter what you estimate: you know it will be wrong.
  • After 3 sprints in a relatively stable environment, a team velocity may start to emerge - especially if the impediments are being addressed. 
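That x4 to x1/4 band is easy to state in code. This is a minimal sketch (the function name and the "8 sprints" guess are my own illustration), just applying the multiplier from the guidance above:

```python
def uncertainty_band(estimate: float, factor: float = 4.0) -> tuple[float, float]:
    """Return the (optimistic, pessimistic) range around an estimate,
    using the x4 / x1-4 band suggested for teams with no velocity history."""
    return (estimate / factor, estimate * factor)

# A new team guesses "8 sprints"; all we can honestly say is:
low, high = uncertainty_band(8)
print(f"somewhere between {low:g} and {high:g} sprints")  # → between 2 and 32
```

Stating the band explicitly, rather than the single number, is one way to keep the conversation with stakeholders honest while the team's real velocity emerges.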

Part of dealing with uncertainty is deciding how the estimate will be used, how the uncertainty is taken into account, and what the strategy is for updating the estimates / distribution of work / build-up as the work continues. Another part is how transparency is provided: how do we show what has been done, and our evolving estimate for when the remaining work will be done?

  • If there are any forces working against transparency (e.g. a stern "Why aren't you done yet?" or "Why did this estimate change?" from the organization - i.e. a "shoot the messenger" reaction), this indicates an unwillingness of the organization to face uncertainty, and is a place where SMs, coaches, and leadership need to step in and work the impediment.

One interesting question (to whoever is asking for estimates) is: "These are estimates. That means they will be wrong. What are the consequences to the team and me when we change these estimates and the work turns out to take a different amount of time and effort than we estimated?"

  • If there is any hint of "that's not ok" or "punishment" in any form, that is a red flag and needs to be addressed organizationally.

Another interesting question is: "How valuable is it to you to have reduced uncertainty (vs. delivering other kinds of value, such as working product)? Is it worth it to you to spend time and money reducing uncertainty?" As a PO/team, you can "buy" some reduction in uncertainty by choosing which work to do first, or by doing proof-of-concept work that addresses "known unknowns".

  • This is one reason that vertical slices - end-to-end, fully integrated functionality that is "done done done" (complete, reviewed, automated, tested, integrated, documented, and delivered) - are so valuable: they uncover and resolve the places where risk and uncertainty often lurk (integration, dependencies, handoffs, deployment). Such vertical slices are big revealers of impediments: anything that introduces delay between concept and customer.
  • The Scrum intention of having the product be "potentially shippable" every sprint (might not be feature complete, but we could deploy what we have) also plays into this.

Complexity

Work such as scalability is part of the Complex domain of the Cynefin framework.

Complex systems are "impervious to a reductionist, take-it-apart-and-see-how-it-works approach, because your very actions change the situation in unpredictable ways."

Scalability work, for example, looks like this:

  • Run a test. See if target has been met.
  • See what the next biggest impediment to scalability looks like... what looks like the most promising change we could make now that might buy us an improvement?
  • Implement a change. Did it help? Enough? Did the change have any unexpected side effects? To performance? To maintainability? To the expertise needed to work effectively on the product? To the team of teams?...
  • Rinse and repeat.
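The probe-and-adapt loop above can be sketched as code. This is purely illustrative: run_load_test and apply_change are hypothetical stand-ins for real load-testing tooling and real engineering changes, simulated here so the loop is runnable. The simulation's occasional regression reflects the point that in a complex system, a change can make things worse.

```python
import random

TARGET_RPS = 1000.0  # illustrative scalability target (requests/second)

def run_load_test(capacity: float) -> float:
    """Stand-in for a real load test: report sustained throughput."""
    return capacity

def apply_change(capacity: float) -> float:
    """Stand-in for the 'most promising change': each change buys an
    unpredictable gain, and sometimes makes things slightly worse."""
    return capacity * random.uniform(0.95, 1.4)

def improve_until_target(capacity: float, max_iterations: int = 50) -> tuple[float, int]:
    """Rinse and repeat: test, make a change, re-test, re-assess."""
    iterations = 0
    while run_load_test(capacity) < TARGET_RPS and iterations < max_iterations:
        capacity = apply_change(capacity)
        iterations += 1
    return capacity, iterations
```

The essential feature is that the number of iterations (and therefore the cost) is not knowable up front; it only emerges as the loop runs, which is exactly why estimates for Complex-domain work carry so much uncertainty.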

The higher the complexity of the work and the situation, the higher the uncertainty. Forces of complexity include: complex technology dependencies; technical debt; slow feedback loops; cross team dependencies; ...

When the domain for a body of work is unknown, one useful exercise is the Agreement & Certainty Matrix. This can help clarify the degree of uncertainty and complexity.

Stakeholder Discussions

Good questions for stakeholders to ask, with respect to estimates:

  • How much uncertainty exists in these estimates? What are the sources of uncertainty?
  • For these estimates to have a high likelihood of being realized, what would have to be true and stay true?
  • What assumptions are being made? How robust are these assumptions? How do we know whether they are good assumptions?
  • What could "go wrong" or "be discovered" that would affect the likelihood of these estimates reflecting what ultimately happens?
  • What kind of buffer is built in?
  • What adjustments could be made to the approach that would "buy down" the amount of uncertainty?
  • What does our experience so far with this work tell us about how good our estimates are?
  • What are we doing as an organization that is getting in your way? What could we do differently?
  • How can I/we/the organization best support you? 

Guidance for Leaders and Managers

In my next post, I'll explore some guidance for leaders and managers.


Check back for Part 3 in the Estimation Revisited series. Until then, if you have any comments or questions, join in the conversation below!

Andrew Smith, Principal Architect

Andrew is a Principal Architect at ISE and leads ISE’s Agile community. He recently earned two new Agile certifications: ICAgile Certified Expert – Agile Coaching (ICE-AC) and ACI Certified Teams Transformation Coach (ACI-CTTC). Andrew is passionate about creating great teams, great software and great customer experiences, and is constantly looking for ways to adapt industry experience and best practices into ISE. In his free time, Andrew enjoys dancing Argentine Tango, public speaking with Toastmasters International, and Yoga.