In Part 1 of Revisiting Estimations, I explored some challenges and underlying concepts relating to estimates and projections. In this blog post, I more deeply explore Uncertainty, Complexity, and Stakeholder Questions.
Dealing with Uncertainty in Estimates
An established, stable team that has been doing (its own) estimates and tracking its velocity can make reasonable forward-looking estimates, assuming that all else stays the same: more of the same kind of work, more of the same kind of environment, no changes to the team, etc.
- Velocity can be used to estimate completion (build-up), and can help the team determine how much work to take on in a sprint. Any other use of velocity decreases its usefulness and introduces dysfunction: for example, the team is pressured to increase velocity over time or to hit a target; the team is pressured to take on more stories than it thinks it can accomplish; the PO or stakeholders attempt to influence how much work the team takes on; velocity is used to compare teams.
- The team doing the work must be the one doing the estimates; the whole team must participate in estimation; there must be no pressure that influences estimates
- Variation increases uncertainty: new kind of work, change in team members, change in technology, change in environment, ...
- After about 3 sprints in a relatively stable environment, a team's velocity may start to emerge, especially if impediments are being addressed.
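To make the velocity-to-completion idea concrete, here is a minimal, illustrative sketch. The backlog size and per-sprint velocities are made-up numbers, and using the min/max of recent sprints as an optimistic/pessimistic band is just one simple way to express the uncertainty the bullets above describe:

```python
# Illustrative sketch: project a range of sprints to completion from
# recent velocity. All numbers are hypothetical.
import math
import statistics

def sprints_to_complete(remaining_points, recent_velocities):
    """Project sprints to finish as a range, using the spread of
    recent sprint velocities rather than a single average."""
    if not recent_velocities:
        raise ValueError("need at least one completed sprint")
    avg = statistics.mean(recent_velocities)
    best = max(recent_velocities)    # optimistic: fastest recent sprint
    worst = min(recent_velocities)   # pessimistic: slowest recent sprint
    return {
        "optimistic": math.ceil(remaining_points / best),
        "expected": math.ceil(remaining_points / avg),
        "pessimistic": math.ceil(remaining_points / worst),
    }

# e.g. 120 points remaining; last three sprints delivered 18, 24, 21 points
print(sprints_to_complete(120, [18, 24, 21]))
# {'optimistic': 5, 'expected': 6, 'pessimistic': 7}
```

Reporting the projection as a range, rather than a single date, also supports the transparency discussed below: the band naturally narrows (or visibly shifts) as more sprints complete.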
Part of dealing with uncertainty is deciding how the estimate will be used, how the uncertainty is taken into account, and what the strategy is for updating the estimates (and the distribution of work, or the build-up) as the work continues. Another part is how transparency is provided: how do we show what has been done, and our evolving estimate of when the remaining work will be done?
- If there are any forces working against transparency (e.g. A stern "why aren't you done yet?" or "Why did this estimate change?" from the organization - i.e. a "shoot the messenger" reaction), this indicates an unwillingness of the organization to face uncertainty, and is a place where SMs, coaches and leadership need to step in and work the impediment.
One interesting question to ask whoever is requesting estimates is: "These are estimates. That means they will be wrong. What are the consequences to the team and to me when we change these estimates and the work turns out to take a different amount of time and effort than we estimated?"
- If there is any hint of "that's not ok" or "punishment" in any form, that is a red flag and needs to be addressed organizationally.
Another interesting question is: "How valuable is it to you to have reduced uncertainty (versus delivering other kinds of value, such as working product)? Is it worth it to you to spend time and money reducing uncertainty?" As a PO/team, you can "buy" some reduction in uncertainty by choosing which work to do first, or by doing proof-of-concept work that addresses "known unknowns".
- This is one reason that vertical slices (end-to-end, fully integrated functionality that is "done done done": complete, reviewed, automated, tested, integrated, documented, and delivered) are so valuable: they uncover and resolve the places where risk and uncertainty often lurk (integration, dependencies, handoffs, deployment). Such vertical slices are big revealers of impediments: anything that introduces delay between concept and customer.
- The Scrum intention of having the product be "potentially shippable" every sprint (might not be feature complete, but we could deploy what we have) also plays into this.
Dealing with Complexity
Work such as scalability is part of the Complex domain.
Complex systems are "impervious to a reductionist, take-it-apart-and-see-how-it-works approach, because your very actions change the situation in unpredictable ways."
Scalability work, for example, looks like this:
- Run a test. See if target has been met.
- Identify what the next biggest impediment to scalability looks like: what is the most promising change we could make now that might buy us an improvement?
- Implement a change. Did it help? Enough? Did the change have any unexpected side effects? To performance? To maintainability? To the expertise needed to work effectively on the product? To the team of teams?...
- Rinse and repeat.
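The loop above can be sketched in code. This is purely illustrative: `run_load_test`, the candidate changes, and the throughput numbers are hypothetical stand-ins for whatever measurement and interventions apply to your system.

```python
# Illustrative sketch of the measure / change / re-measure loop above.
# run_load_test and the candidate changes are hypothetical stand-ins.

def tune_until_target(run_load_test, candidate_changes, target):
    """Apply one candidate change at a time, re-measure, and keep the
    change only if the measurement improved; stop once the target is met."""
    kept = []
    baseline = run_load_test()         # run a test; establish the baseline
    for change in candidate_changes:
        if baseline >= target:
            break                      # target met; stop changing things
        change.apply()                 # implement a change
        result = run_load_test()       # did it help?
        if result > baseline:
            kept.append(change)        # it helped; this is the new baseline
            baseline = result
        else:
            change.revert()            # no help (or a side effect); undo it
    return baseline, kept

# Tiny demo with a fake system whose throughput is a single number.
class FakeChange:
    def __init__(self, state, delta):
        self.state, self.delta = state, delta
    def apply(self):
        self.state[0] += self.delta
    def revert(self):
        self.state[0] -= self.delta

state = [100]  # pretend current throughput
changes = [FakeChange(state, 30), FakeChange(state, -10), FakeChange(state, 50)]
final, applied = tune_until_target(lambda: state[0], changes, target=170)
print(final, len(applied))  # 180 2  (the -10 change was reverted)
```

Note what the code cannot capture: in a truly complex system, each kept change alters the landscape, so the list of promising next changes has to be re-derived after every iteration rather than fixed up front.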
The higher the complexity of the work and the situation, the higher the uncertainty. Forces of complexity include: complex technology dependencies; technical debt; slow feedback loops; cross-team dependencies; ...
When the domain for a body of work is unknown, one useful exercise is the Agreement & Certainty Matrix, which can help clarify the degree of uncertainty and complexity.
Good questions for stakeholders to ask, with respect to estimates:
- How much uncertainty exists in these estimates? What are the sources of uncertainty?
- For these estimates to have a high likelihood of being realized, what would have to be true and stay true?
- What assumptions are being made? How robust are these assumptions? How do we know whether they are good assumptions?
- What could "go wrong" or "be discovered" that would affect the likelihood of these estimates reflecting what ultimately happens?
- What kind of buffer is built in?
- What adjustments could be made to the approach that would "buy down" the amount of uncertainty?
- What does our experience so far with this work tell us about how good our estimates are?
- What are we doing as an organization that is getting in your way? What could we do differently?
- How can I/we/the organization best support you?
Guidance for Leaders and Managers
In my next post, I'll explore some guidance for leaders and managers.
Check back for Part 3 in the Revisiting Estimations series. Until then, if you have any comments or questions, join the conversation below!