Are accurate estimates costing you money?
It’s a very common scenario: as a team you have uncertainty around an estimate you need to give, perhaps due to dependencies or other technical risk. However, you’ve been forced to commit to a date, and declaring uncertainty is not allowed… probably frowned upon! So what do you do? The usual response is to pad your estimate with a buffer, an arbitrary X% contingency… just to give yourself some breathing room!
In reality it’s probably not just one team adding a buffer… the development team pads its estimate, the test team does the same, and so does every other team on the critical release path.
The result is over-inflated estimates that exist purely to accommodate uncertainty.
So what’s the problem with buffers? When you add a buffer to an estimate, what you are really doing is buying certainty at the cost of increased cycle time.
Why is this bad? If you understand the economics behind your product or feature, you should have a clear view of its Cost of Delay (CoD). By increasing cycle times in order to increase certainty, you are really increasing the CoD… or, in other words, decreasing your return on investment.
For example: imagine a set of features that should deliver a recurring monthly saving of 200K. By increasing the cycle time by 3 months you also increase the CoD by 600K (3 months × 200K per month).
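To make the arithmetic explicit, here is a minimal sketch of that calculation, assuming a constant recurring monthly benefit; the figures and the `cost_of_delay` helper are purely illustrative, not taken from any particular tool or framework.

```python
def cost_of_delay(monthly_value, months_delayed):
    """Cost of Delay for a recurring benefit: each month of delay
    forfeits one month of that benefit."""
    return monthly_value * months_delayed

# The example from the article: a 200K/month saving delayed by a 3-month buffer.
monthly_saving = 200_000
buffer_months = 3

print(cost_of_delay(monthly_saving, buffer_months))  # 600000
```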
When you demand accurate estimates from teams and don’t allow them to declare uncertainty, are you in reality reducing your own ROI?
It is nearly always the better economic option to decrease scope rather than extend the timeline.
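As a rough illustration of why, here is a hedged comparison under deliberately simple assumptions: cutting scope forfeits only the monthly value of the features you drop, while extending the timeline delays the value of the whole batch. The per-feature split of the 200K/month saving and the evaluation horizon below are made up for the example.

```python
# Illustrative only: the 200K/month saving is split across three features,
# and we compare the value realised over a fixed 12-month horizon.
feature_values = [90_000, 80_000, 30_000]  # hypothetical monthly value per feature
horizon_months = 12

# Option A: keep full scope, extend the timeline by 3 months.
delayed = sum(feature_values) * (horizon_months - 3)

# Option B: ship on time, but drop the lowest-value feature.
reduced_scope = sum(feature_values[:-1]) * horizon_months

print(delayed)        # 1800000 realised over the horizon
print(reduced_scope)  # 2040000 realised over the horizon
```

Under these assumptions the reduced-scope option delivers more value over the same horizon, and the dropped feature can still be delivered later.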