Software Estimates You Can Defend

Introduction

You don't have to be in software development for long before someone asks you, "How long do you think it will take to...?"  At that point you will start asking some questions and, depending on how you are wired and how long you have been developing software, you will tend to
EITHER
Provide a number then and there, based on your experience and familiarity with the system and/or technology involved,
OR
Ask for time, and come back with a highly crunched number generated from a software tool.

Neither number is likely to be particularly accurate.

Here, I want to outline an intermediate approach that draws on your experience and expertise, but also uses some limited data to ensure the number is both useful and defensible.  This approach is based on having a clear and (appropriately) detailed understanding of the work required.  It is useful when a broad estimate is required but not a lot of time or reference material is available; more detailed and accurate estimates require both.

What is an estimate for?

When someone asks for an estimate, they are actually asking some of the following questions: 
  • How much time in the schedule should I allow for this?
  • What is the cost I will accrue to attain the business benefits I ascribe to this feature?
  • What is the opportunity cost of adding this feature?
What this means is that your focus should be on reliability and predictability rather than accuracy and precision.  Therefore, underestimating has more negative consequences than overestimating.  This should not lead you to pad your estimates unnecessarily, but rather to provide a defensible estimate that business stakeholders can rely on for the decisions they need to make.

Consider the Cone of Uncertainty below.  At different points in the project life-cycle different questions are being asked, and differing degrees of reliability should be communicated.

Principles of Defensible Estimates

Principle 1: Invest time

Take the time you need to come up with a defensible, reliable number.
Take the time to understand the requirement, particularly the scope.  
  • A more tightly scoped requirement can be estimated more accurately than a broadly defined requirement.
Estimate overruns are most often due to development activities you did not consider when you came up with the initial estimate. 
Even getting an additional 15 minutes to think about what needs to be done will make a big difference to the reliability of your estimates.
Use a checklist to avoid missing activities e.g.
  • Environment set up
  • Tooling set up (Enterprise Architect / Resharper / Test Lab Manager)
  • Test data set up
  • SSL certificate procurement
  • Installer changes
  • Integration tests
  • Unit tests
  • Coding and development standard practices
  • Definition of done and acceptance criteria practices / "gates"
    • e.g. DBA sign-off, UX sign-off, Infrastructure sign-off, passing a Security Penetration Test
  • etc. ...
Invest an appropriate amount of time; beyond a certain point there are diminishing returns (as the graph below shows).
  • Consider lowering your confidence, rather than over-investing time-wise, unless the detail has been specifically requested.


Principle 2: Don't guess

No matter how experienced, knowledgeable or familiar you are with the problem at hand, a number based solely on intuition is little more than a guess.

Defensible Estimation Step by Step

Step 1. Consider the approach that will be used.

  • Legacy vs new technology
  • Green field vs Brown field (reverse engineering may be required)
  • Known solution to known problem or solution unknown
    • Level of reuse
  • New tests, or reuse existing ones
  • Manual tests vs existing automated tests vs new automated tests vs new coded tests
  • What impact do the acceptance criteria have?
    • e.g. performance criteria
      • recording run time metrics (in development & QA)
      • monitoring of runtime execution
      • performance tuning of a solution coded for maintainability

Step 2. Base your estimate on some kind of metric.

This starts to provide a defensible value (a quick sketch of how such counts feed an estimate follows this list).
Break the item into individual tasks
  • Think about how the items could be independent and how dependencies could be minimised
Find something to count e.g.
  • No. of web pages / widgets
  • No. of data points
  • No. of validation rules
  • No. of XML files that need to be generated
  • No. of conditions to handle
  • etc. ...
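For illustration only, here is a minimal sketch (in Python, purely for brevity; the item names and hours-per-item figures are assumptions, not taken from the text) of how a set of counts can be turned into a first defensible number:

```python
# Break the requirement into countable items, then attach an hours-per-item
# figure to each kind of item (ideally derived from history; see Step 3).
counts = {"web pages": 7, "validation rules": 12, "XML files": 3}           # things you counted
hours_per_item = {"web pages": 5, "validation rules": 1.5, "XML files": 2}  # assumed figures

estimate_hours = sum(counts[item] * hours_per_item[item] for item in counts)
print(estimate_hours)  # 7*5 + 12*1.5 + 3*2 = 59.0 hours
```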

Step 3. Base your estimate on real data


With the approach from Step 1 in mind, access historical data (e.g. extracted from source control) for the elapsed time of the activities you have identified.
  • Prefer comparisons on same project
  • over same team
  • over same company
  • over same technology
  • over industry
  • over general
Once you have this historical figure, you can adjust it based on size or complexity.
 
NB: The number should be based on the "average team member", not a specific individual.
 
If you cannot access historical data, an intuitive estimate based on experience (= guess) can suffice, as long as the count of items you are using is high-ish.  In this case, the law of large numbers can compensate to some degree.
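As a rough sketch of the idea (Python for brevity; the record values and the complexity adjustment are illustrative assumptions):

```python
# Historical records for a comparable activity, e.g. extracted from source
# control or work-item tracking. Values are illustrative.
history_hours = [6.0, 4.5, 5.5]   # elapsed hours for "new web page", same project/team

baseline = sum(history_hours) / len(history_hours)   # ~5.3 hours, "average team member"
complexity_adjustment = 1.25                         # assumed: the new pages are a bit harder
per_page_estimate = baseline * complexity_adjustment
print(round(per_page_estimate, 1))                   # ~6.7 hours per page
```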

 
Step 4. Base your estimate on varying scenarios
 
Derive best case, worst case, and most likely estimates, then use the following formula:


How long will it take if everything goes perfectly? | Best Case
How long will it take if everything goes as badly as possible? | Worst Case
In the most likely scenario, how long will it take? | Most Likely Case
(Best Case + (3 * Most Likely Case) + (2 * Worst Case)) / 6 | Expected Case
No. of items * Expected Case | Estimate

This formula is purposefully weighted towards the most likely case, with extra weight on the worst case (in keeping with underestimating being costlier than overestimating), and the weights can be fine-tuned.
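As a minimal sketch of the arithmetic (Python for brevity; the function names and per-item figures are illustrative assumptions):

```python
def expected_case(best: float, most_likely: float, worst: float) -> float:
    """Weighted expected case: (Best + 3 * Most Likely + 2 * Worst) / 6."""
    return (best + 3 * most_likely + 2 * worst) / 6

def estimate(num_items: int, best: float, most_likely: float, worst: float) -> float:
    """Overall estimate = number of items * expected case per item."""
    return num_items * expected_case(best, most_likely, worst)

# Illustrative per-item figures: best 0.5h, most likely 1h, worst 3.75h
print(round(expected_case(0.5, 1, 3.75), 2))   # 1.83 hours per item
print(round(estimate(7, 0.5, 1, 3.75), 2))     # 12.83 hours for 7 items
```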


Step 5. Include the level of confidence

 
Communicate the level of confidence along with the estimate itself.  The confidence level should reflect where you are on the cone of uncertainty, and translates the point estimate into a range (see the table and the sketch below).
 
 
Confidence | Variability of Estimate (From) | Variability of Estimate (To)
Very Low | 0.25x | 4x
Low | 0.5x | 2x
Medium | 0.67x | 1.5x
High | 0.8x | 1.25x
Very High | 0.9x | 1.1x
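To make the table concrete, here is a minimal sketch (Python for brevity; the function name and the 67-hour figure are illustrative assumptions) of turning a point estimate plus a confidence level into the range stakeholders should plan around:

```python
# Multipliers taken from the confidence table above.
CONFIDENCE_RANGES = {
    "very low": (0.25, 4.0),
    "low": (0.5, 2.0),
    "medium": (0.67, 1.5),
    "high": (0.8, 1.25),
    "very high": (0.9, 1.1),
}

def estimate_range(estimate_hours: float, confidence: str) -> tuple[float, float]:
    """Convert a point estimate plus a confidence level into a low/high range."""
    low_mult, high_mult = CONFIDENCE_RANGES[confidence]
    return estimate_hours * low_mult, estimate_hours * high_mult

low, high = estimate_range(67, "medium")   # e.g. a 67-hour estimate, medium confidence
print(f"{low:.0f} to {high:.0f} hours")    # roughly 45 to 100 hours
```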


Examples

Worked example: Requirement

Requirement: Re-style the appearance of the main submit button on an existing web application.

Task | Worst case (hrs) | Most likely (hrs) | Best case (hrs) | Expected case (hrs)
Define the new style (& update specification) | 3.75 | 1 | 0.45 | 1.83
Modify existing style definition and move to stylesheet to allow reuse (currently defined inline) | 6 | 2 | 0.5 | 3.08
Apply new style on pages (7) (currently defined inline) | 26.25 | 7 | 3.5 | 12.83
Apply new style on pop-ups (7) (currently defined inline) | 26.25 | 7 | 3.5 | 12.83
Create automated UI test (7) | 14 | 7 | 3.5 | 8.75
QA - test design | 28 | 7 | 3.5 | 13.52
QA - test execution | 28 | 7 | 3.5 | 13.52
Total (hrs) | | 38 | | 66.36

An estimate of approximately 70 hours based only on a guess would probably not be taken seriously; indeed, a guess-only estimate would more likely have come in closer to 30 hours.  Such a number would probably be fine for this requirement in isolation, but over the life of the project it would probably lead to a schedule over-run.
While an estimate of 67 hours (with medium confidence) may seem unduly large, broken down like this it is defensible, and it works out at about 5 hours per button instance (7 pages plus 7 pop-ups), including specification, development and QA.
Let's also remember what this number is: an estimate that, given 67 hours in the project schedule, the requirement can be (almost) guaranteed to be delivered, even if "snags" are encountered along the way.
This also highlights the downstream cost of poor development approaches (i.e. inline styles), which may have been used due to unrealistic time frames in the past.

Worked example: Project

Project: A new web application, approximately 20 pages, created as a .NET MVC 4 based Single Page Application.

Historical Project | Size (pages) | Elapsed Duration (days) | Learning curve / complexity / uncertainty multiplier | Expected duration (days)
ASP.Net web forms administration tool | 30 | 90 | 1.2 | 72
Classic ASP custom reports (12) | 12 (12 * 1) | 35 | 1.2 | 70
Average (days) | | | | 71

This estimation approach can be used in the early days of a project to provide stakeholders with a ballpark figure of the effort involved.
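The expected durations in the table appear to be the historical duration scaled to the new project's size (20 pages) and then multiplied by the learning curve / complexity / uncertainty factor. Here is a minimal sketch reproducing that arithmetic under that assumed interpretation (Python for brevity):

```python
def scale_historical(historical_days: float, historical_pages: int,
                     new_pages: int, multiplier: float) -> float:
    """Scale a historical project's duration to the new project's size, then
    apply a learning curve / complexity / uncertainty multiplier."""
    return historical_days * (new_pages / historical_pages) * multiplier

admin_tool = scale_historical(90, 30, 20, 1.2)   # ~72 days
reports    = scale_historical(35, 12, 20, 1.2)   # ~70 days
ballpark   = (admin_tool + reports) / 2
print(round(ballpark))                           # 71 days
```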


