[Image: pollsters agreeing a forecast]

Getting forecasts wrong can be very painful.

How annoying are the election pollsters? They ran dozens of polls, all forecasting a similar outcome, but they got it completely wrong. Astonishingly, they can't do the one thing they exist to do: provide a reliable forecast.

The fact that the same approach was repeated many times, and by different organisations, did not make it accurate. Maybe it made it slightly less embarrassing for those who got it wrong, in that they were no worse than their competitors. But get it wrong they did, displaying the same forecasting flair as Michael Fish circa 1987.

It's madness to keep repeating the same process and expect a better outcome

In the technology industry, we are pretty dreadful at forecasting too. How long will it take? How much will it cost? Will it work? We regularly get these wrong, despite using well-established and popular approaches. The result is that IT change programmes take longer, cost more and still go badly wrong. It's exasperating.

At Acutest, we've been working on projects where people are taking a different approach to forecasting. In particular, we've been looking at how people estimate testing tasks (will we finish on time?) and quality (will the product or service work?).

Why are we doing this? Well, one reason is that our research has shown us that more than 90 per cent of IT professionals underestimate how long testing tasks will take. No wonder there are so many painfully wrong forecasts on IT projects. This finding comes from an exercise we've been running with hundreds of groups of project staff (typically programme managers, project managers, developers and testers) over the last 12 years. The chart below shows the results from a typical group: how long dozens of people think a simple testing task should take, alongside the sensible answer range.

[Chart: forecasting data from a typical group, with the sensible answer range marked]

As with the election pollsters, the fact that many people come up with similar answers does not make them right. And sadly, the same story applies to forecasting future quality, but that's probably better covered in another post.

So try a new approach instead

Rather than carrying on with the old methods, and the same painful outcomes, we've been working on improving forecasting, particularly on very large-scale programmes. The approach harnesses many of the techniques familiar to change project staff, with some testing-specific elements and some new adaptations. For example, we recommend using estimates based on first-hand experience of the activities to be completed (more like feeding in an early exit poll) to improve the forecast.
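To make the exit-poll analogy concrete, here is a minimal sketch in Python of one way first-hand evidence might be folded into a re-forecast: the overrun observed on tasks already completed is blended with the original plan. The function name, weighting scheme and figures are illustrative assumptions for this post, not Acutest's actual method.

```python
# Illustrative sketch only: re-forecast total effort by blending the
# original plan with first-hand evidence from tasks already completed,
# in the spirit of feeding an "early exit poll" into the forecast.

def reforecast_total_effort(planned_hours, completed_planned_hours,
                            completed_actual_hours):
    """Re-project total effort using the overrun ratio observed so far.

    planned_hours: original estimate for the whole programme.
    completed_planned_hours: planned effort for tasks finished so far.
    completed_actual_hours: actual effort those tasks really took.
    """
    if completed_planned_hours <= 0:
        return planned_hours  # no evidence yet; fall back to the plan

    # Observed overrun ratio from first-hand experience (the "exit poll").
    overrun = completed_actual_hours / completed_planned_hours

    # Weight the evidence by how much of the plan it covers, so a small
    # early sample nudges the forecast rather than dominating it.
    coverage = min(completed_planned_hours / planned_hours, 1.0)
    blended_ratio = (1 - coverage) * 1.0 + coverage * overrun
    return planned_hours * blended_ratio

# Example: a 1,000-hour plan where the first 200 planned hours actually
# took 300 suggests re-forecasting upwards to about 1,100 hours.
print(round(reforecast_total_effort(1000, 200, 300)))  # -> 1100
```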

If you’d like to learn more about our forecasting research or our forecasting approach, or if you need help with your programme forecasting, please contact us.

Contact Acutest