2008 IT Project Success Rates Survey
The results of the survey are summarized in my article
Software Development Success Rates.
Some findings include:
- As Figure 1 depicts, iterative and agile project
teams had higher success rates than traditional teams, which in turn had
higher success rates than ad-hoc teams.
- People really don't define success in terms of "on time, on budget, to
specification" regardless of what the theory folks may claim.
- 83% of respondents believe that meeting actual needs of stakeholders is more important
than building the system to specification.
- 82% believe that delivering high quality is more important than delivering on time and on budget.
- 70% believe that providing the best ROI is more important than delivering under budget.
- 58% believe that delivering when the system is ready to be shipped is more
important than delivering on schedule.
- For all the "RUP bashing" that goes on within the agile community, as
Figure 1 shows, iterative projects are just as
successful as agile projects.
- Figure 2 depicts how iterative and agile
approaches are more effective than traditional and ad-hoc approaches at
delivering higher quality, greater ROI, better stakeholder satisfaction,
and timely delivery. Traditional approaches were better
than ad-hoc when it came to quality, but ad-hoc approaches were better at
- Figure 3 depicts the success rate by paradigm and
distribution level. Regardless of paradigm, the more distributed the
team, the lower the success rates. For any given distribution level,
agile/iterative always did as well as or better than traditional approaches,
and traditional always did better than ad-hoc.
Figure 1. Success rates by development paradigm.
Figure 2. Success factors by paradigm (Scale is from
-10 to +10).
Figure 3. Success rates by paradigm and distribution level.
You may use this data
as you see fit, but may not sell it in whole or in part.
You may publish summaries of the findings, but if you do
so you must reference the survey accordingly (include
the name and the URL to this page). Feel free to contact me
with questions. Better yet, if you publish,
please let me know so I can link to your work.
- I was disappointed with the number of respondents, as I was really
hoping for over 1000. As a result, the real success
rates will fall within a larger range than in previous surveys.
I think that people are becoming fatigued by answering surveys, as it's
become incredibly easy for people, such as myself, to run surveys online.
- This survey measures success as defined by the respondent; it does
not force a definition of success on them. The success rate was
calculated as a weighted average: the midpoint of each response range (e.g.,
90-100% averages to 95%) times the number of respondents in that range,
divided by the total number of respondents.
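The midpoint-weighting calculation described above can be sketched as follows. Note that the bucket boundaries and respondent counts below are made-up illustrative values, not the actual survey data:

```python
# Sketch of the success-rate calculation: each response bucket
# (e.g. "90-100%") is collapsed to its midpoint, and the midpoints are
# averaged, weighted by the number of respondents in each bucket.
# The buckets and counts here are hypothetical, for illustration only.

def weighted_success_rate(buckets):
    """buckets: list of ((low, high), respondent_count) pairs."""
    total_respondents = sum(count for _, count in buckets)
    weighted_sum = sum(((low + high) / 2) * count
                       for (low, high), count in buckets)
    return weighted_sum / total_respondents

sample = [((90, 100), 40), ((70, 89), 30), ((50, 69), 20), ((0, 49), 10)]
print(round(weighted_success_rate(sample), 1))  # 76.2 for these made-up counts
```

With these illustrative numbers, the midpoints 95, 79.5, 59.5, and 24.5 are weighted by 40, 30, 20, and 10 respondents, giving an overall rate of 76.2%.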
- These figures vary significantly from those of the
Standish Group’s Chaos Report,
which reports a 34% success rate and a 51% “challenged” rate. They
define success as “on time, on budget, meeting the spec”, but that
definition doesn’t seem to hold when we ask people what they actually
value. I’m not convinced that it’s appropriate to force a definition of
success on people, regardless of how easy it would make the processing of the
- It is difficult to compare the numbers from the Chaos Report and
this survey. This survey is open, you have complete access to the
original questions and data, yet the Chaos Report is closed to
outsiders. I invite the Standish Group to open source their material.
- The request that went out indicated that the survey was exploring
success rates, so the success rate figures could be a bit inflated
due to selection bias (organizations that are really struggling
may not be responding).
- The success rate difference between agile and traditional has
narrowed from previous surveys, which I suspect is the result of fewer
respondents to this one.
- The similarity in success rates between agile and iterative is
likely a reflection of the similarities between the approaches, and may
indicate that the greatest determinant of success is the
short feedback cycle shared by the two.
- This survey suffers from the
fundamental challenges faced by all surveys.
Links to Other Articles/Surveys
- My other surveys
- Answering the
"Where's the Proof that Agile Methods Work" Question
Why Share This Much Information?
I'm sharing the results, and in particular the source data, of my surveys for several reasons:
- Other people can do a much better job of analysis than I can. If
they publish online, I am more than happy to include links to their work.
- Once I've published my column summarizing the data in DDJ, I really
don't have any reason not to share the information.
- Too many traditionalists out there like to use the "where's
the proof" question as an excuse not to adopt agile techniques. By
providing some evidence that a wide range of organizations seem to be
adopting these techniques maybe we can get them to rethink things a bit.
- I think that it's a good thing to do and I invite others to do the same.
This is an eye-opening book for anyone who is trying
to understand how to measure concepts, such as developer productivity
levels, which are often perceived as difficult to measure.
If you choose to think outside of the metrics box for a bit, you'll
quickly realize that you can easily measure information which is
critical to your decision-making process.
This is the latest edition of Capers Jones' classic
book summarizing the wealth of data which he has collected over the
years pertaining to software development projects and IT in general.
Although most of his data is from traditional waterfall projects, so
you need to read between the lines a bit and it can be difficult to
apply in the context of iterative or agile projects, I still find this
book to be a valuable resource. If you're interested in improving the
way that you work as an IT professional, this book is an important
resource for making fact-based decisions.