Agile Adoption Rate: 2007 Open Research

This open research into agile adoption rates was performed in early March 2007 and there were 781 respondents. The survey was announced in the blog of Jon Erickson, the Dr. Dobb’s Journal editor.

The Survey Results

The results of the survey are summarized in Survey Says… Agile Has Crossed the Chasm published in the August 2007 issue of Dr. Dobb’s Journal.

Some findings include:

  1. 69% of respondents indicated that their organizations are doing one or more agile projects. Of those that hadn’t yet started, 24% believed their organizations would do so within the next year.
  2. 44% indicated a 90%+ success rate at agile projects, and 33% indicated a success rate between 75% and 90%. Agile appears to be working out.
  3. Co-located agile projects are more successful on average than non-co-located, which in turn are more successful than projects involving offshoring.
  4. 98.6% of agile teams had adopted iterations, and 83% had iteration lengths between 1 and 4 weeks.
  5. Smaller teams had higher success rates than larger teams.
  6. 85% of organizations doing agile had more than one project completed, so it’s gone beyond the pilot project stage in most organizations.

 

Downloads

The Survey Questions (174K)

Raw Data (120K)

Summary Presentation (216K)

 

What You May Do With This Information

You may use this data as you see fit, but may not sell it in whole or in part. You may publish summaries of the findings, but if you do so you must reference the survey accordingly (include the name and the URL to this page). Feel free to contact me with questions. Better yet, if you publish, please let me know so I can link to your work.

 

Discussion of the Results

  1. Pair programming didn’t rate as well as I had expected, probably because many organizations don’t give it sufficient time to take root on a project team.
  2. Every practice that I asked about was rated above average, although I didn’t explore many traditional practices such as detailed up-front modeling and detailed up-front planning because they were covered by questions about the effectiveness of work products.
  3. The results for database refactoring and database testing also didn’t rate as well as I had expected, although, to be fair, this is likely a reflection of the current lack of tool support for these concepts.
  4. Developer tests and whiteboard sketches received pretty much the same score. Yet developer testing seems to receive at least an order of magnitude more discussion on agile forums than whiteboard sketching does. This is frustrating considering how often agilists are criticized for not modeling. We need to start talking more about what we actually do in practice.
  5. There’s clearly a loud message that detailed documentation has little value to add on agile teams. We do, however, take agile approaches to documentation.
  6. Both detailed and high-level Gantt charts were rated very poorly, although task lists were very close to the top, an indication that agilists prefer simpler approaches to project planning.
  7. The adoption rate figures, although higher than in 2006, might not be comparable to the results of my 2006 Agile Adoption Rate Survey because the questions were asked differently.
  8. Even though the success rates appear higher, it may be difficult to compare the success rates claimed in this survey with traditional success rates due to a lack of definition of success.
  9. The results may be a bit optimistic because I used a mailing list composed of IT professionals who very likely read industry publications such as Dr. Dobb’s Journal on a regular basis. Therefore they may be more aware of new trends in IT than people who don’t read regularly.
  10. The request that went out indicated that the survey was exploring agile adoption, so the adoption figures could be a bit high due to selection bias.
  11. The vast majority of respondents are in North America, so these results likely represent the experiences of IT professionals in North America but perhaps not other parts of the world.
  12. This survey suffers from the fundamental challenges faced by all surveys.

 

Links to Other Articles/Surveys

  1. My other surveys

 

Why Share This Much Information?

I’m sharing the results, and in particular the source data, of my surveys for several reasons:

  1. Other people can do a much better job of analysis than I can. If they publish online, I am more than happy to include links to their articles/papers.
  2. Once I’ve published my column summarizing the data in DDJ, I really don’t have any reason not to share the information.
  3. Too many traditionalists out there like to use the “where’s the proof” question as an excuse not to adopt agile techniques. By providing some evidence that a wide range of organizations seem to be adopting these techniques maybe we can get them to rethink things a bit.
  4. I think that it’s a good thing to do and I invite others to do the same.