Surveying the Agile Community

Scott W. Ambler + Associates
This article provides insight into how to effectively survey the agile community. The issues it addresses are whether agile surveys are valuable, how to design an effective survey, how to publish your results, the known challenges with surveys, and what the agile community thinks about all these surveys.

My hope is that much of the advice is also pertinent for surveying other communities.


Are Agile Surveys Valuable?

Assuming that the survey is well designed, the results of an agile survey can be very valuable for people who are trying to make informed decisions about whether to adopt agile approaches, or to further expand their adoption efforts. Many people want to see industry data, and surveys are one way to get it. Of course, surveys aren't the only source of information: actual experience with agile techniques also provides important insight, as does academic research and forms of anecdotal evidence such as case studies, experience reports, and conversations. All of these sources of information have their place and each has its advantages and disadvantages. One size does not fit all.

A common reason that people give for not filling out surveys is that they don't feel that the information is valuable to them. I have to assume that they're correct in their belief that they're not getting value out of surveys. However, that doesn't mean that others don't see value in survey results. Furthermore, by not filling out a survey you save a bit of time, but all you really accomplish is making it harder for your voice to be heard by senior decision makers within IT departments (few decision makers are trolling agile mailing lists trying to sniff out the occasional word of wisdom). And yes, despite all the talk about self-organizing teams within the agile community, the fact remains that senior management in your organization can and will make decisions which affect what you do and how you do it. As a community it behooves us to invest time to provide the best information that we possibly can to decision makers, and effective surveys are part of that strategy. We need to remember that there is a wealth of information available to decision makers showing that traditional strategies are effective in practice (a very good source, for example, is Capers Jones's Applied Software Measurement, 3rd Edition), and we need to motivate senior management to start questioning some of the advice that they're getting.


Designing An Effective Survey

Here are some quick thoughts based on my experiences over the years.

  1. Know the topic. If you don't understand the topic that you're exploring, there's very little chance that you'll design an effective survey which explores the topic. Do some reading on the topic first and understand what surveys have already been run regarding agile software development. Get involved with the community, and identify what issues actually need to be explored.
  2. Keep it short. People are very busy and don't have the time to fill out long surveys. The longer the survey, the lower the chance that people will fill it out, and therefore the less applicable your findings will be because you'll have a small sample size (the sketch after this list illustrates how sample size affects precision). I realize that this is hard because you're likely interested in a lot of issues, but in most cases it's far better to explore a few targeted issues well.
  3. Explore new issues. It doesn't make a lot of sense to cover the same ground that's already been covered by others, unless your goal is to confirm their work (this can be important too). Instead, either extend our knowledge by exploring an issue in detail (for example, the DDJ 2008 Agile Adoption survey found that the majority of agile teams were doing some up-front requirements and architecture envisioning, so the DDJ 2008 Modeling and Documentation survey explored how people were going about doing so), or repeat an existing survey for a targeted group. For example, when I present results of various surveys at conferences (I give a presentation called Agile by the Numbers which I've given at conferences and to customers around the world) I'm often asked for the detailed numbers for a specific geographic region, such as Scandinavia or South Africa, or for a specific domain, such as banking or manufacturing. If you have access to a mailing list for a targeted group of people then it would be interesting to discover whether they exhibit different trends than the larger community does.
  4. Let people opt out of questions. I typically make questions mandatory but will allow people to indicate that they don't know the answer, that the question isn't applicable to their situation, or simply offer an option of "other". If you don't allow people some way to opt out of a question then you run the risk that they will give the closest answer, or simply choose any answer just to move on to the next question, thereby reducing the quality of your data.
  5. Get help. Get some help designing a survey from people with agile experience and with survey experience (get feedback from the Agile Survey Reviewers).
  6. Beta test it. Send it out to a small group of people that you know, ideally one which is a reasonable representation of the group that you're targeting, to determine whether they understand the questions that you're asking. There's nothing worse than finding out that you miswrote a question. For example, the DDJ 2006 Agile Adoption Survey asked whether people were doing Feature Driven Development (FDD) and many respondents indicated that they were. In practice, however, very few people seem to do FDD, even though it's a very effective approach; respondents answered yes because they were capturing their requirements in the form of feature statements (which is fine, but doesn't mean you're doing FDD). The problem was that few people understood what was actually being asked and misinterpreted the question to mean something else. If I had beta tested the survey first I likely would have noticed this abnormal result and addressed the problem appropriately.
  7. Invest some time to learn about survey design. Read some of the resources suggested at the bottom of this page.
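To make the sample-size point from item 2 concrete, here is a minimal Python sketch (my own illustration, not part of the original survey work) of the usual normal-approximation margin of error for a survey proportion. The function name and the sample sizes are simply examples.

    import math

    def margin_of_error(sample_size, proportion=0.5, z=1.96):
        # Approximate 95% margin of error for a reported proportion.
        # proportion=0.5 is the worst case; z=1.96 is the 95% z-score.
        return z * math.sqrt(proportion * (1.0 - proportion) / sample_size)

    # A long survey that only 100 people finish versus a short one that 400 finish:
    for n in (100, 400):
        print(f"n={n}: +/- {margin_of_error(n) * 100:.1f} percentage points")

In other words, a short survey that four times as many people complete roughly halves the uncertainty of each reported percentage.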

Publishing Your Results

My advice is to:

  1. Make the source questions available. People should see what questions were asked, how they were asked, and in what order they were asked. This puts the results into context and enables people to identify any biases that you may have introduced through your wording. For all of the surveys that I run I make a PDF of the survey available online.
  2. Make your analysis available. It should be as easy as possible for people to learn about the important findings of your survey, or at least what you think is important. For all the surveys that I run I make a PowerPoint presentation file available that people can reuse in their own presentations, with proper attribution, and I often include graphic images of some results which I share on my site (usually I'm using the graphics in an article somewhere online).
  3. Make the source data available. This enables people to analyze the data for themselves; they don't have to trust your analysis (which will also potentially introduce bias). For all the surveys that I run I make a CSV file of all the source data, with the exception of identifying information (due to privacy concerns), available online (one way to strip identifying columns before publishing is sketched after this list). Many surveyors will not make their survey data available because they see it as a competitive resource that they shouldn't share. My philosophy is that once I've used the data for my own purposes, usually to write an article about what's going on within the IT community, then I might as well share it with others and hopefully enable them to gain some value too.
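As a concrete example of item 3, here is a minimal Python sketch of one way to strip identifying columns from a raw survey export before publishing it. The column names and file paths are hypothetical; adjust them to match whatever your survey tool actually exports.

    import csv

    # Columns assumed to contain identifying information (hypothetical names).
    IDENTIFYING_COLUMNS = {"email", "name", "ip_address", "company"}

    def publish_anonymized(source_path, published_path):
        # Copy the raw survey export, dropping the identifying columns.
        with open(source_path, newline="") as src, \
             open(published_path, "w", newline="") as dst:
            reader = csv.DictReader(src)
            kept = [c for c in reader.fieldnames if c not in IDENTIFYING_COLUMNS]
            writer = csv.DictWriter(dst, fieldnames=kept)
            writer.writeheader()
            for row in reader:
                writer.writerow({c: row[c] for c in kept})

    publish_anonymized("survey_raw.csv", "survey_public.csv")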

Personally, I don't trust any survey results when the surveyor doesn't do all three of these things.


Known Challenges With Surveys

I would be remiss if I didn't discuss some of the known challenges with surveys. In addition to design and publishing-related difficulties, there are also a few inherent challenges with surveys which can be difficult to overcome:

  1. You will only get responses from people willing to be surveyed. The opinions of people not willing to be surveyed are important too. ;-) The bottom line is that this is one aspect of selection bias.
  2. You risk getting responses from people with strong feelings about the topic. Even the title of a survey can contribute to this problem, which is one of the reasons why I now run "State of the IT Union" surveys -- a fairly generic title that doesn't reveal what the specific topic is. These newer surveys also address several topics, not a single theme, so as to reduce the respondent drop-out rate.
  3. Very often questions capture opinions, not facts. This is perfectly fine as long as the results are presented as opinion (which can be difficult to do sometimes). For example, the 2009 Agile Practices Survey explored how people are adopting agile practices. It's fair to indicate that certain practices are believed to be more effective than others, but it wouldn't be fair to state that some practices are more effective than others (this is something better left to more specific research). However, recognize that it is possible to ask factual questions, such as the length of time respondents have been working in IT, their age, and so on (yes, they may still choose to misrepresent that information).
  4. The biases of the communities will be reflected in the results. People form communities for a reason. For example, people join the TDD mailing list because they're interested in TDD and probably even trying to learn TDD. My 2008 Test Driven Development Survey was sent out to that list because I wanted to explore what they were actually doing in practice. Because this community is biased towards TDD they wouldn't be a good source of information about TDD adoption rates, but they would be a potentially good source of information about how people are actually doing TDD in practice. This is why I indicate who each survey went out to, so that you can determine what selection bias may have been introduced.

What Does the Agile Community Think About All These Surveys?

It's really easy to run a survey using online tools such as Survey Monkey, so a lot of people do. This wouldn't be such a bad thing if the surveys provided value, were designed well, and the results were properly published. However, this often isn't the case, and as a result fewer people choose to fill out online surveys because they feel that their time is being wasted (and sadly it often is).

Common anti-patterns with agile surveys:

  1. Misguided students. A common problem with "agile surveys" occurs when university or college students are given an assignment to do some research pertaining to agile development: they often put together a survey which covers topics that others before them have previously surveyed, or they explore issues which reflect traditional (not agile) approaches to development. The students have the best of intentions but, due to lack of experience and often lack of support from their already overworked professors, they execute the survey poorly. These surveys have almost no hope of finding out pertinent information, and they inadvertently make it harder for everyone else because they annoy the people they're hoping to survey and reduce the likelihood that those people will respond to future surveys. My advice to the students is to seek some help designing your survey, both from your professors and teaching assistants as well as from the agile community.
  2. Thinly disguised marketing. Every so often a survey is sent out which is nothing more than a marketing gimmick for a consultant or product vendor. My advice is to recognize that you're not fooling anyone and, worse yet, you run the risk that all you'll accomplish is turning people off whatever it is that you're trying to sell.

Suggested Resources




