Wednesday, November 30, 2011

Making the undesirable desirable - a lesson from Tom Sawyer

Today's Google home page celebrates Mark Twain's birthday with images that tell the story of Tom Sawyer convincing his friends to whitewash the fence for him. It is a classic example of making the undesirable desirable. You can read more about the story on Wikipedia, and here are some pictures of the story courtesy of Google:



Tomorrow an article I wrote on Agile Adoption will be published by InfoQ. One of the strategies and patterns for change in that article is to "make the undesirable desirable". Here is an excerpt from the article:
"A colleague of mine recently experienced some resistance from his team when he asked them to try pair programming. Instead of forcing them to do it, he simply asked ”What would it take to get you to try it?” When they joked that they would gladly try it if they had a big screen TV to use for paired programming, he quickly obliged and the rest was history. The new behaviour became fun – it became desirable."
Although Tom used this strategy for selfish gain, it can also be used to effect positive change in your organization. You can find more strategies and examples useful for your agile adoption in the InfoQ article.

Monday, November 14, 2011

An alternative to bug tracking tools

One of my pet peeves is working in and with bug tracking tools. I am well aware of the arguments for the importance of these tools, and I am not trying to address them here. Instead, I'll show you an example of an alternative that I have found useful.

First, using an approach like Specification by Example can reduce the need for bug tracking tools because communication goes up and defect counts go down. But even with this technique, defects still occasionally occur. Here is an example of how to use Specification by Example not only for 'requirements', but also for defects.

In June of this year I spoke at the Prairie Developer's Conference in Regina, Saskatchewan. Some of the speakers and volunteers were involved in creating the website, services, and mobile application for that conference. Since I was doing a Specification by Example talk, I decided to use the conference web services to illustrate how easy it is to create your first automated test against a web service using FitNesse. As I was working with the services I found a small defect. Instead of writing up a defect in a bug tracker with the steps to reproduce it, I wrote a test in FitNesse to confirm the defect:


This example calls a service that returns a list of sessions and does a few basic C# calculations on that list. It counts the number of sessions in the conference (allSessions.Count), which FitNesse maps to the NumberOfSessions variable above. It then counts the number of unique abstracts (allSessions.Abstracts.Distinct.Count), which FitNesse maps to the Number of Unique Abstracts variable. FitNesse then compares the two numbers and displays the results. In this case, 63 does not match 62, so it displays an error with the expected and actual values as shown above.
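
The FitNesse page itself only appears as a screenshot, but the fixture behind a test like this can be very small. Below is a minimal sketch of what such a fixture might look like in C#. It assumes a FitNesse Slim decision table backed by fitSharp, and the Session, SessionServiceClient, and GetAllSessions names are placeholders for the real conference service, not the actual code from the talk.

    using System.Collections.Generic;
    using System.Linq;

    // Placeholder for one session returned by the conference service.
    public class Session
    {
        public string Title { get; set; }
        public string Abstract { get; set; }
    }

    // Placeholder for the real web service client used by the conference site.
    public class SessionServiceClient
    {
        public List<Session> GetAllSessions()
        {
            // The real fixture would call the conference web service here.
            return new List<Session>();
        }
    }

    // FitNesse Slim decision-table fixture: column headings ending in '?'
    // (for example 'number of sessions?') map to these methods, so the
    // wiki table can compare the two counts.
    public class SessionAbstractCounts
    {
        private readonly List<Session> allSessions =
            new SessionServiceClient().GetAllSessions();

        public int NumberOfSessions()
        {
            return allSessions.Count;
        }

        public int NumberOfUniqueAbstracts()
        {
            return allSessions
                .Select(session => session.Abstract)
                .Distinct()
                .Count();
        }
    }

With a fixture like this, the wiki page only needs a small table of column headings and expected values, and the expected-versus-actual colouring you see above comes for free from FitNesse.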

Once the test confirmed the defect, I simply communicated the failing test to the developers. When the developers reviewed the test they could clearly see what the problem was. No back and forth was required to understand the issue or to confirm the steps required to reproduce it. No one had to set the status of the defect to "working", "fixed", "duplicate", "resolved", "more information required", or anything else. One of the developers fixed the issue and even added an additional service that we could call to address the root cause - Are Session Abstracts Unique? I added the new test, ran all my tests again and was pleased to see them all go green.
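
Again as a rough sketch only, reusing the placeholder types and usings from the fixture above, the new check might be surfaced to FitNesse as a single yes/no question. In the real test this method would simply call the new Are Session Abstracts Unique service rather than recompute the answer itself.

    // Hypothetical fixture for the follow-up check. It reuses the placeholder
    // SessionServiceClient above; the real version would call the new
    // 'Are Session Abstracts Unique?' service directly.
    public class SessionAbstractUniqueness
    {
        public bool AreSessionAbstractsUnique()
        {
            var abstracts = new SessionServiceClient()
                .GetAllSessions()
                .Select(session => session.Abstract)
                .ToList();

            return abstracts.Count == abstracts.Distinct().Count();
        }
    }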


This process improved communication between tester and developer, ensured that the defect would be tested for on every run, and kept us from spending unnecessary time in a bug tracking tool.