The Average Time To Green Game

Something special happened last week. I was in Bangalore doing some training at the request of my good friend Olve Maudal of Tandberg (now part of Cisco). A day on Test Driven Development was scheduled for Monday, and I fell asleep Saturday night thinking about how to really get across the idea and nature of TDD to a group of developers. I woke up at 2am Sunday morning with The Average Time To Green Game pre-formed and named, ready in my head!


Game setup

  1. Each computer is given a label (e.g. Alligators, Bears, Cheetahs, etc.).
  2. Minimum of two people per computer.
  3. Each computer must have a TDD framework installed (better still, use CyberDojo).

Game play

  1. Every 10-15 minutes ring the Average Time To Green Bell (we found a small brass bell in a local shop).
  2. When you ring the bell you also start a timer and project it so everyone can see it. This timer starts at zero and increments second by second.
  3. The aim at each computer is then to get to green (all tests passing).
  4. When a computer has got to green, two things are recorded for that computer: the iteration number, and the time it took to get to green since the bell (simply read the projected timer).
  5. When your computer is green you have to cover your laptop (we provided sheets of green paper) and wait until all the computers have got to green.
  6. The data for all computers is recorded in a spreadsheet.
  7. When all computers are at green everyone briefly looks at the graph made from the spreadsheet.
  8. Then you have to swap partners and computers and a new iteration starts.

Game Goal

The goal of the game, which was clearly and explicitly printed on the instruction sheets, was simply to control the average time to green across the whole group. The group naturally didn't understand that at first - they focused instead on the problem. The problem itself was deliberately trivial - Olve and I picked stripping backslash-newline pairs off a character buffer, as in C/C++ preprocessor logical lines. It was utterly fascinating to watch how things progressed, and we feel it worked really well (and, more to the point, I think the participants did too), both in the TDD sense and in the team-building sense.

Game photos

  1. Graph of the average time to green over several iterations.
  2. Helping to solve one computer that was holding the team up.
  3. Total number of tests passing split by computer.
  4. Relaxing at the end.

Game retrospective

  1. The group were all well above average ability, so we could perhaps have run fewer iterations, or used a less trivial problem.
  2. We could have used staged goals. First measure the average time to green, then lower it, then control it.
  3. Once the group felt they had control of the average time to green we could have let them choose their own goals.
  4. We could have encouraged participants to write down their choice of strategies and their experience of pair programming.


Here's a follow-up blog entry.