03 August 2012

The ROI of Test-Driven Development

Leadership wants to know:  "Why should my teams be doing TDD?  What is the return on investment?"

Though TDD is, of course, not a "silver bullet," we've found that TDD solves a number of key software development challenges, including those that are exposed by the frequent releases and high visibility expected of an Agile team.

(For background on these challenges, read about Technical Debt and the Agilist's Dilemma. To grasp the root systemic problem, read my "new" twist on the old Iron Triangle.)

I'll outline some benefits, and some costs.  This is probably not a complete list, but represents the most significant benefits and costs that I've observed.  Where appropriate, I've summarized in a bold, italicized sentence, so you can absorb the main points quickly, or delve into each item in detail, depending on your current needs. 

The Biggest Benefits of Test-Driven Development


Defect Reduction

Functionality that doesn't function provides no business value.  TDD allows us to start with high quality, thus providing real value.

Often the most immediate and obvious detectable benefit of TDD is the reduction of defects (60% to 90% fewer, according to the paper describing the IBM and Microsoft case studies).

The key to this benefit is the creation of faster feedback loops. If a developer can find a mistake instantly, before check-in, then this is typically weeks (or months) before the bug would otherwise be identified by traditional testing efforts or code reviews. This early detection and elimination of defects avoids the waste of rework involved in finding and fixing a bug, plus having to assign a developer to do the work, and to re-familiarize himself with that region of code.

TDD reduces both (1) defects caused by guessing (incorrectly) that a particular implementation will behave as expected, and (2) defects caused by adjusting design to add new functionality to existing code.  A freshly-written test will catch the former before it's ever checked-in, and the existing suite of previously-written tests will prevent the latter.

When a defect is found (they will still happen), we add any missing tests that describe the defect. We do this even before we fix the defect. Once fixed, that defect can never resurface.
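As a minimal sketch of pinning a defect before fixing it, suppose a bug report shows that a (hypothetical) `parse_quantity` helper crashes on whitespace-padded input. The function name and scenario here are invented for illustration:

```python
# Hypothetical production code. The defect was a crash on padded input
# like " 3 "; it's shown here already fixed, after the new test drove the fix.
def parse_quantity(text):
    """Parse a quantity string such as ' 3 ' into an int."""
    return int(text.strip())  # .strip() is the one-line fix the test demanded

# This test was written FIRST, while the defect still reproduced.
# It failed (red) against the buggy version, passed once fixed,
# and stays in the suite so the defect can never silently resurface.
def test_parse_quantity_tolerates_surrounding_whitespace():
    assert parse_quantity(" 3 ") == 3
```

The order matters: writing the test first proves it actually reproduces the defect, so a passing run afterward means something.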

Faster Feature Time to Market (aka Cycle Time)

TDD allows developers to add innovative features as rapidly as the founder did while hacking in her garage, without damaging the original investment.
Comprehensive testing and rapid feedback provide complete confidence that any changes to the code (for new features or to improve the design) can be made swiftly, without breaking existing functionality.

When teams don't have to worry about breaking stuff, they can add new behaviors and features much faster.  This gives us a shorter cycle time for innovative features.

I have a number of first-person stories (which I'll share in a future post) where the teams I've worked on (as a contributing XP coach) have been able to turn around a "major architectural change" or "drastic re-purposing" in less than a week.  Had we not been following our TDD engineering practices, we believe the estimates would have been measured in months, not days.

In my experience, most of this benefit is taken for granted (after all, what do we have to compare it to?). Yet even before the big, delightful surprises have happened, the whole team (including Product) often notices a subtle shift towards a sense of ease and pride in the team's productivity and the quality of the delivered software.

Improved Focus

TDD helps developers think clearly about the task at hand, without having to continuously hold the whole complex system in their heads. They avoid over-extrapolation and over-design.

"Thinking in tests" allows developers to craft their code by (1) stating a goal for the code (as a new test), (2) confirming that this is indeed something new, (3) implementing just enough to get it to work correctly, (4) reshaping the design to fit well within the rest of the body of code.  This approach gives them opportunity to reflect on needed behaviors and edge cases without over-engineering.
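The four steps above can be sketched with a deliberately tiny, hypothetical example (the shipping-cost scenario and all names are illustrative, not from any real codebase):

```python
# Step 1: state the goal for the code as new tests, written first.
def test_orders_at_or_over_threshold_ship_free():
    assert shipping_cost(order_total=60) == 0

def test_orders_under_threshold_pay_flat_rate():
    assert shipping_cost(order_total=30) == 5

# Step 2: run the suite -- the new tests fail (red), confirming
# this behavior is genuinely new and not already covered.

# Step 3: implement just enough to make them pass (green),
# with no speculative flexibility beyond what the tests demand.
FREE_SHIPPING_THRESHOLD = 50
FLAT_RATE = 5

def shipping_cost(order_total):
    """Flat-rate shipping, waived at or above the free-shipping threshold."""
    return 0 if order_total >= FREE_SHIPPING_THRESHOLD else FLAT_RATE

# Step 4: reshape the design -- here, magic numbers were extracted into
# named constants, with the passing tests confirming nothing broke.
```

Each pass through the cycle is small; the edge cases (what about exactly 50?) surface naturally as the next test to write.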

Additionally, developers avoid the frequent context-switching incurred by bug-hunting, and recoup a tremendous amount of time typically spent in the debugger.  During the entire year of 2002, working full-time on a single product, I recall our whole team firing up the debugger only once or twice (averaging once every 6 months).  Compare that to a developer I met who claimed he spent 80% of his time in the debugger prior to TDD.  Even if that's an exaggeration, his rough estimate suggests a large amount of wasted time spent bug-hunting.

Parallel Efforts Without Conflict

Since, with disciplined TDD, each developer (or pair of developers) is expected to run all of the tests in the regression suite and confirm that 100% pass before committing new changes, there is much less opportunity to disrupt or destroy someone else's hard work.

If your team is larger than two developers, then they will likely be working simultaneously on different, but possibly related, areas of the system.  Without a discipline of writing and running comprehensive, pinpoint-accuracy tests, we often damage a prior check-in when we merge or integrate files.  Some teams will instead create a complex tree of code-repository branches, which is really just a way of kicking the can down the road.

The Costs of Implementing a TDD Discipline

As with any investment, you will want to consider costs.  And TDD, like any worthwhile discipline, has its price.

More Test Code

"Twice as much code!" people often protest. Yes, unit-test code is code, and needs to be written by a developer.  Also, test-code tends to be more script-like and verbose. It is, after all, a form of developer-to-developer communication.

Actually, unit-testing code using TDD tends to result in a 2:1 ratio between test code and production code.

"Three times as much code?!" Not exactly...

If you are concerned that so much test code will create a bottleneck, you will be happy to know that TDD tends to reduce the amount of production code it takes to solve a business problem.  TDD reduces duplication by allowing teams to avoid copy-paste coding techniques and stove-pipe solutions.  (A "stove-pipe solution" is one where each activity, page, or feature has its own full set of code at each architectural layer.) Teams that do not do TDD may rely on copy-paste in order to avoid breaking anything that already works, or to avoid interfering with another developer's parallel efforts.

So, though the ratio between test code and production code may be 2:1, it may not really be three times as much code as would otherwise be written.

Besides, in knowledge work, typing speed is not your limiting factor.

Learning Curve

At first there will be a slowing of delivery of new functionality, as the team gets used to using the TDD practice, and as they pay down small high-risk portions of the technical debt.

The Microsoft/IBM study noted an initial slowdown.  This is true whenever we begin a new discipline.  Think about starting a new exercise regimen, flossing your teeth, or learning the piano: At first, practicing slows you down, it hurts, and it may not seem worth the effort.

After a few months, teams start to rely on their safety-net:  That comprehensive suite of unit tests.  This gives them courage to make bold moves, and to check that those moves haven't reduced the value of the software.

Developers report useful feedback, and an increase in "flow," in short order: usually less than 30 days. (Often, a majority of participants in my 3-day Essential Test-Driven Development course will report that a single surprising test-failure, or a well-tested refactoring, helped them see how TDD is a superior way to craft code.)

Business results can take a little longer: TDD reduces both old and new technical debt, and allows new, even unexpected, functionality to be added rapidly. It may take some reflection on a smoother release to market, or the occurrence of a very short cycle time, before the business is aware of this ROI.

TDD may at first seem like an exotic and esoteric discipline, and possibly quite challenging.  Actually, people find TDD easier to learn than good "test-after" unit-testing. I like to say learning TDD is like learning to walk on your feet.  You've been taught, in college or on the job, that walking on your hands is "the way it's done in our industry."  I spent 13 years writing code before testing, and 13 years doing TDD. I find TDD to be more natural for my scientific/engineering/inquisitive mind, and far more satisfying.

I've trained and coached hundreds of teams on how to "walk on your feet" using TDD. The approaches that have been most effective:
  • My 3-day Essential Test-Driven Development course followed (at some point) by a day of coaching around a team's specific technologies, existing code, and other challenges.
  • -or- At least eight days of intensive coaching with a dedicated, enthusiastic (or desperate) team. I work with pairs of developers on their development tasks.  Sometimes we're writing characterization tests for legacy code, sometimes we're testing the "untestables," and sometimes I'm providing a half-technical, half-emotional system of support that allows people to take courageous steps that they would otherwise never take.
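A characterization test, mentioned above, deserves a quick sketch. It does not assert what legacy code *should* do; it records what the code *actually* does today, so later refactoring can proceed without silently changing behavior. The function below is invented for illustration:

```python
# Hypothetical legacy function whose exact behavior nobody fully remembers.
def legacy_format_name(first, last):
    if not last:
        return first.upper()
    return last + ", " + first

# A characterization test pins down CURRENT behavior, surprises and all.
# We write it by running the code, observing the output, and asserting
# exactly that -- even the parts that look like bugs.
def test_characterize_existing_name_formatting():
    assert legacy_format_name("Ada", "Lovelace") == "Lovelace, Ada"
    assert legacy_format_name("Ada", "") == "ADA"  # surprising, but current behavior
```

Once a net of characterization tests is in place, the team can refactor (or deliberately fix) the legacy code with the tests reporting any unintended change.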
Some teams have done it on their own.  They'll hire experienced, "test-infected" developers to cross-pollinate the team over time.  Or they'll read the books and coach each other over time.  These have the added cost of creating extra technical debt until the whole team is doing TDD in a disciplined manner.

Test Maintenance

The test code does have to be maintained, and with the same relentless discipline as the production code.  But if it's done as ongoing refactoring towards maintainability (i.e., writing the next test), it can be managed such that it never grows into a big mess.

Teams that refactor regularly (every two or three minutes) find that both the tests and the code behave themselves; they never grow into monsters.
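As one small illustration of refactoring test code itself (names invented for the example): setup that was repeated across several tests gets extracted into a shared helper, exactly as duplication would be removed from production code.

```python
# After refactoring: the once-duplicated multi-line setup now lives in
# one test helper, maintained with the same discipline as production code.
def make_cart(*prices):
    """Test helper: build a cart dict with the given item prices."""
    return {"items": list(prices), "total": sum(prices)}

def test_total_sums_item_prices():
    cart = make_cart(3, 4)
    assert cart["total"] == 7

def test_empty_cart_totals_zero():
    assert make_cart()["total"] == 0
```

Small, frequent extractions like this are what keep a two-to-one body of test code from growing into a monster.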

In fact, teams who have been following the discipline for a year or two have reported to me that they spend a majority of their time and efforts on test maintenance.  Though this sounds horrific (and expensive) on the surface, they are really saying that they need only expend minimal effort on refactoring or designing their production code.  These teams are, truly, "Test-Driven."

A Good Investment?

Each manager/exec/leader will need to do the cost-benefit analysis for her own teams.

In my 26 years of experience--half with teams building software prior to TDD and half with teams doing TDD wholeheartedly--I find this one discipline to be the most potent practice in the Agile Engineering toolbox. I am thoroughly convinced that a focus on quality leads to real productivity, and that "thinking in tests" ("test-driven" and "behavior-driven" are other ways of saying this) leads to clarity of requirements, just enough engineering, and the highest levels of quality.

Let your teams try walking on their feet again. You'll be pleased with the results.