Let's get something out of the way right up front. You might have extensive experience with test-driven development (TDD). You might even practice it routinely and wear the red-green-refactor cadence like a comfortable work glove. For all I know, you could be a bona fide TDD expert.
If any of that describes you, then you probably don't actually misunderstand TDD. Not surprisingly, people who become adept at the practice and live and breathe it tend to get it. But if that introductory paragraph doesn't describe you, then you probably have some misconceptions.
I earn my living doing a combination of training and consulting. This affords me the opportunity to visit a lot of shops and talk to a lot of people. And during the course of these visits and talks, I've noticed an interesting phenomenon. Ask people why they choose not to use TDD, and you rarely hear a frank, "I haven't learned how."
Instead, you tend to hear dismissals of the practice. And these dismissals generally arise not from practiced familiarity, but from misunderstanding TDD. While I can't discount the possibility that informed dismissals exist, I can say that I've never personally witnessed someone demonstrate an expert understanding of the practice while also dismissing its value. Rather, people base the dismissal on misconception.
So if you've decided up-front that TDD isn't for you, first be sure you're not falling victim to one of these misunderstandings.
People Do TDD To Avoid Architecture and Design
I'll start with what usually constitutes a borderline willful misunderstanding. In other words, I think people who espouse this idea often present it as a straw man argument more than a genuine misunderstanding. They say that TDD means you can't reason up front about software architecture or design. Then they say they need to reason up front about these things, so they don't do TDD.
To the extent that people genuinely misunderstand this, I suspect any confusion arises from the occasional use of "test-driven design" as a synonym for test-driven development. It might also come from people who practice TDD offering the YAGNI ("you aren't gonna need it") acronym as guidance.
But make no mistake. In terms of design and code, TDD only requires that you create a failing test before writing production code. How much planning you do before writing that code (and the preceding failing test) remains your business. And for non-trivial projects, you should definitely do some planning.
You Write All of Your Tests Before Writing Any Code
Going in order of ease of misunderstanding, I'll move on to one less likely to happen willfully. I'm talking about misunderstanding TDD by believing it requires you to write all of the tests for your class or module before writing any code.
Usually, I encounter this after discussing the practice with someone for a bit. They'll acknowledge the potential benefits and talk about maybe someday making time to learn. And then they'll talk skeptically about the potential waste of writing a bunch of tests that might not wind up being needed. I'll press at this point and realize they think TDD involves writing out every possible test case the way QA might write out a test plan.
At this point, I surprise them by agreeing with them. Anyone doing that would be wasting time! Luckily, TDD doesn't call for this. You write one test at a time -- only the test that expresses what you want your code to do, but that it doesn't yet do.
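To make that concrete, here's a minimal sketch in Python. The `ShoppingCart` class and its behavior are hypothetical, invented purely for illustration -- the point is that you write exactly one test for the next behavior you want, watch it fail, and only then write the code:

```python
# Hypothetical example -- the ShoppingCart class and its behavior
# are invented for illustration, not taken from any real codebase.

class ShoppingCart:
    """Production code. Only enough of it exists to pass the one test
    below, and it was written AFTER that test, to make it pass."""

    def __init__(self):
        self._prices = []

    def add(self, price):
        self._prices.append(price)

    def total(self):
        return sum(self._prices)


# The ONE test driving the current change -- not an exhaustive suite.
# When first written, ShoppingCart didn't exist, so this test failed.
def test_total_of_single_item_is_its_price():
    cart = ShoppingCart()
    cart.add(5)
    assert cart.total() == 5
```

The next test -- totaling two items, say, or an empty cart -- gets written only when you're ready to drive the next change.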
TDD Replaces QA
Now we arrive at forms of misunderstanding TDD that I understand completely. In fact, I even see some relatively novice practitioners of TDD hold these misconceptions at times. Take, for instance, the idea that a development team can practice TDD and thus replace the need for QA personnel.
When I encounter this belief in a team or department, it usually creates needless tension around TDD. Management misguidedly eyes it as a potential cost savings measure, while QA understandably views it as an existential threat. Calm down, everyone. TDD does not, in any way, replace the need for QA work.
Using TDD to develop helps steer a good modular design, and it leaves a safety net of regression tests in its wake. But it (frequently) doesn't provide the kinds of user-oriented, system-wide tests that a QA group would execute. And, it obviously can't perform exploratory testing. Everyone's job is safe.
TDD Provides an Exhaustive Set of Unit Tests
The last section dovetails nicely into this one. Just as misunderstanding TDD can lead teams to believe it replaces QA, it can also lead them to believe it produces a bullet-proof test suite. But it does not.
I frequently encounter this misunderstanding when teams have begun to adopt the practice. They test drive their code for a while, building out a nice test suite. And then, it strikes -- some kind of bug in a higher environment. "Why didn't the TDD tests catch this?! I knew this was a waste of time!"
While I completely understand the frustration, it's misplaced when directed at TDD. Driving your development with tests means that you write the tests needed to finish your development. TDD does not call for writing tests for every conceivable boundary case or for every conceivable input. It certainly does not involve things like load tests and smoke tests.
This sort of testing is important, and you do need it. But it represents an activity separate from TDD, in which you write tests not intended to drive changes in production code.
TDD Is Primarily a Testing Activity
This last TDD misunderstanding might spark some spirited discussion. I suspect that even some experienced TDD practitioners might disagree with my take here. But I'll offer it nonetheless. I submit that TDD is not, primarily, a testing activity.
To understand my reasoning, consider the name itself: test-driven development. Notice that they don't call it something like development-focused testing. The actual name, and the process that flows from it, involves using tests as a driving aid for software development. With test-driven development, you follow a software development methodology that happens to produce a nice unit test suite as a byproduct. You could toss all of the tests after writing them, keep only the production code, and still realize benefits in terms of design flexibility. Obviously, I don't recommend this approach, since the tests have value.
But whether you take issue with what I say here or not, bear it in mind as you practice TDD or socialize it with others. It helps clear up the other misunderstandings that call for TDD to address all of the department's testing needs in one fell swoop.
Avoid Misunderstanding TDD: What It Really Involves
I've spent a good bit of time now talking about what TDD isn't. So, I'll close by talking about what it actually is. Taken from the link at the start of the post, consider Uncle Bob's "three laws" of TDD.
- You are not allowed to write any production code unless it is to make a failing unit test pass.
- You are not allowed to write any more of a unit test than is sufficient to fail; and compilation failures are failures.
- You are not allowed to write any more production code than is sufficient to pass the one failing unit test.
Adhering to these rules defines a process abbreviated as "red-green-refactor." You see a way in which you want the production code to change -- a shortcoming. Before you can address the shortcoming, you invoke law 1 and express the shortcoming with a failing test -- the simplest test you can write (law 2). This defines the "red."
With that out of the way, you move on to green. Invoking law 3, you do the simplest thing you can to get your erstwhile failing test to pass while keeping all of your other tests passing. This defines "green."
And, finally, when all tests pass, you can refactor your code as long as all tests stay green, defining the "refactor" part. (Refactoring constitutes a different activity than "write production code" from law 1, so you refactor with your existing test suite.)
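One full cycle can be sketched in Python as follows. The `is_leap_year` function and its tests are hypothetical, chosen only to illustrate the protocol:

```python
# One red-green-refactor cycle, sketched with a hypothetical
# leap-year function (the function and tests are invented here).

# RED: the simplest failing tests you can write (laws 1 and 2).
# When first written, is_leap_year didn't exist, so they failed.
def test_year_divisible_by_4_is_leap():
    assert is_leap_year(2024) is True

def test_ordinary_year_is_not_leap():
    assert is_leap_year(2023) is False

# GREEN: the simplest production code that passes (law 3). Note that
# we do NOT handle the century rule -- no failing test demands it yet.
def is_leap_year(year):
    return year % 4 == 0

# REFACTOR: with everything green, rename or restructure freely.
# Refactoring changes no behavior, so no new test is required first.

test_year_divisible_by_4_is_leap()
test_ordinary_year_is_not_leap()
```

Notice how law 3 keeps you honest: the century rule stays unwritten until some future failing test forces it into existence.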
That's it. As you can see, TDD makes no sweeping proclamations about QA strategy and it offers no opinion on issues like architecture. It just gives you a simple protocol for writing and cleaning your code.
But with that simple protocol comes a great deal of power and effectiveness. So it's worth working through misunderstandings to get there.
To learn how to implement TDD methods in your own workflow, check out our Test-Driven Development Workshop.