Listening To The Tests.
Ron Jeffries, in his online article (err... upcoming book) Adventures in C#: Using NUnit, sums things up oh so nicely in this statement about test-driven development:
It occurs to me as I write this that in the course of this book, you will probably encounter examples of doing things in ways that seem almost obtusely simple. I assure you that I'm not doing it to make the book suitable for third-graders, nor because I myself am also simple. I work that way because in the half-dozen years I've been doing Extreme Programming, I've been working in simpler and simpler ways, and my work seems to be getting better and better. So, please, when you see something here that looks odd, give it a try. Often I think you'll find it interesting.
I've been seriously practicing test-driven development for over a year now. It hasn't all been in an XP setting, nor does it need to be. Indeed, I think TDD and continuous integration bring a lot to the party on their own. Unfortunately, I'm afraid all too often the baby is being thrown out with the bath water.
When I'm letting tests guide my way, I almost always find something interesting that wouldn't have been illuminated had I just slapped down the code that first popped into my brain. At first it felt awkward and a bit silly: write a failing test, write just enough code to make it pass, then refactor in the safety of that test. Writing just enough code is sometimes the silly part, as I try to tease out the simplest solution. For some reason it's always tempting to speculate and, in the process, over-engineer. Yet with TDD I'm usually amazed at how little code it actually takes to build something of value.
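To make the rhythm concrete, here's roughly what one cycle might look like in JUnit - the Money class is a made-up example in the spirit of the TDD literature, not code from any real project:

    import junit.framework.TestCase;

    // Step 1: write a failing test for a hypothetical Money class.
    public class MoneyTest extends TestCase {
        public void testAddition() {
            Money five = new Money(5);
            assertEquals(7, five.add(new Money(2)).amount());
        }
    }

    // Step 2: write just enough code to make the test pass.
    // Step 3: refactor in the safety of the passing test.
    class Money {
        private final int amount;
        Money(int amount) { this.amount = amount; }
        Money add(Money other) { return new Money(amount + other.amount); }
        int amount() { return amount; }
    }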
In keeping with the TDD rhythm, I'm actually cranking out more code than I did without testing. And I know it's always working the way I intended. And the designs tend to be more responsive to change than when I coded only after dreaming up a pattern-compliant design.
I had to rewire my brain by listening to the tests. I'm sure glad I did.
[Mike Clark]
Agreed. TDD is an awesome way to write software.
I'm also a fan of Test-Driven Debugging. After a piece of software has been developed and a customer or user finds an issue with it, try writing JUnit tests to replicate the bug. This is a useful, productive exercise on a few levels.
Your first few attempts at writing a test that recreates the bug might actually not fail - in which case, hey, you've just added another working test to your test suite, which is useful work anyway. You also often end up with a better understanding of the code and its tests - it's a useful time to review the existing tests, maybe enhancing and extending the ones you've got.
Eventually you should be able to write a JUnit test that does indeed replicate the bug, and that test should indeed fail. Along the way you may find a few related bugs lurking in the code. Now you've got tests that tell you whether you've really fixed the bug, and those same tests ensure the bug never comes back.
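As a sketch of the shape such a test might take - the Order class, the bug, and the issue number are all made up for illustration:

    import junit.framework.TestCase;

    // Hypothetical bug report: the discount gets applied twice on large
    // orders. Capture the report as a test; it should fail until the fix
    // goes in, then stay in the suite as a regression test.
    public class Bug1234Test extends TestCase {
        public void testDiscountAppliedOnlyOnce() {
            Order order = new Order(200.0);
            order.applyDiscount(0.10);
            assertEquals(180.0, order.total(), 0.001);
        }
    }

    // Simplified stand-in for the production class under test.
    class Order {
        private double total;
        Order(double total) { this.total = total; }
        void applyDiscount(double rate) { total -= total * rate; }
        double total() { return total; }
    }

Naming the test class after the issue number also makes it easy to trace the test back to the original report.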
The great thing about this approach (TDDebug?) is that by investigating user issues you end up improving your suite of test cases, helping to make your software even more reliable.
What with TDD, TDDebug and commons-logging, I rarely see the need for debuggers any more. All that debugger work just to get to a certain point in the execution of your code feels like wasted time when writing tests is such quick, fruitful and reusable work.
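For reference, the commons-logging calls are about as simple as logging gets - InventoryService here is just a made-up class for the example:

    import org.apache.commons.logging.Log;
    import org.apache.commons.logging.LogFactory;

    public class InventoryService {
        private static final Log log = LogFactory.getLog(InventoryService.class);

        public void reserve(String sku, int quantity) {
            // Guard the debug call so the message string isn't built
            // when debug logging is switched off.
            if (log.isDebugEnabled()) {
                log.debug("Reserving " + quantity + " of " + sku);
            }
            // ... reservation logic would go here ...
            log.info("Reserved " + quantity + " unit(s) of " + sku);
        }
    }

A log statement left behind after a debugging session keeps paying off, where a breakpoint has to be set up again every time.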
Another idea - if you use TDDebug then you can link the test cases to your bug tracker, for example by naming each test after the issue it reproduces (as in the Bug1234Test sketch above).