While working on pynocle some time ago, I found myself drifting away from TDD and back to the more traditional “run-debug-fix” pattern: write code you think is correct, run it to see if it is, and if it isn’t, set a breakpoint to see what’s wrong, change the code, and repeat until there are no problems.
While this can often be the quickest way to get something working, it always comes back to bite you. I’m happy that I’ve gotten to a point with TDD where I notice this behavior and it makes me feel dirty. Though not always dirty enough to stop, especially if I’m in a difficult-to-test environment that depends on modules I can’t run from pure python.
The problems with run-debug-fix are many.
- The code you are writing is difficult enough that you didn’t write it correctly the first time. So what makes you think you or someone else is going to have an easy time debugging or understanding it in the future?
- If the bug was logical, there was obviously some context, state, or situation you had not thought of. How can you be sure you will remember this context or situation when you change the code in the future? How can you communicate that your code is relying on a certain state somewhere else?
- If your design is not testable, you are making it even less testable by adding more implicit logic where you’re fixing the bug: implicit logic that is going to be very difficult to test for when you come back later and have forgotten about it.
- Most importantly: every bug you fix or feature you add using run-debug-fix is a doubly negative activity. -1 for the reasons above and -1 for the missed opportunity to add a test. It would be better to leave the bug there or delete the offending code entirely. You are increasing the complexity of your software by supporting another code path that did not previously work or exist, instead of increasing the stability of your software by adding tests.
The amount of time you spend under the debugger is inversely proportional to the quality of your software.
I used to pride myself on being able to quickly debug and fix problems in my or other people’s code. I now take far more pride in having code that is well tested so that other people can fix problems without spending a long time debugging them.
For the past several weeks I’ve been introducing TDD and a focus on unit testing to the TA group at work. Well, I introduced it months ago, but am only now convincing (forcing) people to do it. This can be an imposing subject for people who have spent their entire careers scripting inside of Maya. I think I’ve finally figured out the easiest way to ease people into TDD and unit testing, in a way that is both easy to do and demonstrates immediate benefit (and in fact I’ve done it successfully with two people already).
Writing data/content validation.
Writing validation routines for data follows the TDD paradigm to a T. You just need to explain to people how to write the tests first, and then run those tests from their IDE. So instead of writing some complex function filled with if-checks, one that needs comments to explain everything it tests for and is constantly breaking, you just write your battery of tests for valid and invalid content. Then you run them, and keep improving your validation method(s) until the tests pass. Then, in true TDD style, you only go and refactor the validation code itself if you need to, with confidence that you aren’t breaking things. And when you think of more things to validate, you add more tests, with no risk of regression.
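A minimal sketch of what this looks like in practice, assuming a made-up naming convention for Maya mesh nodes (the convention, the `validate_mesh_name` routine, and all the example names below are invented for illustration). The tests are the battery of valid and invalid content; the validation function only exists to make them pass:

```python
import re
import unittest


def validate_mesh_name(name):
    """Return True if a mesh name matches a hypothetical convention:
    lowercase words separated by single underscores, ending in an
    LOD suffix like _lod0."""
    return bool(re.match(r'^[a-z]+(_[a-z]+)*_lod\d+$', name))


class TestValidateMeshName(unittest.TestCase):
    # Written first, before validate_mesh_name had a body.
    # Each new convention rule becomes another test, never a comment.
    def test_valid_names_pass(self):
        self.assertTrue(validate_mesh_name('hero_body_lod0'))
        self.assertTrue(validate_mesh_name('prop_lod2'))

    def test_invalid_names_fail(self):
        self.assertFalse(validate_mesh_name('HeroBody'))    # wrong case
        self.assertFalse(validate_mesh_name('hero_body'))   # no LOD suffix
        self.assertFalse(validate_mesh_name('hero__lod0'))  # double underscore


if __name__ == '__main__':
    # exit=False so this can also run inside an embedded interpreter.
    unittest.main(exit=False)
```

When a naming rule changes or a new one is discovered, the workflow is the same: add or adjust a test case first, watch it fail, then change the regex until the whole battery is green.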
TDD is tailor made for data/content validation.
Walk them through the first few tests yourself to get them set up, then have them write the rest of the tests for a validation routine. It will also be very clear whether or not they ‘get it’.
Once they get it, there are some other areas that TDD can be applied to with almost as much ease that I’ll go over next time.