Increasing testability when coding with the Bold for Delphi framework


Background

I work in a team of 7 developers and 2 testers on a logistics system. We use Delphi 2007 and model-driven development with Bold for Delphi as the framework. The system has been in production for about 7 years now and has about 1.7 million lines of code. We release to production every 4-5 weeks, and after almost every release we have to patch bugs we didn't find. This is of course irritating both for us and the customers.

Current testing

The solution is of course more automated testing. Currently we test manually. We have a Testdbgenerator that starts with an empty database and adds data via the modelled methods, and we have TestComplete running some very basic scripts for testing the GUI. Lack of time stops us from adding more tests, and the scripts are also sensitive to changes in the application. Some years ago I really tried unit testing with DUnit, but I gave up after a few days: the units are too tightly coupled.

Unit testing preconditions

I think I know some preconditions for unit testing:

  • Write small methods that do one thing, but do it well.
  • Don’t repeat yourself.
  • First write a test that fails, then write the code so the test passes.
  • The connections between units should be loose. They should not know much about each other.
  • Use dependency injection (see the sketch below).
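
As an illustration of the last two points, here is a minimal sketch of constructor injection in Delphi. The names (IOrderRepository, TOrderService) and the business rule are hypothetical, not taken from the real system:

    unit OrderServiceExample;

    interface

    type
      // Hypothetical abstraction: the service only knows this interface,
      // not Bold or the database behind it.
      IOrderRepository = interface
        ['{A4B1C2D3-0FA1-4E52-9D3B-7C2E61D84F10}']
        function OpenOrderCount(const ACustomerId: string): Integer;
      end;

      TOrderService = class
      private
        FRepository: IOrderRepository;
      public
        // The dependency is injected through the constructor, so a test can
        // pass in a fake instead of a database-backed implementation.
        constructor Create(const ARepository: IOrderRepository);
        function CanAcceptNewOrder(const ACustomerId: string): Boolean;
      end;

    implementation

    constructor TOrderService.Create(const ARepository: IOrderRepository);
    begin
      inherited Create;
      FRepository := ARepository;
    end;

    function TOrderService.CanAcceptNewOrder(const ACustomerId: string): Boolean;
    begin
      // Hypothetical business rule, kept free of persistence details.
      Result := FRepository.OpenOrderCount(ACustomerId) < 10;
    end;

    end.

In a DUnit test, a hand-written fake implementing IOrderRepository can be passed in instead of a Bold-backed repository, so the rule can be tested without touching the database.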

Framework to use

We may upgrade to Delphi XE2, mainly because of the 64-bit compiler. I have looked at Spring a bit, but it requires moving off D2007, and that will not happen now. Maybe next year.

The question

Most of the code is still not tested automatically. So what is the best path for increasing the testability of old code? Or is it best to start writing tests for new methods only? I'm not sure what the best way to increase automated testing is, and comments about it are welcome. Can we use D2007 + DUnit now and then easily change to Delphi XE2 + Spring later?

EDIT: The current methodology for manual testing is just to "pound on it and try to break it", as Chris calls it.


There are 3 answers

Rob Kennedy (BEST ANSWER)

You want the book by Michael Feathers, Working Effectively with Legacy Code. It shows how to introduce (unit) tests to code that wasn't written with testability in mind.

Some of the chapters are named for excuses a developer might give for why testing old code is hard, and they contain case studies and suggested ways to address each problem:

  • I don't have much time and I have to change it
  • I can't run this method in a test harness
  • This class is too big and I don't want it to get any bigger
  • I need to change a monster method and I can't write tests for it

It also covers many techniques for breaking dependencies; some might be new to you, and some you might already know but just haven't thought to use yet.
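
One of those techniques is to subclass and override: extract a hard-wired call (database, file system, web service) into a protected virtual method, then override that method in a testing subclass. A minimal sketch with hypothetical names:

    unit SubclassAndOverrideExample;

    interface

    type
      // Legacy-style class that talks to the database directly.
      TInvoicePrinter = class
      protected
        // The database call is pulled out into a virtual method - the "seam".
        function LoadInvoiceTotal(AInvoiceId: Integer): Currency; virtual;
      public
        function FormatTotal(AInvoiceId: Integer): string;
      end;

      // Testing subclass overrides the seam with a canned value, so
      // FormatTotal can be unit tested without a database.
      TTestableInvoicePrinter = class(TInvoicePrinter)
      protected
        function LoadInvoiceTotal(AInvoiceId: Integer): Currency; override;
      end;

    implementation

    uses
      SysUtils;

    function TInvoicePrinter.LoadInvoiceTotal(AInvoiceId: Integer): Currency;
    begin
      // The real class would query the database here (Bold, ADO, ...).
      Result := 0;
    end;

    function TInvoicePrinter.FormatTotal(AInvoiceId: Integer): string;
    begin
      Result := 'Total: ' + CurrToStr(LoadInvoiceTotal(AInvoiceId));
    end;

    function TTestableInvoicePrinter.LoadInvoiceTotal(AInvoiceId: Integer): Currency;
    begin
      Result := 123.45; // canned value for the test
    end;

    end.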

Chris Thornton

Your testing team is too small, IMO. I've worked in teams where the QA department outnumbered the developers. Consider working in "sprints" of manageable chunks (features, fixes) that fit into smaller cycles. "Agile" would encourage 2-week sprints, but that may be too tight. Anyway, it would keep QA constantly busy, working farther ahead of the release window. Right now, I suspect that they are idle until you give them a huge amount of code, and then they're swamped. With shorter release cycles, you could keep more testers busy.

Also, you didn't say much about their testing methodology. Do they have standard scripts that they run, where they verify appearance and behavior against expected appearance and behavior? Or do they just "pound on it and try to break it"?

IMO, DUnit testing is hard to do with lots of dependencies like databases, communication, etc. But it's doable. I've created DUnit classes that automatically run database setup scripts (look for a .sql file with the same name as the class being tested, run the SQL, then the test proceeds), and it's been very effective. For SOAP communications, I have a SoapUI mock service running that returns canned results, so I can test my communications.
It does take work, but it's worth it.
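
A sketch of that database-setup idea as a DUnit base class. The convention (a .sql file named after the test class, placed next to the test executable) follows the description above; RunSqlScript is left abstract because the actual database access layer (ADO, BDE, Bold persistence, ...) is an assumption:

    unit DbSetupTestCase;

    interface

    uses
      TestFramework, Classes, SysUtils;

    type
      // Base class: before each test, look for a .sql file named after the
      // concrete test class (e.g. TCustomerTests.sql) and run it against
      // the test database.
      TDbSetupTestCase = class(TTestCase)
      protected
        // Implemented once per project with whatever DB layer is in use.
        procedure RunSqlScript(const AScript: string); virtual; abstract;
        procedure SetUp; override;
      end;

    implementation

    procedure TDbSetupTestCase.SetUp;
    var
      ScriptFile: string;
      Script: TStringList;
    begin
      inherited;
      ScriptFile := ExtractFilePath(ParamStr(0)) + Self.ClassName + '.sql';
      if FileExists(ScriptFile) then
      begin
        Script := TStringList.Create;
        try
          Script.LoadFromFile(ScriptFile);
          RunSqlScript(Script.Text);
        finally
          Script.Free;
        end;
      end;
    end;

    end.

Concrete test cases inherit from TDbSetupTestCase, implement RunSqlScript once, and get their data loaded automatically before every test.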

DwB

The requirements for automated unit testing are exactly this:

  1. use a unit testing framework (for example, DUnit).
  2. use some kind of mocking framework.

Item 2 is the tough one.
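
On Delphi 2007 a full mocking framework is not strictly required to get started; a hand-rolled fake is often enough. Reusing the hypothetical IOrderRepository and TOrderService from the sketch earlier on this page, a DUnit test with such a fake might look like this:

    unit OrderServiceTests;

    interface

    uses
      TestFramework,
      OrderServiceExample; // the hypothetical unit sketched earlier

    type
      // Hand-rolled fake: implements the interface and returns canned data.
      TFakeOrderRepository = class(TInterfacedObject, IOrderRepository)
      public
        OpenOrders: Integer;
        function OpenOrderCount(const ACustomerId: string): Integer;
      end;

      TOrderServiceTests = class(TTestCase)
      published
        procedure TestRejectsCustomerWithTooManyOpenOrders;
      end;

    implementation

    function TFakeOrderRepository.OpenOrderCount(const ACustomerId: string): Integer;
    begin
      Result := OpenOrders;
    end;

    procedure TOrderServiceTests.TestRejectsCustomerWithTooManyOpenOrders;
    var
      Fake: TFakeOrderRepository;
      Service: TOrderService;
    begin
      Fake := TFakeOrderRepository.Create;
      Fake.OpenOrders := 10;
      // The service holds the fake through the interface reference and
      // releases it when the service is destroyed.
      Service := TOrderService.Create(Fake);
      try
        CheckFalse(Service.CanAcceptNewOrder('C-1'));
      finally
        Service.Free;
      end;
    end;

    initialization
      RegisterTest(TOrderServiceTests.Suite);

    end.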

DRY, small methods, starting with a test, and DI are all sugar. First you need to start unit testing. Add DRY, etc. as you go along. Reduced coupling makes code easier to unit test, but without a giant refactoring effort, you will never reduce coupling in your existing code base.

Consider writing tests for stuff that is new and stuff that is changed in the release. Over time you will build up a reasonable base of unit tests. New and changed stuff can also be refactored (or written nicely).

Also, consider an automated build process that runs unit tests and sends email when the build breaks.
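
DUnit ships with a console runner that fits that kind of build automation; the build server or a script would handle the e-mail part. A minimal sketch of a console test project, where OrderServiceTests is the hypothetical test unit from above:

    program UnitTests;

    {$APPTYPE CONSOLE}

    uses
      TextTestRunner,
      OrderServiceTests in 'OrderServiceTests.pas'; // hypothetical test unit

    begin
      // rxbHaltOnFailures makes the process exit with a non-zero error level
      // when a test fails, so the build script can flag the build as broken.
      TextTestRunner.RunRegisteredTests(rxbHaltOnFailures).Free;
    end.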

This only covers unit testing. For QA testers, you will need a tool (they exist, but I can't think of any) that allows them to run automated tests (which are not unit tests).