Testing DAO Methods in Java: Fake Implementations vs. In-Memory Databases


I’m currently working on a Java project using Java 17, Dropwizard, and JUnit 5, and I’m focusing on improving my unit tests and adopting Test-Driven Development (TDD) practices. My application interacts with a database through DAO interfaces, and I’m exploring the best ways to test these interactions, especially for methods that perform operations like inserting data into the database but don’t return a value.

Given the nature of these methods, I’m considering two main approaches for testing:

  1. Using Fake Implementations: Creating fake implementations of my DAO interfaces to simulate database operations in memory.
  2. Using In-Memory Databases: Utilizing an in-memory database like H2 to execute real database operations in a controlled environment.

I understand that fake implementations offer speed and simplicity by avoiding the overhead of setting up a real database connection, making them ideal for unit testing. On the other hand, in-memory databases provide a more realistic test environment, which seems beneficial for integration testing to ensure that my SQL queries and transactions behave as expected.
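For concreteness, the first option can be as small as a list behind the DAO interface. Here is a minimal sketch, assuming a hypothetical Item record and ItemDao interface (the names are illustrative, not from any real project):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

// Hypothetical domain type and DAO interface, for illustration only.
record Item(String id, String name) {}

interface ItemDao {
    void save(Item item);
    List<Item> findAll();
}

// Fake implementation: a plain list stands in for the database table,
// so tests run entirely in memory with no connection setup.
class FakeItemDao implements ItemDao {
    private final List<Item> items = new ArrayList<>();

    @Override
    public void save(Item item) {
        items.add(Objects.requireNonNull(item));
    }

    @Override
    public List<Item> findAll() {
        return List.copyOf(items); // defensive copy, like a fresh query result
    }
}
```

A unit test would then inject FakeItemDao wherever the real DAO implementation would normally go.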

My Questions:

  1. In the context of TDD and considering the balance between speed and realism in testing, which approach would you recommend for testing methods that don’t return a value?
  2. Are there specific scenarios or project phases where one approach is clearly preferable over the other?

I’m aiming for a testing strategy that not only ensures reliability and maintainability but also aligns with best practices for TDD. Any insights, experiences, or recommendations you could share would be greatly appreciated.

Thank you!


There are 2 answers

ndc85430 (Best Answer)

In the context of TDD and considering the balance between speed and realism in testing, which approach would you recommend for testing methods that don’t return a value?

Test behaviour, not methods. As you mentioned, the behaviour you want is that you can write an item successfully, and the way you test that behaviour is to try to read the item back. That might look something like:

@Test
public void it_can_save_an_item() {
    MyDao dao = new MyDao();
    Item item = new Item("foo", "bar");

    dao.save(item);

    // Verify the write through the read side of the same abstraction.
    List<Item> savedItems = dao.findAll();
    assertEquals(item, savedItems.get(0));
}

Personally, I don't use in-memory databases, for the reasons others have mentioned: chiefly, they don't behave exactly like the database you run in production.

The approach I'd tend to take is to use a fake and to integration test the real implementation, using contracts to ensure that both versions of the abstraction behave in the same way. See, for example, https://quii.gitbook.io/learn-go-with-tests/testing-fundamentals/working-without-mocks on fakes and contracts (examples are in Go, but the advice is independent of language really).
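A minimal sketch of the fake-plus-contract idea follows, using hypothetical Item/ItemDao types. In a real project each check would be a JUnit 5 @Test method in an abstract base class, with one concrete subclass per implementation (fake and real); plain methods and explicit AssertionErrors are used here only to keep the sketch free of test-framework dependencies:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical types, for illustration only.
record Item(String id, String name) {}

interface ItemDao {
    void save(Item item);
    List<Item> findAll();
}

// The contract: properties every ItemDao implementation must satisfy.
abstract class ItemDaoContract {
    protected abstract ItemDao newDao(); // each implementation supplies a fresh DAO

    void aFreshDaoIsEmpty() {
        if (!newDao().findAll().isEmpty()) {
            throw new AssertionError("expected a fresh DAO to contain no items");
        }
    }

    void aSavedItemCanBeReadBack() {
        ItemDao dao = newDao();
        Item item = new Item("1", "foo");
        dao.save(item);
        if (!dao.findAll().contains(item)) {
            throw new AssertionError("saved item was not found by findAll()");
        }
    }
}

// Binds the contract to the fake; an integration test would bind the very
// same contract to the real, database-backed implementation.
class FakeItemDaoContractTest extends ItemDaoContract {
    @Override
    protected ItemDao newDao() {
        return new FakeItemDao();
    }
}

// Minimal fake: a list standing in for the database table.
class FakeItemDao implements ItemDao {
    private final List<Item> items = new ArrayList<>();
    @Override public void save(Item item) { items.add(item); }
    @Override public List<Item> findAll() { return List.copyOf(items); }
}
```

Because both test classes extend the same contract, any behavioural drift between the fake and the real DAO shows up as a failing contract test.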

My perspective here comes from doing outside-in (or "London style") TDD, where you drive out the design of the system from an acceptance test. That is, you're figuring out what abstractions you need and how they should look, and fakes are quite a lightweight way to help there (but there are other reasons - see the link above).

I suppose if you're doing more inside-out ("Chicago style") TDD and you're unit-testing a component that needs the DAO, you still want the same kind of confidence: that you can test your component against a version of the DAO that behaves like the real thing.

Mark Seemann

Do both. Apply the lessons from the Test Pyramid. Most tests should run fast, which often works best when running entirely in memory. On the other hand, there are going to be implementation errors or bugs that such tests can't flush out, so enhance the unit tests with a smaller set of integration tests.

I'd typically recommend that integration tests use the database technology that is also going to be used in production. Thus, unless you also plan to use an in-memory database in production, don't use one for integration testing. If you use Oracle in production, run the integration tests against Oracle, etc.

Most database technologies enable you to fully automate the creation, configuration, and teardown of databases, so automate the database lifecycle as part of the standard four-phase test pattern. In other words, don't run integration tests against a shared 'test database'; make sure that each integration test runs against an isolated database that exists solely for that purpose. I usually create the database in the test's 'setup' phase and delete it again in the test's 'teardown' phase.
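The lifecycle described above can be sketched as follows. This is only an illustration of the four-phase shape: all names are hypothetical, and create()/drop() merely record what a real test would do; in practice they would issue CREATE DATABASE / DROP DATABASE over JDBC against the same database technology used in production.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the four-phase pattern (setup, exercise, verify, teardown)
// with a database created per test.
class PerTestDatabase {
    final List<String> log = new ArrayList<>(); // records the lifecycle for this sketch
    private final String name;

    PerTestDatabase(String testName) {
        // A unique name per test keeps concurrently running tests isolated.
        this.name = "testdb_" + testName + "_" + System.nanoTime();
    }

    void create() { log.add("CREATE " + name); } // placeholder for real DDL
    void drop()   { log.add("DROP " + name); }   // placeholder for real DDL
}

class SaveItemIntegrationTest {
    List<String> run() {
        PerTestDatabase db = new PerTestDatabase("it_can_save_an_item");
        db.create();                      // 1. setup: fresh, isolated database
        try {
            db.log.add("EXERCISE save");  // 2. exercise: call dao.save(item)
            db.log.add("VERIFY findAll"); // 3. verify: read the item back
        } finally {
            db.drop();                    // 4. teardown: always delete the database
        }
        return db.log;
    }
}
```

The try/finally (or, with JUnit 5, a teardown method annotated @AfterEach) ensures the database is deleted even when the verify phase fails.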

xUnit Test Patterns has much useful information about this, and about many other things related to unit testing.