Quickly copy big MySQL database for testing


When running our suite of acceptance tests we would like to execute every single test against a defined database state. If one of the tests writes to the database (e.g. creating users), it must of course not affect later tests.

We have thought about several options to achieve that, but copying the whole database before every single test does not seem like the best solution (thinking of possible performance issues).
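For reference, the straightforward copy we are trying to avoid would look roughly like this; the database names and the assumption that credentials live in `~/.my.cnf` are placeholders, not part of our actual setup:

```shell
# Sketch: clone the reference database into a throwaway copy
# before a test run. Assumes credentials are configured in
# ~/.my.cnf; "app_reference" and "app_test" are hypothetical names.
mysqladmin drop --force app_test 2>/dev/null || true
mysqladmin create app_test
mysqldump app_reference | mysql app_test
```

Piping `mysqldump` straight into `mysql` avoids an intermediate dump file, but the full dump-and-reload still scales with database size, which is exactly the performance concern above.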

One more idea was to use MySQL transactions, but some of our tests trigger many HTTP requests, so separate PHP processes are spawned; they would lose the transaction too early for a clean rollback once the full test is done.

Are there better ways to guarantee a defined database state for each of our acceptance tests? We would like to keep it simpler than system-level solutions like aufs or btrfs snapshots.

1 Answer

xarlymg89

You could approach this problem using PhpUnit.

It is used for automated testing with PHP. It is not the only library, but it is one of the most widely used.

You can use it for database testing as well ( https://phpunit.de/manual/current/en/database.html ). It lets you accomplish exactly what you are looking for: import the whole database once initially, then in each test suite load only what you need and restore the previous state afterwards. For example, you could temporarily save the current state of table A and, after all tests of the suite are done, simply restore it instead of reloading the whole database.
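The save-and-restore of a single table described above could be sketched at the SQL level like this; the database name `app_test` and table `A` are placeholders for whichever table your suite mutates:

```shell
# Sketch: snapshot one table before a test suite, restore it after.
# Assumes a reachable MySQL server with credentials in ~/.my.cnf;
# "app_test" and table "A" are hypothetical names.

# Before the suite: copy table A (structure + data) into a backup table.
mysql app_test -e "CREATE TABLE A_backup LIKE A;
                   INSERT INTO A_backup SELECT * FROM A;"

# ... run the test suite here ...

# After the suite: restore A from the backup and drop the backup.
mysql app_test -e "TRUNCATE TABLE A;
                   INSERT INTO A SELECT * FROM A_backup;
                   DROP TABLE A_backup;"
```

Because only the affected table is copied, this stays cheap even when the rest of the database is large, which is the point of the per-table approach over a full reload. Note that `CREATE TABLE ... LIKE` copies the structure and indexes but not foreign-key constraints, so tables with such constraints may need a dump-based snapshot instead.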

By the way, a minimal database containing only the data required for testing helps a lot as well. Then you don't have to deal with big performance issues, and you can simply restore it after each test suite.