Less durability = faster tests
Striving for faster automated test runs is a worthwhile goal: it shortens the test feedback loop and makes development more efficient and pleasurable.
One way of achieving faster tests is to reduce data durability when it is not needed. There are plenty of scenarios where that is the case:
If tests work with files, we can bypass file storage by keeping the files entirely in memory; they would typically be deleted afterward anyway.
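For instance, Python's standard io module provides file-like objects that live entirely in memory, so code that writes "a file" never touches the disk during the test. A minimal sketch (the write_report function is hypothetical, just for illustration):

```python
import csv
import io

def write_report(rows, buffer):
    """Write rows as CSV to any file-like object."""
    writer = csv.writer(buffer)
    writer.writerows(rows)

def test_write_report_in_memory():
    # io.StringIO behaves like an open text file but lives in memory,
    # so the test skips the slower filesystem entirely.
    buffer = io.StringIO()
    write_report([["id", "name"], ["1", "Ada"]], buffer)
    assert "Ada" in buffer.getvalue()
```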
If tests work with some form of cache, we can opt for a faster, less durable cache, or simply stub or mock the cache to bypass it entirely. Unless the cache itself is under test, there is no need to involve it or make it durable.
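A minimal sketch of that idea: an in-memory fake that mirrors the real cache's interface. The get/set interface and the get_user_profile function are assumptions made up for the example, not any particular library's API:

```python
class InMemoryCache:
    """A cache fake: same interface as the real cache, but backed by
    a plain dict with no durability, network I/O, or serialization."""

    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def set(self, key, value):
        self._data[key] = value

def get_user_profile(user_id, cache, load_from_db):
    """Return a cached profile, loading and caching it on a miss."""
    profile = cache.get(user_id)
    if profile is None:
        profile = load_from_db(user_id)
        cache.set(user_id, profile)
    return profile

def test_profile_is_loaded_only_once():
    cache = InMemoryCache()
    calls = []
    load = lambda user_id: calls.append(user_id) or {"id": user_id}
    get_user_profile(42, cache, load)
    get_user_profile(42, cache, load)
    assert calls == [42]  # the second call was served from the fake cache
```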
If tests need to work with a database, less durable database settings can speed up the data processing.
As an example, PostgreSQL has a number of non-durable settings that can be leveraged for test runs. These settings typically won't affect test correctness, since a throwaway test database never needs to survive a crash, but they will speed up any integration test that touches the database.
On the product I am working on right now, we have observed 25-33% faster test runs when such non-durable settings were used (-c shared_buffers=512MB -c fsync=off -c full_page_writes=off -c synchronous_commit=off). It makes a huge difference.
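As a sketch of how these flags might be wired into a test suite, a session-scoped fixture can start a disposable PostgreSQL container with durability turned off. This assumes the testcontainers-python package and pytest; the fixture name pg_url is hypothetical:

```python
import pytest
from testcontainers.postgres import PostgresContainer

@pytest.fixture(scope="session")
def pg_url():
    # A throwaway PostgreSQL with the non-durable settings above;
    # the official postgres image forwards "-c ..." arguments to the server.
    container = PostgresContainer("postgres:16").with_command(
        "-c shared_buffers=512MB "
        "-c fsync=off "
        "-c full_page_writes=off "
        "-c synchronous_commit=off"
    )
    with container:
        yield container.get_connection_url()
```

With fsync and synchronous_commit off, a crash could lose or corrupt data, which is exactly the trade-off that is acceptable for a database thrown away after each run.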
May your tests run fast,
Petr