Testing the file system with Jest?


I'm using mock-fs to try and test a Webpack plugin I wrote, which modifies a file on my file system.

Here's the test:

test('writes chunks to build/assets.json if no json file present', () => {
  mockFs({
    [buildDir]: {},
  });

  const stats = new Stats({
    assetsByChunkName: {
      main: 'main.somecrazyhash12341213445345.js',
    },
  });
  const compiler = new Compiler(stats);
  const plugin = new ChunksToJsonPlugin(config);

  expect(fs.existsSync(assetFilePath)).toBe(false);

  plugin.apply(compiler);
  compiler.execHandler();

  expect(fs.existsSync(assetFilePath)).toBe(true);
  expect(fs.readFileSync(assetFilePath, 'utf-8')).toEqual(
    JSON.stringify({
      main: 'main.somecrazyhash12341213445345.js',
    })
  );

  mockFs.restore();
});

It works beautifully when I run it by itself, but when I run it as part of a suite, other tests (that don't use mock-fs) break.

Screenshot of the failures: http://d.pr/i/z2ne+

I notice that mock-fs is in the stacktrace, which leads me to believe that the file system is being mocked in those tests too (which I don't want).

mock-fs states that:

The mock-fs@4 release will contain breaking changes. Instead of overriding all methods of the built-in fs module, the library now overrides process.binding('fs'). The purpose of this change is to avoid conflicts with other libraries that override fs methods (e.g. graceful-fs) and to make it possible to work with multiple Node releases without maintaining copied and slightly modified versions of Node's fs module.

I don't know enough about how process.binding works, especially as it relates to Jest running tests in parallel, but I feel like this is the core issue.
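To illustrate my suspicion (a sketch, not something mock-fs documents): because process.binding is patched for the whole process, a mock that never gets restored — say because an assertion throws before the mockFs.restore() call at the end of my test — would still be active when the next test in the same worker runs. Moving the restore into a hook at least guarantees it happens:

afterEach(() => {
  // restore the real fs after every test, even if an assertion above threw
  mockFs.restore();
});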

How can I make this work? Is there another way to test this behavior without using mock-fs?


There are 3 answers

neezer

Ok, so I can get this working with dependency injection (DI), ditching mock-fs in favor of memfs:

import memfs from 'memfs';

// ...

test('writes chunks to build/assets.json if no json file present', () => {
  const fs = new memfs.Volume();

  fs.mountSync(buildDir, {});

  // same as before

  const plugin = new ChunksToJsonPlugin(config, fs);
  // ------------------------------------------ ^^

  expect(fs.existsSync(assetFilePath)).toBe(false);

  // same as before

  expect(fs.existsSync(assetFilePath)).toBe(true);
  expect(fs.readFileSync(assetFilePath, 'utf-8')).toEqual(
    JSON.stringify({
      main: 'main.somecrazyhash12341213445345.js',
    })
  );
});

Correspondingly, my API for ChunksToJsonPlugin had to change as well, so that I'd pass in the actual fs module when running live:

import fs from 'fs';

// ...

new ChunksToJsonPlugin(config, fs)
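For illustration, here is roughly what the constructor change could look like inside the plugin (a sketch only — the real internals of ChunksToJsonPlugin aren't shown above, and the 'done'-hook handler and config.path below are assumptions):

import defaultFs from 'fs';

export default class ChunksToJsonPlugin {
  constructor(config, fsModule = defaultFs) {
    this.config = config;
    this.fs = fsModule; // injected memfs Volume in tests, real fs in production
  }

  apply(compiler) {
    // assumed: write the chunk map when compilation finishes
    compiler.plugin('done', (stats) => {
      const assets = stats.toJson().assetsByChunkName;
      // config.path is hypothetical — wherever build/assets.json should go
      this.fs.writeFileSync(this.config.path, JSON.stringify(assets));
    });
  }
}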

This works, and now my tests don't care about parallel/serial runs, but I feel like I might be bucking some Node.js conventions in the process. Generally I haven't seen much DI used alongside built-in module imports, so I worry a bit about adopting this pattern just for the sake of tests.

Still open to knowing whether or not this is possible with mock-fs, or if DI is actually the right approach here.

Anthony O'Neill

I've had a similar issue today, and it turned out that my setup and teardown methods were affecting other test suites because they were running in parallel.

To prevent this, try adding the following flag when running your test suites.

--runInBand

i.e.

jest --runInBand
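If you normally run your tests through an npm script, you can bake the flag into package.json instead (a sketch — adjust to however your scripts are set up):

{
  "scripts": {
    "test": "jest --runInBand"
  }
}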
Philippe Hebert

If you really do need to test against the filesystem and mocking the filesystem would reduce the value of your tests, then you could run each individual test against its own folder, as follows:

Step 1: Declare a base folder

const DIR_BASE = path.resolve(__dirname, '__fixtures__/mytestedmodule');

Step 2: Create a unique subfolder name for each test

it('should ...', async () => {
  const DIR_ID = md5('should ...');
  const DIR = path.resolve(DIR_BASE, `data${DIR_ID}`);
  await mytestedmodule(DIR);
  expect(...);
});
  • The md5 here creates a unique hash of the test name, so that each test gets its own folder and tests can run independently of each other (a minimal md5 helper is sketched below).
  • Note the data${DIR_ID}: the data prefix will be used in the next step as a pattern for post-test cleanup.
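The md5 helper itself isn't shown above; a minimal version using only Node's built-in crypto module could look like this (an assumption — the original may just as well use the md5 package from npm):

const crypto = require('crypto');

// Hash the test name so each test writes to its own data<hash> folder
const md5 = (str) => crypto.createHash('md5').update(str).digest('hex');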

Step 3: Clean any folders that haven't been cleaned already

// Note: fs here needs promise-based readdir/remove — fs-extra provides both
afterAll(async () => {
  const folders = (await fs.readdir(DIR_BASE))
    .filter((folder) => folder.match('data'));
  const promises = [];
  for (const folder of folders) {
    promises.push(fs.remove(path.resolve(DIR_BASE, folder)));
  }
  return Promise.all(promises);
});

This solution works as long as only a single Jest runner instance is operating on the folders. If you need to test your application multiple times in parallel, you would have to run the tests on copies of your repository. Bear in mind that on Windows there is still a limitation around reading the folders, so if multiple tests need to read the same folder, you'd most likely need a different source folder for each test.