Docusaurus: is writing to the filesystem during loadContent poor practice for plugins?


I'd like to populate my Docusaurus project with documentation using the "docs" plugin plus a custom (JavaScript) plugin that connects it to a headless CMS. Currently, I'm using the loadContent lifecycle method to call my headless CMS API, then using fs.writeFileSync to create physical markdown files in /docs and to overwrite the ./sidebars.js file, so that the "docs" plugin that comes with the classic preset works.

./my-plugin/index.js:

module.exports = function (context, options) {
  return {
    name: 'my-docusaurus-plugin',
    async loadContent() {
      // call the headless CMS API for documentation content
      let response = await fetchArticles('documentation');
      // add the markdown files for the 'docs' plugin using fs.writeFileSync
      await buildArticles(response);

      // fetch homepage and navigation sections from the CMS API
      let homepage = await fetchPages('homepage');
      let sidebarSection = await fetchPages('page');

      // overwrite ./sidebars.js with API navigation data using fs.writeFileSync
      await buildSidebar(homepage, sidebarSection);
    },
  };
};

This works, in that I get content from my CMS and the documentation renders, but it feels more like a workaround than an elegant way to connect a headless CMS to Docusaurus. Am I missing some best practices, or is there a better approach using other lifecycle events?

1 answer

Answered by Cole:

I ran into this same problem the other day when building my own Docusaurus plugin, and I was also unable to find a definitive answer in the documentation. It seems the intended way to implement a plugin is to:

  1. Return content from the loadContent function.
  2. Receive that returned content in the contentLoaded function, and use it there to create data to provide to routes. This is also where you'd create the routes that display the data.

However, this model doesn't seem to work for the purpose of retrieving data to be displayed in Docs. The @docusaurus/plugin-content-docs plugin doesn't seem to provide any way for another plugin to give it data to include in Docs, so it seems the only way is to write Markdown files to the /docs directory.

However, as mentioned in Gabin's comment, this results in an infinite loop. After some investigation, I'm able to explain why. The problem is that anytime a file is updated in a directory that's watched by any plugin, a reload is triggered and the loadContent function is called on all plugins. So if a custom plugin is configured to write to /docs and a reload gets triggered, the custom plugin's loadContent function is called and rewrites the files to /docs. But since /docs is watched by @docusaurus/plugin-content-docs, each of the files the custom plugin updates in /docs triggers another reload, which causes the custom plugin's loadContent function to get called again... which results in an infinite loop.

I did manage to find a workaround to this issue, which is to avoid writing to /docs if there are no mismatches between the expected and actual content, thus preventing another reload. There are three different scenarios (which are not mutually exclusive) where the plugin would need to make a change to /docs:

  1. The content of an existing file in /docs needs to change.
  2. A new file needs to be added to /docs.
  3. An old file needs to be deleted from /docs.

For scenario 1 above, I had to write some code to check if the file I'm about to write to /docs already exists there, and to check if it already contains the same content I'm about to write. If those conditions are both true, then I skip writing the file, since there's nothing to change.

For scenario 3 above, I had to move my "cleanup" code to the end of my loadContent function. Because of that sequence change, I also had to modify the cleanup so that instead of deleting all files from the destination directory, it only deletes any "extra" files that aren't expected to be there.

Even with these changes, there will still be at least one unnecessary call to the custom plugin's loadContent function whenever it needs to write to /docs, because the write to /docs will trigger a reload, which calls loadContent again. But then it will find there are no changes to make, and it will avoid writing to /docs again and triggering another reload. Which is much better than an infinite loop, at least!

I think Docusaurus could fix this issue in one of the following ways:

  1. Create a way for a plugin to tell Docusaurus to only call its loadContent function on reload if the reload was triggered by a change to a directory returned by the plugin's getPathsToWatch function.
  2. Create a way to provide data to official plugins like @docusaurus/plugin-content-docs. It looks like a "middleware" feature has been proposed in this GitHub Issue, which I think would be a good solution.