Does the size of the node_modules folder matter during cold starts for cloud functions?


I know that you should only import the required modules in the index file's global scope to reduce cold start time.

But I haven't yet found out whether the size of the node_modules folder (or the length of the dependencies property in the package.json file) matters for cold starts in the case of cloud functions, or whether only the imported modules count.
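To make the question concrete, here is a rough sketch of what I mean (the module names are just examples):

    // package.json declares several dependencies, e.g.
    // "dependencies": { "lodash": "...", "moment": "...", "sharp": "..." }

    // ...but index.js only requires one of them in its global scope:
    const _ = require('lodash');

    exports.helloWorld = (req, res) => {
      res.send(_.capitalize('hello world'));
    };

    // Do the unused packages sitting in node_modules still slow down the cold start?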


There are 2 answers

guillaume blaquiere (BEST ANSWER)

The size doesn't matter! When you publish your code, a container is built (with buildpacks). This container is cached, so when the function starts, the container doesn't have to be downloaded.

Therefore, the size doesn't impact the startup time (because there is no download).

HOWEVER, a large node_modules folder (and more generally, in any language, a large number of dependencies) can mean a lot of things initialized at startup (services, connections to databases, ...). If that initialization is eager, it increases the startup duration.

Prefer lazy initialization of your components, even the global ones. Also prefer light frameworks (or no framework at all) over a Big Bertha!!
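For example, lazy initialization can look like this (the Firestore client is only an illustration, swap in whatever heavy component you use):

    // Eager version: the client is created at module load, on every cold start.
    // const {Firestore} = require('@google-cloud/firestore');
    // const db = new Firestore();

    // Lazy version: create the client on first use only, then reuse it.
    let db;
    function getDb() {
      if (!db) {
        const {Firestore} = require('@google-cloud/firestore');
        db = new Firestore();
      }
      return db;
    }

    exports.handler = async (req, res) => {
      const snapshot = await getDb().collection('items').limit(1).get();
      res.json(snapshot.docs.map(doc => doc.data()));
    };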

EDIT

There is no cache policy management for the function versions. When a version is built and active, the image is cached, that's all. When a new version is built, the previous one is evicted after about a day. You can't manage this, it's serverless!

To insist on the fact that the size doesn't matter, I found 2 links about Cloud Run. Yes, Cloud Run!! In fact, Cloud Run and Cloud Functions share the same backend and the same behavior (your function is packaged in a container (as explained before) and served with the same logic as Cloud Run containers).

So, here are the official Google documentation and an unofficial FAQ maintained by a guru of Cloud Run (Ahmet, Developer Advocate on Cloud Run @Google).

In the end, the size doesn't matter, but I found that the language matters!! In this article, also written by a Googler, there is an explanation of the Node.js startup behavior:

When a module boots, node.js resolves all require() calls from the entry point using synchronous I/O, which can take a significant amount of time if the number of dependencies are large, or if the content itself requires a lot of linking.

So, it's more a language problem, with optimizations to perform (lazy loading?), than a platform issue. The article provides many ways to optimize the code!
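For example (sharp here is just a stand-in for any heavy dependency), the require() call can be moved into the only code path that needs it:

    // Top-level require: resolved on every cold start, even for requests
    // that never touch image processing.
    // const sharp = require('sharp');

    exports.handler = async (req, res) => {
      if (req.query.thumbnail) {
        // Resolved only on the first request that needs it;
        // Node.js caches the module for the following calls.
        const sharp = require('sharp');
        const thumbnail = await sharp(req.rawBody).resize(128, 128).toBuffer();
        res.type('image/png').send(thumbnail);
        return;
      }
      res.send('no image processing needed');
    };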

chaiyachaiya

Under the hood, when you deploy a cloud function, Google creates an image from your source code stored in GCS. This image embeds all the runtime libraries your source code needs to run. Each time a new instance of your function is needed, Google starts a container from this image.

Now, container "physics" directly relates the startup time to the size of your image. The thinner your image is, the quicker the startup. So yes, it kind of matters :)
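As a practical illustration, a .gcloudignore file keeps local-only files out of the uploaded source, and therefore out of the built image (the entries below are just examples):

    # .gcloudignore -- anything listed here is not uploaded when you deploy
    .git
    .gitignore
    node_modules/
    test/
    docs/
    *.md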

More info on the official GCP page.