How to use the Node.js cluster module to ensure more cores are used for processing

I have a Node.js app that simply copies data from an SD card to a network location. I am running this on a Raspberry Pi 4, and depending on how the workload is spread across the CPU cores, the process can either saturate the network connection and run at around 115 MB/s, or become CPU bound and run at only around 65 MB/s.

In my tests I run two separate copy operations at the same time, each on its own Node.js cluster worker:

{"context":"CopyService (worker:2)","level":"info","message":"Copying /media/usb0 -> /mnt/videos/2022/May/2022_05_31/Cam1"}
{"context":"CopyService (worker:1)","level":"info","message":"Copying /media/usb1 -> /mnt/videos/2022/May/2022_05_31/Cam2"}

Despite them running on separate workers, the first CPU core is often at 100% whilst the others are doing very little:

[htop screenshot]

There are other examples where core 1 is at 100% and the others are all at 30%, but I was unable to get a screenshot.

Given that I have two workers in play here, I would expect two cores to be at 100% and the rest to be under very little load, but I suspect that workers are not pinned to specific cores.
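As far as I know, nothing in my code pins a worker to a core: cluster.fork() just spawns ordinary child processes, and the Linux scheduler decides where they run. If pinning turns out to be the fix, I assume something like the following would work. This is an untested sketch that relies on the taskset utility (part of util-linux, which ships with Raspberry Pi OS); the worker count of 4 is just an example:

const cluster = require('node:cluster');
const { cpus } = require('node:os');
const { execFile } = require('node:child_process');

if (cluster.isPrimary) {
    const workerCount = Math.min(cpus().length, 4);
    for (let core = 0; core < workerCount; core++) {
        const worker = cluster.fork();
        // `taskset -cp <core> <pid>` restricts an already-running process to
        // the given core; threads the worker spawns later inherit the affinity
        execFile('taskset', ['-cp', String(core), String(worker.process.pid)], (err) => {
            if (err) console.error(`could not pin worker ${worker.id} to core ${core}`, err);
        });
    }
}

Pinning would at least make the htop picture unambiguous, since each worker's load should then show up on its own core.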

How do I distribute the load from my Node.js programme to make sure that the copy process does not become CPU bound?

My cluster is created like so:

const cluster = require('node:cluster');
const { cpus } = require('node:os');

// clusterLimit and logger are defined elsewhere in the app
if (cluster.isPrimary && !isNaN(clusterLimit) && clusterLimit > 1) {
    const nodeCount = Math.min(cpus().length, clusterLimit);
    logger.log(`Primary node starting ${nodeCount} workers...`);

    for (let index = 0; index < nodeCount; index++) {
        cluster.fork();
    }

    // the primary only forks; workers fall through to the copy logic
    return;
}

and I am creating 4 workers.
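
For context, the copy each worker performs boils down to streaming a file from the card to the network mount, something like this simplified sketch (not the exact CopyService code; the paths and chunk size are examples only):

const fs = require('node:fs');
const { pipeline } = require('node:stream/promises');

async function copy(src, dest) {
    // streaming keeps memory use flat; the underlying reads and writes run on
    // this worker's libuv threadpool (UV_THREADPOOL_SIZE, default 4),
    // not on the main JavaScript thread
    await pipeline(
        fs.createReadStream(src, { highWaterMark: 1024 * 1024 }), // 1 MiB chunks
        fs.createWriteStream(dest)
    );
}

copy('/media/usb0/example.mp4', '/mnt/videos/example.mp4').catch(console.error);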
