I'm implementing a web server in Node.js that must serve a lot of concurrent requests. Since Node.js processes requests one by one, it keeps them in an internal queue (in libuv, I guess).
I also want to run my web server using the cluster module, so there will be one request queue per worker.
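To make the setup concrete, here is roughly what I have in mind (just a sketch; the port and the number of workers are placeholders):

```js
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // Fork one worker per CPU core.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }

  // When a worker dies, whatever requests were queued in it are lost with it.
  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} died`);
    cluster.fork(); // replace the dead worker
  });
} else {
  // Each worker has its own event loop and its own backlog of requests.
  http.createServer((req, res) => {
    res.end(`handled by ${process.pid}`);
  }).listen(3000);
}
```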
Questions:
- If any worker dies, how can I retrieve its queued requests?
- How can I put the retrieved requests into other workers' queues?
- Is there any API to access the request queues of the workers that are still alive?
By the third question I mean that I want to keep queued requests somewhere such as Redis (if possible), so that in case of a server crash, failure, or even a hardware restart I can retrieve them.
Since you mentioned in the tags that you're already using (or want to use) Redis, you can use a Redis-based queue manager to do all of this work for you.
Check out https://github.com/OptimalBits/bull (or its alternatives).
`bull` has a concept of a queue: you add jobs to the queue and listen to the same queue from different processes/VMs. `bull` will send each job to only one listener, and you can control how many jobs each listener processes at the same time (the concurrency level). In addition, if one of the jobs fails to run (in other words, the listener of the queue threw an error), `bull` will try to give the same job to a different listener.
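A rough sketch of how that could look with `bull` (the queue name, Redis URL, concurrency of 5, and 3 attempts below are just placeholders):

```js
const Queue = require('bull');

// Producers and consumers all connect to the same Redis-backed queue.
const requestQueue = new Queue('requests', 'redis://127.0.0.1:6379');

// Producer side (e.g. your HTTP handler): persist the work as a job.
// `attempts` tells bull to retry the job if the handler throws.
requestQueue.add({ url: '/some/path', payload: {} }, { attempts: 3 });

// Consumer side (run in each worker process): bull hands every job to
// exactly one listener, at most 5 jobs per listener at a time.
requestQueue.process(5, async (job) => {
  // ...do the actual work for job.data here...
});
```

Because the jobs live in Redis rather than in each worker's memory, they survive a worker crash or a process restart, as long as Redis itself stays up (or is configured to persist to disk).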