Here's a gist of what I'm doing: https://gist.github.com/MattCollins84/75f9ebd422ed6d1d5c91
As part of some process I generate a bash script containing a bunch of curl commands (around 20k of them), and I want to run this script via Node.
I am using spawn to do this, and it works fine, except that after 70 or so commands it just stops. The read stream created by spawn stops outputting any data. There are no errors or anything else as far as I can see.
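Roughly what the Node side looks like (heavily simplified; the real thing is in the gist above, so treat the file name and handlers here as placeholders):

    const { spawn } = require('child_process');

    // generated-commands.sh is the generated file full of curl commands
    const child = spawn('bash', ['./generated-commands.sh']);

    // This is the read stream that stops producing data after ~70 commands
    child.stdout.on('data', (chunk) => {
      process.stdout.write(chunk);
    });

    child.on('close', (code) => {
      console.log('finished with exit code', code);
    });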
If I do "ps x | grep curl" to see what is happening, I can see that the process ID changes at first, but then it just seems to halt at a certain point and never start again. The process just hangs. Manually killing this process doesn't let the next one begin. The process that relates to my bash script is still present as well, and killing that makes no difference either.
Observations and things I've ruled out:
- The process is using minimal resources
- Running the generated bash script directly in the terminal works fine
- It doesn't seem to matter which URL I am curling (i.e. it's not my application)
I feel like there is something daft I am missing, but I don't know what to Google to figure it out!
I was just hoping to run this file as if I were at the terminal, but it appears Node places some kind of restriction on it to stop it running out of control, or something.
Any ideas?! Thanks
Although I am not familiar with Node's spawn function, I am familiar with Unix pipes. It sounds like in your first scenario, the program you ran produced output but your program did not read that output. When the pipe's buffer is full, the program you executed will block trying to write to it. It will be unblocked when your program reads from the pipe.
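To illustrate in Node terms (going only by the documented child_process API, since I haven't used it myself): if you keep the pipes, you need to attach readers to both of them so neither buffer can fill up. Something like:

    const { spawn } = require('child_process');

    const child = spawn('bash', ['./generated-commands.sh']);

    // Drain stdout so the child never blocks writing to it
    child.stdout.on('data', (chunk) => process.stdout.write(chunk));

    // curl writes its progress output to stderr, so this pipe will also
    // fill up and block the child if nothing reads it
    child.stderr.on('data', (chunk) => process.stderr.write(chunk));

    child.on('close', (code) => console.log('exit code', code));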
The solution you found (stdio: 'inherit') probably tells the function to run the child process with the same stdout, stderr, and stdin streams as the parent. Thus your program doesn't need to read from a pipe, because there isn't one; the program you executed writes directly to the terminal, which reads the output, and so it doesn't block.
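A minimal sketch of that option (script name is just a placeholder):

    const { spawn } = require('child_process');

    // With stdio: 'inherit' there are no pipes at all -- the child shares the
    // parent's stdin/stdout/stderr, so nothing can back up and block it
    const child = spawn('bash', ['./generated-commands.sh'], { stdio: 'inherit' });

    child.on('close', (code) => console.log('exit code', code));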