Reading stderr/stdout during subprocess execution


I'm trying to run a command in Python using the subprocess module and read its stderr during execution.

As an example, I have the following Python file (teststuff.py):

import time

if __name__ == "__main__":
    time.sleep(1)
    print("testing stdout")
    time.sleep(10)
    print("another msg")

What I want is to read each printed string before the program finishes executing. My current code to do this is as follows:

import subprocess

procId = subprocess.Popen('python3 teststuff.py', shell=True, close_fds=True,
                          stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                          stderr=subprocess.STDOUT)

line = procId.stdout.read()
print(line)

This works, but the output is not received after the first one-second sleep; instead, both printed messages arrive at once after the entire program has finished. Is there a non-blocking way to read the stdout/stderr streams?
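For illustration, reading the stream line by line instead of all at once (same Popen call as above) seems to behave the same way; nothing is printed until the script exits:

import subprocess

procId = subprocess.Popen('python3 teststuff.py', shell=True, close_fds=True,
                          stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                          stderr=subprocess.STDOUT)

# Reading line by line still blocks: nothing arrives until the child flushes or exits.
for line in procId.stdout:
    print(line)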

Apart from the code above, I have already tried other methods such as .communicate(), but every attempt runs into the same problem. It seems like the pipes used to communicate with the subprocess are not closed until execution has finished.

There is 1 answer

Answer by Kurtis Rader:

Most programs fully buffer their stdout stream by default when it is connected to a file or pipe; they line-buffer stdout only when it is attached to a tty. You can see this without using Python's subprocess module. Open two shell sessions. In the first one, run:

mkfifo p
cat < p

In the second, run the program that generates the output:

python3 -c 'import time; print("line 1"); time.sleep(5); print("line 2")' >p

Repeat the experiment, this time forcing the Python process to unbuffer its stdout with the -u flag:

python3 -u -c 'import time; print("line 1"); time.sleep(5); print("line 2")' >p
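
Applied to your code, a minimal sketch along the same lines (assuming teststuff.py is in the current directory) is to pass -u to the child and read its output line by line:

import subprocess

# -u turns off the child's stdout buffering, so each print() reaches the pipe immediately.
procId = subprocess.Popen(['python3', '-u', 'teststuff.py'],
                          stdout=subprocess.PIPE,
                          stderr=subprocess.STDOUT,
                          text=True)

# Iterating over the pipe yields each line as soon as the child writes it.
for line in procId.stdout:
    print(line, end='')

procId.wait()

Setting PYTHONUNBUFFERED=1 in the child's environment has the same effect; for non-Python programs you can often get line buffering with stdbuf -oL.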