I am using a Linux Docker container, and the code is managed by Python. I use Python to build a subprocess.run call that runs a compiled MATLAB program. This compiled MATLAB program calls another external program that needs to read an environment variable.
import os
import subprocess as sp

my_env = os.environ.copy()
cmd = f'<abs path to my compiled program>/<compiled program> "{path to a file it needs to be read}"'
log.info(f"Calling the matlab function with: \n {cmd}")
res = sp.run(cmd, shell=True, capture_output=True, text=True, env=my_env)
This is not working (I am using Python 3.9.15).
If I run the MATLAB program from the command line, without Python's subprocess, the environment variables are passed and the program works fine.
If I run the same command from Python, the environment variables are not passed.
Inside Python, all the correct environment variables are present: os.getenv() returns the proper values, and my_env contains them as well.
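A quick check like the following illustrates what I mean (MY_VAR is only a stand-in for the variable my external program reads, not its real name): the value is visible in the parent process and in my_env, and the question is whether the shell spawned by subprocess sees it too.

import os
import subprocess as sp

my_env = os.environ.copy()

# Value as seen by the parent Python process
print("parent:", my_env.get("MY_VAR"))

# Value as seen by the shell that subprocess spawns when env=my_env is passed
res = sp.run('echo "child: $MY_VAR"', shell=True, capture_output=True, text=True, env=my_env)
print(res.stdout.strip())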
I am using a workaround that is working so far:
cmd = (
    f'<abs path to my compiled program>/<compiled program> '
    f'"{path to a file it needs to be read}" >> log_file.txt'
)
res = os.system(cmd)
if res != 0:
    die('show error message')
if os.path.isfile('log_file.txt'):
    with open('log_file.txt', 'r') as fh:
        print(fh.read())
But I don't think this is ideal. I would rather keep using subprocess and capture stdout and stderr directly.
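To make the goal concrete, this is roughly the shape I am trying to keep (reusing cmd, my_env and log from above), instead of redirecting output to a log file:

res = sp.run(cmd, shell=True, capture_output=True, text=True, env=my_env)
log.info(res.stdout)
if res.returncode != 0:
    # handle the failure here instead of die()
    log.error(res.stderr)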
Any suggestions?