Context:

- A Python application server that uses a concurrent.futures.process.ProcessPool to execute code
- We sometimes want to hot reload imported code without restarting the entire server process (yes, I know importlib.reload has caveats)

To get this to work, I imagine I would have to execute the importlib.reload in every multiprocessing process that is managed by the process pool.
Is there a way to submit something to all processes in a process pool?
I don't know how this will play out with the hot reloading attempt you mentioned, but the general question you really asked is answerable.
The challenge here lies in ensuring that really all processes get this something once and only once, and that no further execution takes place until every process has received it. You can get this type of necessary synchronization with the help of a multiprocessing.Barrier(parties[, action[, timeout]]). The barrier will hold back parties calling barrier.wait() until every party has done so and then release them all at once.
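A minimal sketch of this approach, assuming a hypothetical module mymodule that should be reloaded and a pool of four workers: the barrier is created in the parent, handed to every worker through the pool's initializer, and exactly one reload task is submitted per worker.

```python
import importlib
import multiprocessing
from concurrent.futures import ProcessPoolExecutor, wait

import mymodule  # hypothetical module whose code we want to hot reload

NUM_WORKERS = 4

_barrier = None  # set in every worker by the initializer


def _init_worker(barrier):
    # Runs once per worker process and stores the shared barrier in a global.
    global _barrier
    _barrier = barrier


def _reload_task():
    # Block until NUM_WORKERS tasks are waiting. A worker blocked in wait()
    # cannot pick up a second task, so each task lands on a distinct worker
    # and every process reloads exactly once.
    _barrier.wait()
    importlib.reload(mymodule)
    return multiprocessing.current_process().name


if __name__ == "__main__":
    barrier = multiprocessing.Barrier(NUM_WORKERS)
    with ProcessPoolExecutor(max_workers=NUM_WORKERS,
                             initializer=_init_worker,
                             initargs=(barrier,)) as pool:
        futures = [pool.submit(_reload_task) for _ in range(NUM_WORKERS)]
        wait(futures)
        print(sorted(f.result() for f in futures))  # one name per worker
```

Passing a timeout to the barrier is a reasonable safeguard: if the pool ever ends up with fewer workers than parties, wait() would otherwise block forever.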
If you are okay with keeping barrier a global and multiprocessing.get_context()._name returns "fork", you don't need to use the initializer, because globals will be inherited and accessible through forking.
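A sketch of that fork-only variant (again with the hypothetical mymodule and a worker count of four): the module-level barrier is simply inherited by the forked workers.

```python
import importlib
import multiprocessing
from concurrent.futures import ProcessPoolExecutor, wait

import mymodule  # hypothetical module to hot reload

NUM_WORKERS = 4

# With the "fork" start method, worker processes inherit this module-level
# barrier, so there is no need to pass it through an initializer.
barrier = multiprocessing.Barrier(NUM_WORKERS)


def _reload_task():
    barrier.wait()  # rendezvous so each worker runs exactly one reload
    importlib.reload(mymodule)


if __name__ == "__main__":
    assert multiprocessing.get_context()._name == "fork"  # e.g. the Linux default
    with ProcessPoolExecutor(max_workers=NUM_WORKERS) as pool:
        wait([pool.submit(_reload_task) for _ in range(NUM_WORKERS)])
```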