I need to run a bunch of parallel processes, but cannot use the standard multiprocessing package since its serialization with pickle does not work for more complex objects. Therefore I'm currently using pathos.multiprocessing which uses dill for the serialization and it works flawlessly in that regard.
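For example, pathos copes fine with callables that the standard Pool refuses to pickle (a lambda here just as a stand-in for my more complex objects):

    from pathos.multiprocessing import ProcessPool

    if __name__ == '__main__':
        # dill serializes the lambda without issue,
        # while multiprocessing.Pool would raise a PicklingError
        pool = ProcessPool(nodes=4)
        print(pool.map(lambda x: x * x, range(10)))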
However, I would like to have an exit condition, so that all processes get terminated once the result of one process meets a certain condition (I'm computing objective values for an optimization problem and I want all processes to stop as soon as one process returns a result that is worse than any of the previous results).
For the standard multiprocessing package I found the solution below (taken from https://stackoverflow.com/a/21491438/15799363). Can I do something similar with pathos.multiprocessing? I couldn't figure out how to pass a callback function to processes with pathos.
    from random import random
    from multiprocessing import Pool
    from time import sleep


    def add_something(i):
        # Sleep to simulate the long calculation
        sleep(random() * 30)
        return i + 1


    def run_my_process():
        # Create a process pool
        pool = Pool(100)

        # Callback function that checks results and kills the pool
        def check_result(result):
            print(result)
            if result == 90:
                pool.terminate()

        # Start up all of the processes
        for i in range(100):
            pool.apply_async(add_something, args=[i], callback=check_result)

        pool.close()
        pool.join()


    if __name__ == '__main__':
        run_my_process()
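The closest workaround I can think of with pathos is to submit the jobs with apipe (which, as far as I can tell, does not take a callback argument) and then poll the async results myself, terminating the pool when the condition is hit. A rough sketch of that idea, assuming I'm using apipe and terminate correctly; I'd still prefer a proper callback:

    from random import random
    from time import sleep

    from pathos.multiprocessing import ProcessPool


    def add_something(i):
        # Sleep to simulate the long calculation
        sleep(random() * 30)
        return i + 1


    def run_my_process():
        pool = ProcessPool(nodes=100)

        # Submit all jobs asynchronously; apipe returns result handles
        # with ready() and get()
        pending = {pool.apipe(add_something, i) for i in range(100)}

        done = False
        while pending and not done:
            for handle in list(pending):
                if handle.ready():
                    pending.discard(handle)
                    result = handle.get()
                    print(result)
                    if result == 90:
                        pool.terminate()  # stop all remaining workers
                        done = True
                        break
            sleep(0.1)  # avoid busy-waiting between polls

        if not done:
            pool.close()
            pool.join()


    if __name__ == '__main__':
        run_my_process()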