I am using pathos.multiprocessing in Python 2, but I think the question applies equally to the standard multiprocessing module. My code looks like the following:
results = pool.map(func, list_of_args, chunksize=1)
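For context, here is a minimal sketch of my setup. func and list_of_args are placeholders standing in for the real workload, and I am assuming ProcessingPool as the pool class from pathos, mirroring the map call above:

from pathos.multiprocessing import ProcessingPool

def func(x):
    # Stand-in for the real work; just squares the argument.
    return x * x

list_of_args = range(10)

pool = ProcessingPool(nodes=4)
# chunksize=1 so that each argument is handed out as its own task.
results = pool.map(func, list_of_args, chunksize=1)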
I have read that pool.map returns results in the same order as the input arguments, but that the order of computation is arbitrary (source: Python 3: does Pool keep the original order of data passed to map?).
However, I would like to ensure that the order of computation is not arbitrary and that it matches the order in which the arguments were presented. Something like:
results = pool.map(func, list_of_args, chunksize=1, compute_in_given_order=True)
To be clear, my question is not about the order in which the jobs finish, but about the order in which they start: I would like to ensure that the job for argument 3 in the list begins before the job for argument 4.
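To make that concrete, here is a hedged sketch of a diagnostic version of func that logs when each task actually begins on a worker; the printing and sleep are purely illustrative and not part of my real workload:

from __future__ import print_function
import os
import time

def func(x):
    # Record when this task actually begins on a worker process,
    # so the dispatch order can be compared with the argument order.
    print("arg %r started at %.4f in pid %d" % (x, time.time(), os.getpid()))
    time.sleep(1)  # stand-in for real work
    return x

If the computation order were guaranteed, the printed start times would always come out in the same order as list_of_args.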
Is this possible? If not, why not?