Running multiprocessing on two different functions in Python 2.7


I have two functions that I want to run with multiprocessing: makeFakeTransactions and elasticIndexing. Each call to makeFakeTransactions returns a list of dictionaries, and each of those lists ends up in the async_results list, so async_results is essentially a list of lists. I want to use this list of lists as input to elasticIndexing, but I have to wait for the first batch of p.apply_async calls to finish before I can use it. How do I ensure that the first batch of multiprocessing is finished before I initiate the next one?

Also, when I run the program as is, it skips the second set of p.apply_async calls and just terminates. Do I have to declare a separate multiprocessing.Pool to do another multiprocessing operation?

import multiprocessing

store_num = 1          # note: shadowed by the loop variable in the comprehension below
process_number = 6
num_transactions = 10

p = multiprocessing.Pool(process_number)

# first batch: fake-transaction generation, one task per store_num
async_results = [p.apply_async(makeFakeTransactions, args=(store_num, num_transactions))
                 for store_num in xrange(1, 10, 5)]

# .get() blocks, so this only completes once the first batch has finished
results = [ar.get() for ar in async_results]

# second batch: index each list of transactions
async_results = [p.apply_async(elasticIndexing, args=(result_list,))
                 for result_list in results]
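
For reference, one way to guarantee the ordering is to block on each AsyncResult with .get() before scheduling the second round; the same Pool can be reused for both batches, so no second Pool variable is needed. Below is a minimal, runnable sketch of that pattern, with stub versions of makeFakeTransactions and elasticIndexing standing in for the real functions (and an if __name__ == '__main__' guard, which multiprocessing requires on Windows):

import multiprocessing

def makeFakeTransactions(store_num, num_transactions):
    # Stub standing in for the real transaction generator.
    return [{'store': store_num, 'txn': i} for i in xrange(num_transactions)]

def elasticIndexing(result_list):
    # Stub standing in for the real indexing step.
    return len(result_list)

if __name__ == '__main__':
    process_number = 6
    num_transactions = 10

    p = multiprocessing.Pool(process_number)

    # First batch: .get() blocks, so `results` is only built once every
    # makeFakeTransactions task has returned.
    async_results = [p.apply_async(makeFakeTransactions,
                                   args=(store_num, num_transactions))
                     for store_num in xrange(1, 10, 5)]
    results = [ar.get() for ar in async_results]

    # Second batch on the same pool; block on these results too, otherwise
    # the main process can exit before the workers ever run the tasks.
    index_results = [p.apply_async(elasticIndexing, args=(result_list,))
                     for result_list in results]
    indexed = [ar.get() for ar in index_results]

    p.close()   # no further tasks will be submitted
    p.join()    # wait for the worker processes to exit
    print indexed

Blocking on the second batch's results (or calling close() and join() at the end) keeps the main process alive until the workers finish, which is why this version does not "skip" the second round.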

EDIT:

I tried calling p.join() after building async_results, but it gives this error:

Traceback (most recent call last):
  File "C:\Users\workspace\Proj\test.py", line 210, in <module>
    p.join()
  File "C:\Python27\lib\multiprocessing\pool.py", line 460, in join
    assert self._state in (CLOSE, TERMINATE)
AssertionError
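
For what it's worth, that AssertionError is raised because Pool.join() may only be called once the pool is in the CLOSE or TERMINATE state, i.e. after close() or terminate(); that is exactly what the assert in pool.py checks. A minimal sketch of the legal order, using a hypothetical work function (note that a closed pool cannot accept the second batch of tasks, which is why blocking on .get(), as above, is the simpler fix in this case):

import multiprocessing

def work(x):
    # Hypothetical task standing in for makeFakeTransactions.
    return x * x

if __name__ == '__main__':
    p = multiprocessing.Pool(6)
    async_results = [p.apply_async(work, args=(i,)) for i in xrange(4)]

    p.close()   # puts the pool in the CLOSE state: no new tasks allowed
    p.join()    # now legal, and waits for the worker processes to exit

    results = [ar.get() for ar in async_results]  # all tasks done by now
    print results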
