p.join() is not waiting and processes go to zombie state in multiprocessing


I have a script where I create three different processes like this:

import multiprocessing

def parallel_process():

    report = multiprocessing.Process(target = get_status)
    update_db_status = multiprocessing.Process(target = update_db)
    add_entries_to_db = multiprocessing.Process(target = add_entries_to_db)
    p1 = add_entries_to_db.start()
    p2 = update_db_status.start()
    p3 = report.start()
    
    for p in [p1,p2,p3]:
        p.join()

In this case the add_entries_to_db function takes the longest and has to do a lot of processing to add the entries to the DB. So the other processes, which are waiting for the DB, go into a zombie state (<defunct>).

How do I prevent this? Ideally, the other processes should wait for p1 to complete its job.
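For what it's worth, one likely cause: `Process.start()` returns `None`, so `p1 = add_entries_to_db.start()` stores `None` and the later `p.join()` calls never actually reap the children. A minimal sketch of a fix, joining the `Process` objects themselves and joining the slow DB-insert process first (the `get_status`, `update_db`, and `add_entries` bodies here are stand-ins for the original functions):

```python
import multiprocessing
import time

def get_status():
    time.sleep(0.1)  # stand-in for the real status report

def update_db():
    time.sleep(0.1)  # stand-in for the real DB update

def add_entries():
    time.sleep(0.2)  # stand-in for the slow DB insert

def parallel_process():
    # start() always returns None, so keep references to the
    # Process objects and call join() on those.
    adder = multiprocessing.Process(target=add_entries)
    updater = multiprocessing.Process(target=update_db)
    reporter = multiprocessing.Process(target=get_status)

    adder.start()
    adder.join()  # wait for the slow insert before the others run

    updater.start()
    reporter.start()
    for p in (updater, reporter):
        p.join()  # reap each child so none is left <defunct>

if __name__ == "__main__":
    parallel_process()
```

Joining `adder` before starting the others serializes the DB insert ahead of the readers; if the two later steps only need the data, this ordering removes the wait-on-DB contention described above.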

