Local variable not updated in a loop in the same way as shared memory objects in Python


In the following Python code, the multiprocessing module starts three processes that print out the values of one local variable and two multiprocessing shared memory objects.

import multiprocessing as mp
import os,time

# local variable
count = 0

# shared memory objects (int and array)
scalar = mp.Value('i', 0)
vector = mp.Array('d', 3)

def showdata(label, val, arr):
    print(label, "==> PID:", os.getpid(), ", count:", count, ", int:", val.value, ", arr:", list(arr))

ps = []
for i in range(3):
    count += 1
    scalar.value += 1
    vector[i] += 1
    p = mp.Process(target=showdata, args=('process %s' % i, scalar, vector))
    p.start()
    ps.append(p)
    # time.sleep(.1)

# block the main thread until all processes have finished...
for p in ps:
    p.join()

The output for this code is the following...

process 0 ==> PID: 35499 , count: 1 , int: 3 , arr: [1.0, 1.0, 1.0]
process 1 ==> PID: 35500 , count: 2 , int: 3 , arr: [1.0, 1.0, 1.0]
process 2 ==> PID: 35501 , count: 3 , int: 3 , arr: [1.0, 1.0, 1.0]

If I change the code to add a delay by uncommenting the time.sleep(0.1) call, then the output changes to the following:

process 0 ==> PID: 35499 , count: 1 , int: 1 , arr: [1.0, 0.0, 0.0]
process 1 ==> PID: 35500 , count: 2 , int: 2 , arr: [1.0, 1.0, 0.0]
process 2 ==> PID: 35501 , count: 3 , int: 3 , arr: [1.0, 1.0, 1.0]

It makes sense that without any delay (i.e., the first output above) the shared memory objects show the same values for all three processes: once the processes are started, the "for" loop completes rapidly and updates the shared objects' values before the separate processes can run their target "showdata" functions.

However, I am not seeing why the local "count" variable is allowed to update incrementally. I would expect it to be treated like the shared memory objects, where without any delays, the count would be incremented three times rapidly before the "showdata" functions run in the separate processes. By this logic "count" should have a value of 3 for all three processes.

Can anyone explain why this is not occurring?

This is running in Python 3.4.3 in OS X 10.10.3.


There are 3 answers

Anand S Kumar (best answer)

I believe that is because mp.Process starts a new process for each of the functions (note that this is not a new thread of the same process; it is a completely new process with its own PID, as you can see in your output), and each process has its own memory (stack/heap). Local variables are stored in each process's own memory, so when a particular process runs it accesses its own copy, which contains the value that count had when the process was started.

But shared memory created by the multiprocessing module is shared between the parent process and each of the child processes spawned by multiprocessing.Process.
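
To make the contrast concrete, here is a minimal sketch (not the questioner's code) in which count is also made a shared Value. Like the original script, it assumes the fork start method that Python 3.4 uses by default on OS X and Linux, so the children inherit these module-level globals:

import multiprocessing as mp
import os

# Hypothetical variant of the question's script: 'count' is now a shared
# Value, so it lives in shared memory instead of each child's private copy.
# Like the original, this relies on the fork start method (the default on
# OS X / Linux in Python 3.4), so the children inherit these globals.
count = mp.Value('i', 0)
scalar = mp.Value('i', 0)

def showdata(label):
    print(label, "==> PID:", os.getpid(),
          ", count:", count.value, ", int:", scalar.value)

ps = []
for i in range(3):
    count.value += 1
    scalar.value += 1
    p = mp.Process(target=showdata, args=('process %s' % i,))
    p.start()
    ps.append(p)

for p in ps:
    p.join()

Without any delay, all three children will then typically print count: 3, mirroring the shared int, because they now read the same shared memory instead of a private copy.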

Blckknght

The differing behaviors you see from your code occur because it has a race condition. The race is between the main process updating the shared memory values and each subprocess printing out those values. Depending on the timing of the various parts of the code you might get any of several different results (the two you show are the extreme cases, and probably the easiest ones to get, but it's not impossible for some but not all of the subprocesses to see the shared data while it is only partially updated). Try adding a random delay into the showdata function and you might get some more variations.
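
As a rough sketch of that suggestion (a hypothetical drop-in replacement for showdata in the question's script, not code taken from it):

import os, random, time

# Hypothetical drop-in replacement for showdata: sleep a random 0-200 ms
# before reading, so each child samples the shared objects at an
# unpredictable point relative to the parent's loop.
def showdata(label, val, arr):
    time.sleep(random.uniform(0, 0.2))
    print(label, "==> PID:", os.getpid(),
          ", count:", count, ", int:", val.value, ", arr:", list(arr))

Repeated runs should then produce a wider mix of intermediate int and arr values than the two extremes shown in the question.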

The reason the "local" variable (which is actually a global variable, but that's not really important) doesn't behave the same way is that it gets copied into each subprocess's memory when the subprocess is created. There's no race condition there, as it is not possible for the parent process to run ahead and change the value again before the subprocess has received its copy.
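
One way to make that snapshot behavior explicit is to pass count through args; the following is a hypothetical variation on the question's script (again assuming the fork start method), where the args tuple is evaluated in the parent when the Process object is created:

import multiprocessing as mp
import os

# Hypothetical variation: pass the current value of count explicitly.
# As with the original script, this assumes the fork start method.
count = 0
scalar = mp.Value('i', 0)

def showdata(label, count_snapshot, val):
    print(label, "==> PID:", os.getpid(),
          ", count:", count_snapshot, ", int:", val.value)

ps = []
for i in range(3):
    count += 1
    scalar.value += 1
    # args is built right here in the parent, so the child gets a snapshot
    # of count, while val still refers to the shared memory object.
    p = mp.Process(target=showdata, args=('process %s' % i, count, scalar))
    p.start()
    ps.append(p)

for p in ps:
    p.join()

The children then print count values of 1, 2 and 3 regardless of timing, just as the implicit copy of the plain global already does, while the shared Value remains subject to the race.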

Gecko

The non-shared variables are copied when you create the new process, so the for loop does not continue until this copy has been made.