I am trying to multiprocess a function with constant arguments as well as varying ones.

I have tried building a large iterable that prepends the constant arguments to each combination of the varying ones and passing that to Pool.starmap(), but this seems messy.

from multiprocessing import Pool
import itertools

def combine_iterables(foo, bar, iterable_list):
    foobar = (foo, bar)
    arg_list = list(itertools.product(*iterable_list))

    for i in range(len(arg_list)):
        arg_list[i] = foobar+arg_list[i]

    return arg_list
    # Returns [('foo', 'bar', 10, 140), ('foo', 'bar', 10, 150), ...]

def process(foo, bar, small=[10,20,30], big=[140, 150, 160]):
    iterable_list = [small,big]
    p = Pool()
    data = p.starmap(
                func,
                combine_iterables(foo,bar,iterable_list)
                )
    p.close()
    p.join()


def func(foo, bar, small, big):
    # Do stuff
    pass

This method works, but it feels really messy. Is there a better way of doing this?

2 Answers

1
Aaron (accepted)

This can be done quite easily with functools.partial:

from multiprocessing import Pool
import itertools
import functools

def func(foo, bar, small, big):
    return [foo, bar, small, big]

def process(foo, bar, small=[10, 20, 30], big=[140, 150, 160]):
    iterable_list = [small, big]
    p = Pool()
    # partial freezes foo and bar; starmap then unpacks each
    # (small, big) pair from the product as the remaining arguments.
    data = p.starmap(
                functools.partial(func, foo, bar),
                itertools.product(*iterable_list)
                )
    p.close()
    p.join()
    print(data)

if __name__ == '__main__':
    process('foo', 'bar')
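
For reference, partial(func, foo, bar) returns a callable with foo and bar already bound, so each tuple from itertools.product only has to supply the varying arguments. A minimal sketch of that equivalence, run serially without a pool:

import functools
import itertools

def func(foo, bar, small, big):
    return [foo, bar, small, big]

bound = functools.partial(func, 'foo', 'bar')

# The constants are baked in; each call supplies only the varying arguments.
assert bound(10, 140) == func('foo', 'bar', 10, 140)

# The same argument tuples starmap would consume, iterated serially:
results = [bound(*args) for args in itertools.product([10, 20, 30], [140, 150, 160])]
print(results[:2])  # [['foo', 'bar', 10, 140], ['foo', 'bar', 10, 150]]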
2
blhsing

You can use functools.partial to create a callable that pre-binds the constant arguments of another function:

from multiprocessing import Pool
from functools import partial
from itertools import product

def process(foo, bar, small=[10, 20, 30], big=[140, 150, 160]):
    iterable_list = [small, big]
    # starmap blocks until all results are in; the with block
    # cleans up the pool on exit.
    with Pool() as p:
        data = p.starmap(partial(func, foo, bar), product(*iterable_list))
    return data
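
A hypothetical driver for this version; the func body below is just an illustration echoing the first answer, and the __main__ guard matters on start methods that spawn fresh interpreters (Windows, and macOS by default):

def func(foo, bar, small, big):
    return [foo, bar, small, big]

if __name__ == '__main__':
    # 'foo' and 'bar' are bound once; the 3 x 3 product of
    # small and big values yields nine calls across the pool.
    print(process('foo', 'bar'))
    # [['foo', 'bar', 10, 140], ['foo', 'bar', 10, 150], ...]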