"ValueError: Empty module name" when using pathos.multiprocessing


As a preface: I'm using Monte Carlo Tree Search (MCTS) to run a model-based reinforcement learning task. I have an agent foraging in a discrete environment, where the agent can see out some number of spaces around it (I'm assuming perfect knowledge of its observation space for simplicity, so the observation is the same as the state). The agent has an internal transition model of the world represented by an MLP (built with tf.keras). For each step in the tree, I use the model to predict the next state given the action, and the agent calculates how much reward it would receive based on the predicted change in state. From there it's the familiar MCTS algorithm: selection, expansion, rollout, and backpropagation.
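For concreteness, the tree phases look roughly like this (a toy sketch on a trivial random-walk reward — none of these names are from my actual code):

```python
import math, random

# Toy UCT sketch: state is an integer, actions add -1/+1, and the "reward"
# is simply the final state after a fixed horizon. Node, uct_select, and
# rollout are illustrative names, not from the real project.

class Node:
    def __init__(self, state, depth, parent=None):
        self.state, self.depth, self.parent = state, depth, parent
        self.children = {}           # action -> Node
        self.visits, self.value = 0, 0.0

ACTIONS, HORIZON = (-1, 1), 5

def uct_select(node, c=1.4):
    # pick the child maximizing the UCB1 score
    return max(node.children.values(),
               key=lambda ch: ch.value / ch.visits
               + c * math.sqrt(math.log(node.visits) / ch.visits))

def rollout(state, depth):
    # random playout to the horizon; reward = final state
    while depth < HORIZON:
        state += random.choice(ACTIONS)
        depth += 1
    return state

def mcts(root, iters=200):
    for _ in range(iters):
        node = root
        # selection: descend through fully expanded nodes
        while len(node.children) == len(ACTIONS):
            node = uct_select(node)
        # expansion: add one untried action (unless at the horizon)
        if node.depth < HORIZON:
            a = random.choice([a for a in ACTIONS if a not in node.children])
            node.children[a] = Node(node.state + a, node.depth + 1, node)
            node = node.children[a]
        # rollout + backpropagation
        reward = rollout(node.state, node.depth)
        while node:
            node.visits += 1
            node.value += reward
            node = node.parent
    # act greedily by visit count at the root
    return max(root.children, key=lambda a: root.children[a].visits)

best = mcts(Node(0, 0))
```

In the real task the rollout step queries the learned tf.keras transition model instead of a known simulator, but the control flow is the same.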

I'd like to run multiple trials of this in parallel to save time. I first tried vanilla multiprocessing, but it uses pickle, which can't serialize a lot of things (including my code). I'm now using pathos.multiprocessing, which apparently solves this problem because it uses dill instead. However, when I run my code, instead of the "cannot pickle" errors I got with vanilla multiprocessing, I get this (sorry for the length of the trace — I'd cut some of it out, but I'm not sure what's relevant, so maybe just scroll to the bottom):


Traceback (most recent call last):
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/site-packages/multiprocess/pool.py", line 424, in _handle_tasks
    put(task)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/site-packages/multiprocess/connection.py", line 209, in send
    self._send_bytes(_ForkingPickler.dumps(obj))
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/site-packages/multiprocess/reduction.py", line 54, in dumps
    cls(buf, protocol, *args, **kwds).dump(obj)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/site-packages/dill/_dill.py", line 446, in dump
    StockPickler.dump(self, obj)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 409, in dump
    self.save(obj)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 751, in save_tuple
    save(element)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 736, in save_tuple
    save(element)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 736, in save_tuple
    save(element)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 736, in save_tuple
    save(element)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 736, in save_tuple
    save(element)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 751, in save_tuple
    save(element)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 521, in save
    self.save_reduce(obj=obj, *rv)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 634, in save_reduce
    save(state)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/site-packages/dill/_dill.py", line 933, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 821, in save_dict
    self._batch_setitems(obj.items())
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 847, in _batch_setitems
    save(v)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/site-packages/dill/_dill.py", line 1119, in save_instancemethod0
    pickler.save_reduce(MethodType, (obj.__func__, obj.__self__), obj=obj)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 610, in save_reduce
    save(args)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 736, in save_tuple
    save(element)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/site-packages/dill/_dill.py", line 1408, in save_function
    if not _locate_function(obj): #, pickler._session):
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/site-packages/dill/_dill.py", line 856, in _locate_function
    found = _import_module(obj.__module__ + '.' + obj.__name__, safe=True)
  File "/Users/~/anaconda3/envs/discrete_foraging/lib/python3.6/site-packages/dill/_dill.py", line 847, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
ValueError: Empty module name
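For what it's worth, plain pickle already chokes on simple things like lambdas, which is why I reached for dill (via pathos) in the first place — a quick sanity check:

```python
import pickle

# Plain pickle cannot serialize a lambda (or most nested functions/closures);
# dill can, which is why pathos swaps it in under the hood.
fn = lambda x: x + 1
try:
    pickle.dumps(fn)
    failed = False
except (pickle.PicklingError, AttributeError):
    failed = True
print(failed)  # True: pickle refuses the lambda
```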

I assume it may have something to do with why it can't be pickled under vanilla multiprocessing, but I'm not sure. Here's the relevant part of the code I'm trying to run:



from pathos.multiprocessing import ProcessingPool

def trial_runner(args):
    # the function I'm trying to parallelize
    ...

if __name__ == '__main__':

    # generate the environment
    env = MultiAgentEnv()

    # acquire some sample data

    # create a list of tuple of arguments of length equal to the number of trials 
    input_data = [
        (env, env.world, env.world.agent, env.world.agent.input_elev_memory, env.world.agent.input_food_memory,
         env.world.agent.input_energy_memory, env.world.agent.input_action_memory,
         env.world.agent.output_elev_memory,
         env.world.agent.output_food_memory, env.world.agent.output_history) for _ in range(ep.num_trials)]

    # run the pool 
    results = ProcessingPool().map(trial_runner, input_data)

trial_runner is the function I'm trying to parallelize, which runs the algorithm as described in the preface. Two thoughts I had were:

  1. It may be that it is too complicated to parallelize since it invokes a number of classes and class functions.
  2. They all import the same file (which contains some variable values) and maybe the different processes are getting confused.
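One workaround I'm considering if the objects themselves just won't serialize: pass only small picklable values to each worker and rebuild the env/agent inside the worker process. A rough sketch with stdlib multiprocessing and made-up names (make_env is a stand-in for my MultiAgentEnv setup):

```python
import multiprocessing as mp

# Sketch: instead of shipping live env/agent objects (and their tf.keras
# model) to each worker, send only small picklable values and construct
# the heavy objects worker-side. All names here are illustrative.

def make_env(seed, grid_size):
    # stand-in for MultiAgentEnv(); built inside the worker, never pickled
    return {"seed": seed, "grid_size": grid_size}

def trial_runner(args):
    seed, grid_size = args           # plain ints: always picklable
    env = make_env(seed, grid_size)  # env/model constructed here
    return env["seed"] * env["grid_size"]  # placeholder for the trial result

if __name__ == "__main__":
    inputs = [(seed, 10) for seed in range(4)]
    with mp.Pool(2) as pool:
        print(pool.map(trial_runner, inputs))  # [0, 10, 20, 30]
```

The downside is each worker pays the environment/model construction cost, but nothing unpicklable ever crosses a process boundary.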

Any help would be greatly appreciated.
