How do I get Rllib to use the GPU on my MacBook Pro?


I have a MacBook Pro with the Apple M3 Pro chip running macOS 14.1.2. I am using RLlib to train reinforcement learning models, following the Getting Started with RLlib instructions. I want to use this machine's GPU for training, but I can't figure out how to enable it.

I have both tensorflow and torch installed, and both can see the GPU:

>>> import tensorflow as tf
>>> tf.config.list_physical_devices()
[PhysicalDevice(name='/physical_device:CPU:0', device_type='CPU'),
 PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]

>>> import torch
>>> torch.backends.mps.is_available()
True

I am running the following command:

rllib train --algo DQN --env CartPole-v1 --stop '{"training_iteration": 30}'

This trains a model, but I see the following line in the output:

Logical resource usage: 1.0/12 CPUs, 0/0 GPUs

and Activity Monitor shows no GPU being used for this process.

I tried adding --ray-num-gpus 1 to this command. The output now reads:

Logical resource usage: 1.0/12 CPUs, 0/1 GPUs

but Activity Monitor still shows no GPU usage.
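
In case it matters, I also tried the equivalent setup through RLlib's Python API. This is only a sketch of what I ran; I am assuming that the .resources(num_gpus=1) option on the algorithm config corresponds to the --ray-num-gpus flag on the CLI (the exact config methods may differ across Ray versions):

from ray.rllib.algorithms.dqn import DQNConfig

# Assumption: num_gpus here requests one GPU for the trainer process,
# the same resource the --ray-num-gpus CLI flag is supposed to reserve.
config = (
    DQNConfig()
    .environment("CartPole-v1")
    .resources(num_gpus=1)
)
algo = config.build()  # builds the DQN algorithm with the above resources
result = algo.train()  # runs one training iteration

This reserves the logical GPU in Ray's resource accounting, but as with the CLI run, Activity Monitor shows no actual GPU activity.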

How do I get RLlib to use my laptop's GPU during training?
