Can I clear up GPU VRAM in Colab?


I'm trying to use aitextgen to fine-tune the 774M GPT-2 model on a dataset. Unfortunately, no matter what I do, training fails because only 80 MB of VRAM are available. How can I clear the VRAM without restarting the runtime, and perhaps prevent it from filling up in the first place?


There are 2 answers

Joyanta J. Mondal On
  1. Run the command !nvidia-smi in a notebook cell.
  2. In the process table it prints, find the process ID (PID) of the process whose GPU memory you no longer need, then run the command !kill process_id

This should free up the VRAM.
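The steps above can be sketched in code. The snippet below parses the PID column out of the process table that nvidia-smi prints; the sample table here is illustrative (the PIDs 4321 and 8765 are made up), and the regular expression assumes the table layout of recent nvidia-smi versions.

```python
import re

# Illustrative excerpt of the process table nvidia-smi prints (not real output)
sample = """\
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|    0   N/A  N/A      4321      C   python3                          10240MiB |
|    0   N/A  N/A      8765      C   /usr/bin/python                   2048MiB |
"""

def gpu_pids(nvidia_smi_text):
    """Extract the PIDs of compute ('C') processes from nvidia-smi's table."""
    pids = []
    for line in nvidia_smi_text.splitlines():
        # columns: GPU index, GI, CI, PID, then the 'C' (compute) type marker
        m = re.match(r"\|\s+\d+\s+\S+\s+\S+\s+(\d+)\s+C\s", line)
        if m:
            pids.append(int(m.group(1)))
    return pids

print(gpu_pids(sample))  # -> [4321, 8765]
```

Once you have the PID, killing it from a notebook cell with !kill (or !kill -9) releases the VRAM that process was holding.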

Joyanta J. Mondal On

Another option is the following code snippet.

  1. Install numba:

!pip install numba

  2. Then, after your code has run, release the GPU:

from numba import cuda
# all of your code and execution
cuda.select_device(0)  # select GPU 0
cuda.close()           # tear down the CUDA context, freeing its VRAM

Your problem is discussed in Tensorflow official github. https://github.com/tensorflow/tensorflow/issues/36465
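As an alternative to tearing down the CUDA context, here is a hedged sketch for the asker's situation: aitextgen trains with PyTorch, and in PyTorch, dropping references to models/tensors and then emptying the CUDA cache often frees VRAM without resetting the device. The torch import is optional here so the snippet still runs where torch is not installed; torch.cuda.empty_cache() and torch.cuda.is_available() are standard PyTorch API.

```python
import gc

# torch is optional: fall back gracefully if PyTorch is not installed
try:
    import torch
except ImportError:
    torch = None

def free_torch_vram():
    """Collect unreachable Python objects, then release PyTorch's cached CUDA blocks."""
    collected = gc.collect()
    if torch is not None and torch.cuda.is_available():
        # returns cached GPU memory to the driver; only frees unreferenced blocks
        torch.cuda.empty_cache()
    return collected  # number of objects the garbage collector reclaimed

free_torch_vram()
```

Note that empty_cache() can only return memory that is no longer referenced, so `del` your model and tensor variables first.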

Update: @alchemy reported that after cuda.close() the GPU cannot be turned back on in the same runtime. You can try the code below instead.

device = cuda.get_current_device()  # handle to the active GPU
device.reset()                      # reset it, freeing its allocated VRAM