TensorFlow does not allocate tensors/ops to all available GPUs


I have two GPUs: a 6 GB GTX Titan and an 11 GB GTX 1080 Ti.

Expectation: TensorFlow should automatically use the memory of all available GPUs.

Reality: TensorFlow maps the two devices as gpu:0 and gpu:1, but it only ever uses gpu:0 and never gpu:1. When I increase the memory demand, it runs into an OutOfMemory exception without touching the memory of gpu:1.

What I want is to use the combined 11 + 6 = 17 GB of memory across the two devices. Could it be that TensorFlow only supports identical GPU models, and falls back to a single device when the types differ?
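For context: TensorFlow does not pool memory across GPUs into one 17 GB space. Each tensor lives on the device its op was placed on, and by default ops go to GPU:0; the second GPU is only used if ops are placed there explicitly. A minimal TensorFlow 2.x sketch of explicit placement (the device names and tensor sizes here are illustrative; with eager execution's default soft placement, the code falls back to CPU on machines without a second GPU):

```python
import tensorflow as tf

# Devices TensorFlow can see; with two cards this typically lists GPU:0 and GPU:1.
gpus = tf.config.list_logical_devices('GPU')
print(gpus)

# By default, ops land on GPU:0. To use GPU:1's memory, the placement
# must be explicit:
with tf.device('/GPU:1'):
    w = tf.random.uniform([1000, 1000])  # allocated on the second card, if present

print(w.device)  # shows where the tensor actually ended up
```

Note that this only moves individual tensors; a single op still cannot spill its allocation across both cards.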


1 Answer

Answered by fuzhi:

I think using two GPUs of different types for speed-up is bad practice, since TensorFlow won't automatically place ops across different devices. Instead, I should keep and synchronize the parameters on the CPU, and distribute different slices of the data to the GPUs for asynchronous multi-GPU training.
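The pattern the answer describes — one master copy of the parameters on the CPU, each GPU working on its own shard of the batch, gradients combined back into the master copy — can be sketched device-agnostically. The snippet below is a plain-NumPy stand-in for that data-parallel loop, using a hypothetical linear model; the inner loop runs sequentially here, where on real hardware each iteration would be a GPU replica running in parallel:

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.zeros(4)                      # master parameter copy (lives on the CPU)
x = rng.normal(size=(8, 4))          # one global batch
y = x @ np.array([1.0, -2.0, 3.0, 0.5])

num_replicas = 2                     # one replica per GPU
lr = 0.1
for x_i, y_i in zip(np.array_split(x, num_replicas),
                    np.array_split(y, num_replicas)):
    # Each replica computes gradients on its own shard of the batch...
    pred = x_i @ w
    grad = 2.0 * x_i.T @ (pred - y_i) / len(y_i)
    # ...and the CPU copy applies the update (here scaled per replica).
    w -= lr * grad / num_replicas

print(w)
```

This is the data-parallel idea in miniature: splitting the *batch* across devices works even with mismatched GPUs, whereas splitting a single op's *memory* across them does not.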