How to set the maximum GPU memory used per device when using DeepSpeed for distributed training?


I am new to DeepSpeed, though I have some experience in deep learning. I want to know how to set the maximum GPU memory that each device is allowed to use when training with DeepSpeed.

So far I have not tried anything concrete and have no idea where to start. The only related setting I have come across is PyTorch's per-process memory cap (sketched below), but I do not know whether it is the right mechanism to use with DeepSpeed.
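
This is just a sketch of what I mean by "max GPU memory per device": PyTorch's `torch.cuda.set_per_process_memory_fraction` caps the caching allocator for one process on one GPU. The fraction value and the use of `LOCAL_RANK` are my own guesses for a typical one-GPU-per-rank DeepSpeed launch, not something I have verified works with DeepSpeed:

```python
import os
import torch

# Sketch: cap this process's CUDA caching allocator on its local GPU.
# In a typical DeepSpeed launch each rank owns one GPU, identified by LOCAL_RANK.
local_rank = int(os.environ.get("LOCAL_RANK", 0))
torch.cuda.set_device(local_rank)

# Allow this process to use at most ~80% of the ~46 GB card (illustrative value).
# Allocations beyond the cap should raise an out-of-memory error rather than
# silently exceeding the limit.
torch.cuda.set_per_process_memory_fraction(0.8, device=local_rank)
```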

My GPUs have about 46 GB of memory each, and I want to run LLaMA on long inputs. My maximum input length is about 8000-10000 tokens, while the default LLaMA sequence length is 2048, which cannot support my task.
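
To give an idea of the setup, below is roughly the kind of DeepSpeed configuration I expect I would pass to `deepspeed.initialize`, using ZeRO stage 3 with CPU offload to lower per-GPU memory so longer sequences fit on the 46 GB cards. The specific values (batch size, accumulation steps, offload targets) are my own guesses, not a tested recipe:

```python
import deepspeed

# Illustrative DeepSpeed config (values are guesses, not a working recipe).
# ZeRO stage 3 partitions parameters, gradients, and optimizer states across
# GPUs, and the offload entries move parameters and optimizer states to CPU
# memory, which is the main lever I know of for reducing per-GPU memory use.
ds_config = {
    "train_micro_batch_size_per_gpu": 1,
    "gradient_accumulation_steps": 8,
    "bf16": {"enabled": True},
    "zero_optimization": {
        "stage": 3,
        "offload_param": {"device": "cpu"},
        "offload_optimizer": {"device": "cpu"},
    },
}

# `model` would be the long-context LLaMA model I am trying to train:
# model_engine, optimizer, _, _ = deepspeed.initialize(
#     model=model,
#     model_parameters=model.parameters(),
#     config=ds_config,
# )
```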
