Is it OK to use Hugging Face with PyTorch Lightning?

I have some Hugging Face models that I'd like to fine-tune with PEFT LoRA. I'd also like to use PyTorch Lightning Fabric for FSDP distributed training, but I'm not sure whether the two are compatible. Does anyone have experience using them together?
You don't need Lightning Fabric just to get FSDP; you can stay within the Hugging Face ecosystem and use their Accelerate package. In the Accelerate config, set `distributed_type` to `FSDP` and provide an `fsdp_config`. Take a look at the full documentation here.
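For reference, a generated config looks roughly like the sketch below. This is an illustrative example, not the exact output of `accelerate config`: key names and accepted values vary across Accelerate versions, and the `num_processes`, sharding, and precision settings here are placeholder choices.

```yaml
# fsdp_config.yaml (hypothetical filename) -- typically produced by running `accelerate config`
compute_environment: LOCAL_MACHINE
distributed_type: FSDP          # tells Accelerate to wrap the model with PyTorch FSDP
fsdp_config:
  fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP   # wrap each transformer block as an FSDP unit
  fsdp_sharding_strategy: FULL_SHARD              # shard params, grads, and optimizer state
  fsdp_state_dict_type: SHARDED_STATE_DICT        # save per-rank shards rather than one full state dict
  fsdp_offload_params: false                      # keep parameters on GPU
mixed_precision: bf16
machine_rank: 0
num_machines: 1
num_processes: 8                # one process per GPU; adjust to your hardware
main_training_function: main
```

You would then launch your training script with something like `accelerate launch --config_file fsdp_config.yaml train.py` (the script name is hypothetical). Since PEFT LoRA models are ordinary `torch.nn.Module`s, they slot into this setup, and the PEFT documentation has a dedicated guide on combining LoRA with FSDP.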