TechQA Question List (2024-03-22)

How to prevent DataCollatorForLanguageModeling from using input_ids as labels in CLM tasks?
11 views
Asked by Kirk Walla
"Repo id must use alphanumeric chars" error while performing auto-training on an LLM
36 views
Asked by Ankur Kumar
Upgrading accelerate while using Trainer class
32 views
Asked by 0xD4rky
How can I get the class token out of the output_hidden_states?
12 views
Asked by Mohammad Elghandour
Can't run fine-tuning for LLaMA 7B with LoRA (OOM)
45 views
Asked by CyBer CyBer
HuggingFace Trainer starts distributed training twice
33 views
Asked by Florian Rudaj
'CTCTrainer' object has no attribute 'use_amp'
43 views
Asked by stanley101
Why does Seq2SeqTrainer produce an error during evaluation when using T5?
34 views
Asked by Kirk Walla
Getting a "torch.distributed.elastic.multiprocessing.errors.ChildFailedError" when using Accelerate
263 views
Asked by Dia-di
How to make the Trainer (transformers) load the data batch by batch during training?
18 views
Asked by liam jrj
Hugging Face Seq2SeqTrainer freezes during evaluation
65 views
Asked by InvalidHop
Why am I unable to import the trl package in Jupyter?
168 views
Asked by Srinivas
Plotting training accuracy and loss with Trainer
38 views
Asked by Rakha
Fine-tuning a model on sequences longer than the max sequence input length
125 views
Asked by WackMingo
While using the Seq2SeqTrainingArguments class, this error is displayed: Using the `Trainer` with `PyTorch` requires `accelerate>=0.21.0`
133 views
Asked by Disha Dinesh Agarwal
Fine-tuning a model with add_adapter=True resulted in training loss=0, validation loss=nan
23 views
Asked by Natali Gzraryan
How can I train an LLM through Hugging Face using all the computational power at my disposal?
102 views
Asked by Giacomo Saracchi
Save and load the nsql-llama-2-7B AutoModelForCausalLM model
133 views
Asked by or b
Cannot change training arguments when resuming from a checkpoint
142 views
Asked by Jon Flynn
How to adapt the code below for multi-task learning with BERT
41 views
Asked by Glinty