How to fix an Accelerate error reported in PyCharm: 'AcceleratorState' object has no attribute 'distributed_type'


I was fine-tuning a bert-base model using code from Hugging Face's example. I copied the code straight from the website and then this error occurred. The code and the error are listed below. Incidentally, PyCharm also tells me that the accelerator's arguments should be a list, but I pass it DataLoader objects.

Bug:
AttributeError: 'AcceleratorState' object has no attribute 'distributed_type'

Code:

    import torch
    from accelerate import Accelerator
    from torch.utils.data import DataLoader
    from transformers import AutoModelForSequenceClassification

    accelerator = Accelerator()  # checkpoint, datasets, data_collator are defined earlier in the example
    model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)
    train_dataloader = DataLoader(datasets["train"], shuffle=True, batch_size=8, collate_fn=data_collator)
    eval_dataloader = DataLoader(datasets["validation"], batch_size=8, collate_fn=data_collator)
    optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
    model, optimizer, train_dataloader, eval_dataloader = accelerator.prepare(model, optimizer, train_dataloader, eval_dataloader)

I tried upgrading transformers and accelerate, but it didn't work. The transformers version is 4.37.2 and the accelerate version is 0.28.0.
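
As a sanity check (a minimal sketch of my own, not part of the Hugging Face example), printing the versions that the interpreter actually imports and instantiating a bare Accelerator can show whether AcceleratorState initializes correctly outside the training code:

    import accelerate
    import transformers
    from accelerate import Accelerator

    # Confirm which versions the PyCharm interpreter is really using
    print(transformers.__version__, accelerate.__version__)

    # Instantiating Accelerator populates AcceleratorState; accessing
    # distributed_type here reproduces the error if the environment is the problem
    accelerator = Accelerator()
    print(accelerator.state.distributed_type)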


There are 0 answers