How to initialize BertForSequenceClassification to use a token other than [CLS]?


BertForSequenceClassification uses the [CLS] token's representation as the input to a linear classifier. I want to use another token (say, [X]) in the input sequence instead of [CLS]. What is the most straightforward way to implement that in Transformers?
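Roughly, what I have in mind is something like this sketch (illustrative only; x_position and the two-label head are placeholders, not anything from the Transformers API):

import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
classifier = torch.nn.Linear(bert.config.hidden_size, 2)  # placeholder two-label head

inputs = tokenizer("some input containing the marker token", return_tensors="pt")
outputs = bert(**inputs)

# BertForSequenceClassification pools last_hidden_state[:, 0] (the [CLS] slot);
# instead, pick out the hidden state at the position of the token I care about.
x_position = 3  # placeholder: index of [X] in input_ids
token_repr = outputs.last_hidden_state[:, x_position]  # shape (batch, hidden_size)
logits = classifier(token_repr)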


There is 1 answer

Answered by Eric Lozano

You can define the special tokens when creating the tokenizer.

Here is an example of how to modify a special token of a pretrained tokenizer:

from transformers import BertTokenizer

# Passing cls_token at load time makes the tokenizer emit [X] wherever it would emit [CLS]
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased", cls_token="[X]")

Please check the BertTokenizer documentation to see which other special tokens you can modify (unk_token, sep_token, pad_token, cls_token, mask_token, etc.).
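For completeness, here is a minimal sketch of the end-to-end effect, assuming "[X]" is not already in BERT's vocabulary: the tokenizer registers it as a new special token, so the model's embedding matrix has to be resized to match, and the new token's embedding is randomly initialized and learned during fine-tuning.

from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased", cls_token="[X]")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# "[X]" was added to the vocabulary, so grow the embedding matrix to match;
# its embedding starts out randomly initialized
model.resize_token_embeddings(len(tokenizer))

# The tokenizer now puts [X] in position 0, the slot the model pools from
print(tokenizer.convert_ids_to_tokens(tokenizer("Hello world")["input_ids"]))
# ['[X]', 'hello', 'world', '[SEP]']

Note that this only renames the token in position 0; BertForSequenceClassification still pools the first position, so if you need the representation of a token elsewhere in the sequence, you would have to pick out that position's hidden state yourself, along the lines of the sketch in the question.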