How to fine-tune pre-trained embeddings in embedding layer?


How can I fine-tune pre-trained embeddings in an embedding layer in tf.keras?

# embedding layer with pre-trained weights
# (layers and initializers come from tensorflow.keras)
from tensorflow.keras import layers, initializers

embedding_layer = layers.Embedding(
    input_dim=self.vocab_size + 2,
    output_dim=self.emb_size,
    embeddings_initializer=initializers.Constant(embedding_matrix),
    mask_zero=mask_zero,
    trainable=False
)

If I just change trainable=True, will it fine-tune the pre-trained embeddings that I have? Or do I also have to remove initializers.Constant as the initializer?
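One way to check whether the layer would actually be updated is to inspect its trainable weights after building it. A minimal sketch, assuming TensorFlow is installed; `vocab_size`, `emb_size`, and the random `embedding_matrix` are placeholder values standing in for the real ones:

```python
import numpy as np
from tensorflow.keras import layers, initializers

vocab_size, emb_size = 10, 4
embedding_matrix = np.random.rand(vocab_size + 2, emb_size).astype("float32")

emb = layers.Embedding(
    input_dim=vocab_size + 2,
    output_dim=emb_size,
    embeddings_initializer=initializers.Constant(embedding_matrix),
    trainable=True,  # flipped from False
)
emb.build((None,))

# With trainable=True the embedding matrix shows up in
# trainable_weights, so the optimizer will update it;
# the Constant initializer only set its starting values.
print(len(emb.trainable_weights))      # number of trainable variables
print(len(emb.non_trainable_weights))  # number of frozen variables
```

With `trainable=False` the counts are reversed: the matrix moves to `non_trainable_weights` and is frozen during training.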

1 Answer

David (BEST ANSWER)

Setting trainable=True lets the embedding layer fine-tune the pre-trained weights.

The embeddings_initializer does not conflict with the trainable flag: an initializer only supplies the starting values, so you can keep initializers.Constant(embedding_matrix) and the weights will still be updated during training. Alternatively, you can drop the initializer and copy the matrix in with set_weights after building the layer. You can refer to the following links:

  1. setting the weights
  2. meaning of constant
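A minimal sketch of both ways to load the matrix, assuming TensorFlow is installed; `vocab_size`, `emb_size`, and the random `embedding_matrix` are placeholder values:

```python
import numpy as np
from tensorflow.keras import layers, initializers

vocab_size, emb_size = 10, 4
embedding_matrix = np.random.rand(vocab_size + 2, emb_size).astype("float32")

# Option 1: keep the Constant initializer and set trainable=True.
# The initializer only supplies the starting values; the optimizer
# still updates the weights during training.
emb1 = layers.Embedding(
    input_dim=vocab_size + 2,
    output_dim=emb_size,
    embeddings_initializer=initializers.Constant(embedding_matrix),
    trainable=True,
)
emb1.build((None,))
assert np.allclose(emb1.get_weights()[0], embedding_matrix)

# Option 2: build the layer first, then copy the pre-trained
# matrix in with set_weights.
emb2 = layers.Embedding(
    input_dim=vocab_size + 2,
    output_dim=emb_size,
    trainable=True,
)
emb2.build((None,))
emb2.set_weights([embedding_matrix])
assert np.allclose(emb2.get_weights()[0], embedding_matrix)
```

Either way, the layer starts from the pre-trained values and fine-tunes them; the two options differ only in when the matrix is handed to the layer.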

Regarding your comment about where the weights go, refer to the following link: Keras Embedding, where is the "weights" argument?