How can I fine-tune pre-trained embeddings in an `Embedding` layer in tf.keras?
# embedding layer with pre-trained weights
embedding_layer = layers.Embedding(
    input_dim=self.vocab_size + 2,
    output_dim=self.emb_size,
    embeddings_initializer=initializers.Constant(embedding_matrix),
    mask_zero=mask_zero,
    trainable=False,
)
If I just change `trainable=True`, will it fine-tune the pre-trained embeddings that I have? Or do I also have to remove `initializers.Constant` as the initializer?
You can refer to the following answer:

When you set `trainable` to `True`, you let the embedding layer fine-tune. You do not have to remove the initializer: `initializers.Constant(embedding_matrix)` only supplies the starting weights, and with `trainable=True` the optimizer is free to update them during training. Alternatively, instead of using the constant initializer, you can load the pre-trained vectors directly as the layer's weights with the `embedding_matrix`.
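A minimal sketch of the first approach, using a toy vocabulary size and a random matrix as stand-ins for your real `vocab_size`, `emb_size`, and pre-trained `embedding_matrix` (all hypothetical values here):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, initializers

# Toy stand-ins for the real vocabulary and pre-trained vectors.
vocab_size, emb_size = 10, 4
embedding_matrix = np.random.rand(vocab_size + 2, emb_size).astype("float32")

# Constant(...) only sets the *initial* weights; trainable=True lets the
# optimizer update them during model.fit(), i.e. fine-tuning.
embedding_layer = layers.Embedding(
    input_dim=vocab_size + 2,
    output_dim=emb_size,
    embeddings_initializer=initializers.Constant(embedding_matrix),
    trainable=True,
)

# Call the layer once so it builds, then confirm it started from the matrix.
_ = embedding_layer(tf.constant([[1, 2, 3]]))
assert np.allclose(embedding_layer.get_weights()[0], embedding_matrix)
```

Once this layer is part of a compiled model, gradient updates from `fit()` will move the embeddings away from their pre-trained starting point.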
Regarding your comment asking where the `weights` argument is, refer to the following link: Keras Embedding, where is the "weights" argument?
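If you prefer the second approach of setting the weights directly rather than going through an initializer, one way that avoids the `weights` constructor argument entirely is `set_weights` on a built layer. Again with toy stand-in values:

```python
import numpy as np
from tensorflow.keras import layers

# Toy stand-ins for the real vocabulary and pre-trained vectors.
vocab_size, emb_size = 10, 4
embedding_matrix = np.random.rand(vocab_size + 2, emb_size).astype("float32")

embedding_layer = layers.Embedding(input_dim=vocab_size + 2, output_dim=emb_size)
embedding_layer.build((None,))  # create the weight variable first
embedding_layer.set_weights([embedding_matrix])

assert np.allclose(embedding_layer.get_weights()[0], embedding_matrix)
```

Because `trainable` defaults to `True`, this layer will also fine-tune the loaded vectors during training unless you explicitly freeze it.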