I understand that setting trainable = False freezes all the weights of a specific layer. But I also want to work at the level of a layer's components (kernel, recurrent_kernel and bias): I want to set the kernel and recurrent_kernel of a specific layer to trainable = False while leaving the bias at trainable = True. How can I do that?
I tried the following code but got an error. Can anyone suggest how I can make the kernel and recurrent_kernel non-trainable using standard Keras?
#transfer model layer
lstm_layer = modelTL.layers[0]
#kernel non-trainable
lstm_layer.cell.kernel.trainable = False
This raises:
AttributeError: can't set attribute
After building the layer, you can separate its variables and decide which of them are treated as trainable or not.
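A variable's trainable flag cannot be reassigned after it is created, which is why the attribute assignment above fails. What you can do instead is filter the variables yourself and only hand the ones you want updated to the optimizer in a custom training loop. Below is a minimal sketch for TF 2.x; the model architecture, shapes, and random data are illustrative, not taken from your code:

```python
import numpy as np
import tensorflow as tf

# Illustrative toy model: an LSTM followed by a Dense head.
inputs = tf.keras.Input(shape=(5, 3))
lstm = tf.keras.layers.LSTM(8)
outputs = tf.keras.layers.Dense(1)(lstm(inputs))
model = tf.keras.Model(inputs, outputs)

# We cannot set kernel.trainable = False, but we can exclude the
# kernel and recurrent_kernel from the variables the optimizer
# updates. The LSTM bias and the Dense weights remain trainable.
frozen_ids = {id(lstm.cell.kernel), id(lstm.cell.recurrent_kernel)}
train_vars = [v for v in model.trainable_weights if id(v) not in frozen_ids]

optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.MeanSquaredError()

# Random toy data, just to run one update step.
x = np.random.rand(16, 5, 3).astype("float32")
y = np.random.rand(16, 1).astype("float32")

kernel_before = lstm.cell.kernel.numpy().copy()
rec_before = lstm.cell.recurrent_kernel.numpy().copy()

with tf.GradientTape() as tape:
    loss = loss_fn(y, model(x, training=True))
# Gradients are computed and applied only for the selected variables.
grads = tape.gradient(loss, train_vars)
optimizer.apply_gradients(zip(grads, train_vars))

# The frozen components are untouched by the update step.
assert np.allclose(kernel_before, lstm.cell.kernel.numpy())
assert np.allclose(rec_before, lstm.cell.recurrent_kernel.numpy())
```

Note that model.fit() always optimizes model.trainable_weights, so this per-variable split only takes effect in a custom loop (or in a custom train_step override that does the same filtering).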