How to access submodel layers encapsulated in a TimeDistributed layer in a Keras model


I have developed a model that extracts spatial features from video frames with VGG-19 and then extracts temporal features from those spatial features with LSTM layers. For this purpose, I have wrapped the VGG-19 network in a TimeDistributed layer. Below is the code that builds the model.

from tensorflow.keras.applications import VGG19
from tensorflow.keras.models import Sequential, Model
from tensorflow.keras.layers import TimeDistributed, GlobalAveragePooling2D, LSTM, Dense
from tensorflow.keras.optimizers import Adam

# Extract spatial features with VGG-19
vgg19_extract_features = VGG19(weights='imagenet', include_top=False, input_shape=(224, 224, 3))

# Freeze the first `frozen_layers` layers; the remaining VGG-19 layers stay trainable
for layer in vgg19_extract_features.layers[:frozen_layers]:
    layer.trainable = False

# Create the model: wrap VGG-19 in a TimeDistributed layer and add LSTM and Dense layers
model = Sequential()
model.add(TimeDistributed(vgg19_extract_features, input_shape=(sequence_length, 224, 224, 3)))
model.add(TimeDistributed(GlobalAveragePooling2D()))
model.add(LSTM(lstm_units[0], return_sequences=True))
model.add(LSTM(lstm_units[1]))
model.add(Dense(dense_units[0], activation='sigmoid'))
model.add(Dense(dense_units[1], activation='sigmoid'))
model.add(Dense(2, activation='sigmoid'))

#Compile model
optimizer = Adam(learning_rate=learning_rate)
model.compile(optimizer=optimizer, loss='binary_crossentropy', metrics=['accuracy'])
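
For reference, a quick shape sanity check on a dummy clip (a minimal sketch; sequence_length and the other hyperparameters are assumed to be defined as above) would look like this:

import numpy as np

# One dummy clip of shape (batch, frames, height, width, channels)
dummy_clip = np.zeros((1, sequence_length, 224, 224, 3), dtype='float32')
print(model.predict(dummy_clip).shape)  # expected: (1, 2)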

This model compiles without problems. However, I am now trying to create a submodel so that I can extract the output of the intermediate VGG-19 layers. To do this, we can first check the layers that make up the main model:

for layer in model.layers:
    print(layer.name)

As a result:

time_distributed
time_distributed_1
lstm
lstm_1
dense
dense_1
dense_2

And the layers within VGG-19:

for layer in model.layers[0].layer.layers:
    print(layer.name)

As a result:

input_1
block1_conv1
block1_conv2
block1_pool
block2_conv1
block2_conv2
block2_pool
block3_conv1
block3_conv2
block3_conv3
block3_conv4
block3_pool
block4_conv1
block4_conv2
block4_conv3
block4_conv4
block4_pool
block5_conv1
block5_conv2
block5_conv3
block5_conv4
block5_pool
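
Knowing these names, the inner layers can also be reached by name rather than by positional index (a small sketch; block5_conv4 is used only as an example layer):

inner_vgg = model.layers[0].layer  # the VGG-19 model wrapped by TimeDistributed
block5_conv4 = inner_vgg.get_layer('block5_conv4')
print(block5_conv4.output.shape)  # per-frame feature map, e.g. (None, 14, 14, 512)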

However, when I want to create a submodel to get the output of the VGG-19 layers with a statement like:

partial_model = Model(inputs=model.layers[0].layer.layers[0], outputs=model.layers[0].layer.layers[20].output)

I always get the same error: ValueError: Graph disconnected: cannot obtain value for tensor.

I understand that this happens because, when a model is wrapped inside a TimeDistributed layer, TensorFlow cannot access its sublayers directly. However, I have not found an alternative way to work around the problem, so I would be grateful for any possible solution or help.
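
For illustration, the kind of sub-model I am trying to obtain would look roughly like the sketch below, built on the inner VGG-19 graph and wrapped again in TimeDistributed (block5_conv4 is used only as an example intermediate layer):

from tensorflow.keras.layers import Input, TimeDistributed
from tensorflow.keras.models import Model

# Partial model over the inner VGG-19 graph; its own input/output tensors are connected
inner_vgg = model.layers[0].layer
partial_vgg = Model(inputs=inner_vgg.input,
                    outputs=inner_vgg.get_layer('block5_conv4').output)

# Apply the partial model frame by frame, mirroring the original wrapper
frames = Input(shape=(sequence_length, 224, 224, 3))
per_frame_features = TimeDistributed(partial_vgg)(frames)
partial_model = Model(inputs=frames, outputs=per_frame_features)

If something along these lines is the right direction, or if there is a better alternative, any pointer would be appreciated.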

Best regards and thanks in advance.

There are 0 answers