ValueError: Shape (None, 17) must have rank 1


I am working on a handwritten character recognition model. I created a CNN + BiLSTM + CTC loss model, but I get an error when I run model.fit(). Please help me fix this error.

My Model

    # imports (assuming tensorflow.keras here; adjust if you use standalone keras)
    import numpy as np
    from tensorflow.keras.layers import (Input, Conv2D, MaxPooling2D, BatchNormalization,
                                         Lambda, Bidirectional, LSTM, Dense)
    from tensorflow.keras.models import Model
    from tensorflow.keras import backend as K

    # input with height=32, width=128 and a single channel
    inputs = Input(shape=(32,128,1))
 
    # convolution layer with kernel size (3,3)
    conv_1 = Conv2D(64, (3,3), activation = 'relu', padding='same')(inputs)
    # pooling layer with kernel size (2,2)
    pool_1 = MaxPooling2D(pool_size=(2, 2), strides=2)(conv_1)
 
    conv_2 = Conv2D(128, (3,3), activation = 'relu', padding='same')(pool_1)
    pool_2 = MaxPooling2D(pool_size=(2, 2), strides=2)(conv_2)
 
    conv_3 = Conv2D(256, (3,3), activation = 'relu', padding='same')(pool_2)
 
    conv_4 = Conv2D(256, (3,3), activation = 'relu', padding='same')(conv_3)
    # pooling layer with kernel size (2,1)
    pool_4 = MaxPooling2D(pool_size=(2, 1))(conv_4)
 
    conv_5 = Conv2D(512, (3,3), activation = 'relu', padding='same')(pool_4)
    # Batch normalization layer
    batch_norm_5 = BatchNormalization()(conv_5)
 
    conv_6 = Conv2D(512, (3,3), activation = 'relu', padding='same')(batch_norm_5)
    batch_norm_6 = BatchNormalization()(conv_6)
    pool_6 = MaxPooling2D(pool_size=(2, 1))(batch_norm_6)
 
    conv_7 = Conv2D(512, (2,2), activation = 'relu')(pool_6)
 
    # collapse the height axis (now 1) so the feature map becomes a width-wise sequence
    squeezed = Lambda(lambda x: K.squeeze(x, 1))(conv_7)
 
    # bidirectional LSTM layers with units=128
    blstm_1 = Bidirectional(LSTM(128, return_sequences=True, dropout = 0.2))(squeezed)
    blstm_2 = Bidirectional(LSTM(128, return_sequences=True, dropout = 0.2))(blstm_1)
 
    # one unit per character plus one for the CTC blank token
    outputs = Dense(len(char_dict)+1, activation = 'softmax')(blstm_2)
 
    act_model = Model(inputs, outputs)
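
For reference, with the 32x128 input this stack should end up with 31 time steps before the softmax, so the prediction model's output should be `(None, 31, len(char_dict) + 1)`; a quick way to confirm is:

    # quick check of the prediction model's output shape (expect (None, 31, num_classes))
    act_model.summary()
    print(act_model.output_shape)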

Then I define a CTC loss model that takes the outputs of the previous model as inputs:

    # extra inputs carrying the ground-truth labels and the sequence lengths for CTC
    labels = Input(name='the_labels', shape=[max_length], dtype='float32')
    input_length = Input(name='input_length', shape=[1], dtype='int64')
    label_length = Input(name='label_length', shape=[1], dtype='int64')

    def ctc_lambda_func(args):
        y_pred, labels, input_length, label_length = args
        return K.ctc_batch_cost(labels, y_pred, input_length, label_length)

    # wrap the CTC loss in a Lambda layer so the model's output is the loss itself
    loss_out = Lambda(ctc_lambda_func, output_shape=(1,), name='ctc')(
        [outputs, labels, input_length, label_length])

    model = Model(inputs=[inputs, labels, input_length, label_length], outputs=loss_out)
    # the Lambda layer already computes the loss, so compile with a pass-through loss
    model.compile(loss={'ctc': lambda y_true, y_pred: y_pred}, optimizer='adam')
    model.fit(x=[input_array, output_array, train_input_length, train_label_length],
              y=np.zeros(input_array.shape[0]),
              batch_size=256,
              epochs=100,
              validation_data=([test_input_array, test_output_array,
                                valid_input_length, valid_label_length],
                               [np.zeros(test_input_array.shape[0])]),
              verbose=1,
              callbacks=callbacks_list)
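
As far as I understand the shapes `K.ctc_batch_cost` expects, the four arguments should look like the sketch below. This is a minimal, standalone example (the sizes are made up for illustration, not taken from my data):

    # standalone shape sketch for K.ctc_batch_cost (hypothetical sizes, not my real data)
    import numpy as np
    from tensorflow.keras import backend as K

    batch_size, time_steps, num_classes, max_length = 4, 31, 80, 17

    # y_pred: (batch, time_steps, num_classes) softmax outputs of the prediction model
    y_pred = np.random.rand(batch_size, time_steps, num_classes).astype('float32')
    # labels: (batch, max_label_length) padded label indices
    labels = np.tile(np.arange(1, max_length + 1, dtype='float32'), (batch_size, 1))
    # input_length / label_length: (batch, 1) column vectors
    input_length = np.full((batch_size, 1), time_steps, dtype='int64')
    label_length = np.full((batch_size, 1), max_length, dtype='int64')

    print(K.ctc_batch_cost(labels, y_pred, input_length, label_length).shape)  # (4, 1)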

The error I am getting is

      ValueError: Shape (None, 17) must have rank 1
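
The shapes I believe the Input layers above expect for the arrays passed to model.fit() are listed in the comments below; a quick way to print what they actually are:

    # expected shapes given the Input layers above (max_length appears to be 17 here)
    #   input_array        -> (num_samples, 32, 128, 1)
    #   output_array       -> (num_samples, max_length)
    #   train_input_length -> (num_samples, 1)
    #   train_label_length -> (num_samples, 1)
    for name, arr in [('input_array', input_array),
                      ('output_array', output_array),
                      ('train_input_length', train_input_length),
                      ('train_label_length', train_label_length)]:
        print(name, np.asarray(arr).shape)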

There are 0 answers