I am receiving the following error when attempting to fit my model for training:

Can not squeeze dim[2], expected a dimension of 1, got 9 [Op:Squeeze]

The model runs on a single sample batch, but throws the error when I try to fit the entire dataset.

batch_size = 64
vocab_size = 42
embedding_dim = 256
rnn_units = 1024
steps_per_epoch = examples_per_epoch//BATCH_SIZE

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(64, 900), batch_input_shape=[batch_size, None]),
    tf.keras.layers.Embedding(vocab_size, embedding_dim,
                              batch_input_shape=[batch_size, None]),
    rnn(rnn_units,
        return_sequences=True,
        recurrent_initializer='glorot_uniform',
        stateful=True),
    tf.keras.layers.Dense(vocab_size),
])

model.compile(
    optimizer=tf.train.AdamOptimizer(),
    loss=tf.losses.sparse_softmax_cross_entropy)



model.fit(dataset.repeat(), epochs=EPOCHS, steps_per_epoch=steps_per_epoch, callbacks=[checkpoint_callback])
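
For context, the single-batch check that does run without error looks roughly like this (a rough sketch; the exact variable names in my notebook differ):

# Rough sketch of the single-batch check that works.
# `dataset` yields (input, target) pairs; names here are approximate.
for input_example_batch, target_example_batch in dataset.take(1):
    example_batch_predictions = model(input_example_batch)
    print(example_batch_predictions.shape)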

Each batch input has the shape (64, 100, 9). I suspect the issue is somewhere in or around where I have flattened this:

tf.keras.layers.Flatten(input_shape=(64, 900), batch_input_shape=[batch_size, None]),
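
To be explicit about what I mean by flattening: the intent is to collapse the (100, 9) part of each input batch into a single axis of length 900, something like the sketch below (this is just the intent, not code that is currently in the pipeline):

# Intended effect of the Flatten step on one input batch of shape (64, 100, 9):
# collapse the last two axes into a single axis of length 100 * 9 = 900.
flat_batch = tf.reshape(input_example_batch, [batch_size, -1])
print(flat_batch.shape)  # expected: (64, 900)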

Any suggestions would be greatly appreciated. Thanks!
