Wrong dense layer output shape after moving from TF 1.12 to 1.10


I'm migrating from TensorFlow 1.12 to TensorFlow 1.10 (Colaboratory -> AWS SageMaker). The code works fine under TensorFlow 1.12, but under 1.10 I get:

ValueError: Error when checking target: expected dense to have 2 dimensions, but got array with shape (52692,)

Input example - strings with no whitespaces:

["testAbc", "aaDD", "roam"]

which I preprocess by mapping lowercase letters to 1, uppercase letters to 2, digits to 3, '-' to 4, '_' to 5, and right-padding with 0s so all words have equal length,

and four labels: a - 0, b - 1, c - 2, d - 3.

Assuming a max length of 10 per word (in my actual code it's 20):

features - [[1 1 1 1 2 1 1 0 0 0][1 1 2 2 0 0 0 0 0 0][1 1 1 1 0 0 0 0 0 0]]

labels - [1, 1, 2, 3]

expected output: [a: 0%, b: 0%, c: 1%, d: 99%] (example)
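The encoding described above can be sketched as a small helper (a sketch; `encode` is a hypothetical name, not part of the original code):

```python
import numpy as np

def encode(word, max_len=10):
    """Map lowercase -> 1, uppercase -> 2, digit -> 3, '-' -> 4, '_' -> 5,
    then right-pad with 0 up to max_len."""
    special = {"-": 4, "_": 5}
    codes = []
    for ch in word:
        if ch.islower():
            codes.append(1)
        elif ch.isupper():
            codes.append(2)
        elif ch.isdigit():
            codes.append(3)
        else:
            codes.append(special.get(ch, 0))
    codes = codes[:max_len]
    return codes + [0] * (max_len - len(codes))

features = np.array([encode(w) for w in ["testAbc", "aaDD", "roam"]])
print(features)
# [[1 1 1 1 2 1 1 0 0 0]
#  [1 1 2 2 0 0 0 0 0 0]
#  [1 1 1 1 0 0 0 0 0 0]]
```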

model = keras.Sequential()
model.add(keras.layers.Embedding(6, 8, input_length=maxFeatureLen))
model.add(keras.layers.LSTM(12))
model.add(keras.layers.Dense(4, activation=tf.nn.softmax))
model.compile(tf.train.AdamOptimizer(0.001), loss="sparse_categorical_crossentropy")
model.fit(train["featuresVec"],
          train["labelsVec"],
          epochs=1,
          verbose=1,
          callbacks=[],
          validation_data=(evale["featuresVec"], evale["labelsVec"]),
          validation_steps=evale["count"],
          steps_per_epoch=train["count"])

The feature arrays in train and evale are 2D:

train["featuresVec"]=
[[1 2 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0]
 [1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0]
 [1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0]
 [1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0]
 [2 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0]]

evale["featuresVec"]=
[[1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0]
 [1 1 1 1 1 1 2 1 1 1 1 1 0 0 0 0 0 0 0 0]
 [1 1 1 1 1 2 1 1 1 1 1 1 2 1 1 1 1 1 1 0]
 [1 1 1 1 1 2 1 1 1 1 1 2 1 1 1 1 1 1 0 0]
 [1 1 1 1 1 2 1 1 1 1 1 1 0 0 0 0 0 0 0 0]]

train["labelsVec"] = [1 0 0 0 2]
evale["labelsVec"] = [0 1 1 1 1]

Shapes:

train["featuresVec"] = [52692, 20]
evale["featuresVec"] = [28916, 20]
train["labelsVec"] = [52692]
evale["labelsVec"] = [28916]
1 Answer

Anna Krogager (accepted answer):

Probably your labels vector needs to be of shape (batch_size, 1) instead of just (batch_size,).

Note: Since you are using sparse_categorical_crossentropy as loss function instead of categorical_crossentropy, it is correct to not one-hot encode the labels.
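A minimal sketch of the suggested fix, assuming the labels are NumPy arrays (the variable names here are illustrative):

```python
import numpy as np

# Labels as in the question: shape (batch_size,)
labels = np.array([1, 0, 0, 0, 2])
print(labels.shape)  # (5,)

# Add a trailing axis so the target has shape (batch_size, 1),
# which is what the Dense layer's loss check expects here.
labels_2d = labels.reshape(-1, 1)   # equivalently: np.expand_dims(labels, -1)
print(labels_2d.shape)  # (5, 1)
```

The same reshape would be applied to both train["labelsVec"] and evale["labelsVec"] before calling model.fit.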