Keras word embedding in four gram model


I am following the Coursera neural networks class and am trying to do the assignments in Python with Keras instead of Octave.

I want to predict the fourth word given the previous three. My input documents contain 250 unique words in total.

The model should have an embedding layer that maps each word into a 50-dimensional vector space, a hidden layer of 200 neurons with a sigmoid activation, and an output layer of 250 units that scores, through a softmax activation, the probability of each word in my vocabulary being the fourth word.

I am having troubles with dimensions. Here is my code:

    from keras.models import Sequential
    from keras.layers import Dense, Activation, Embedding

    model = Sequential([Embedding(250, 50),
                        Dense(200, activation='sigmoid'),
                        Dense(250, activation='softmax')])

    model.compile(optimizer='rmsprop',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])

Yet I never get as far as compiling the model, because I hit the following error first:

    Exception: Input 0 is incompatible with layer dense_1: expected ndim=2, found ndim=3

Any hint would be much appreciated. Thanks in advance.

1 Answer

Answered by Fernando H.:

From https://blog.keras.io/using-pre-trained-word-embeddings-in-a-keras-model.html

"All that the Embedding layer does is to map the integer inputs to the vectors found at the corresponding index in the embedding matrix, i.e. the sequence [1, 2] would be converted to [embeddings[1], embeddings[2]]. This means that the output of the Embedding layer will be a 3D tensor of shape (samples, sequence_length, embedding_dim)."

Your Embedding layer outputs a 3D tensor, while the Dense layer expects a 2D input.
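To see the mismatch concretely, here is a minimal check (dummy values, assuming `tensorflow.keras`): feeding a batch of 4 samples of 3 word indices into the asker's `Embedding(250, 50)` produces a 3D tensor, which is what `Dense` rejects.

```python
import numpy as np
from tensorflow.keras.layers import Embedding

emb = Embedding(250, 50)  # 250-word vocabulary, 50-d vectors

# 4 samples, each a sequence of 3 integer word indices
out = emb(np.random.randint(0, 250, size=(4, 3)))
print(tuple(out.shape))  # (4, 3, 50) -- 3D, but Dense expects (samples, features)
```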

You can follow the linked tutorial; with some modifications it will fit your problem.
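One possible modification, sketched below with `tensorflow.keras` (a sketch, not the tutorial's exact code): fix the input length at 3 words and insert a `Flatten` layer, so the Embedding's 3D output `(samples, 3, 50)` is collapsed into the 2D tensor `(samples, 150)` that `Dense` expects.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Embedding, Flatten, Dense

model = Sequential([
    Input(shape=(3,)),                 # three word indices per sample
    Embedding(250, 50),                # (samples, 3) -> (samples, 3, 50)
    Flatten(),                         # (samples, 3, 50) -> (samples, 150)
    Dense(200, activation='sigmoid'),
    Dense(250, activation='softmax'),  # distribution over the 250-word vocab
])
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Dummy data just to show the shapes line up: three-word contexts as
# integer indices, one-hot targets for the fourth word.
X = np.random.randint(0, 250, size=(8, 3))
y = np.eye(250)[np.random.randint(0, 250, size=8)]
model.fit(X, y, epochs=1, verbose=0)
preds = model.predict(X, verbose=0)
print(preds.shape)  # (8, 250)
```

Older Keras versions expressed the fixed sequence length as `Embedding(250, 50, input_length=3)` instead of a separate `Input` layer; either way, the key step is flattening before the first `Dense`.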