How can I do a sequence-to-sequence model (RNN / LSTM) with Keras with fixed length data?


What I'm trying to do seems so simple, but I can't find any examples online. First, I'm not working in language, so all of the embedding stuff adds needless complexity to my task.

I have inputs in the form of (1, 1000) vectors. They are time-series data, so I'll have 10 of them in sequence. Which, if I understand tensors correctly, gives me something of shape (10, 1, 1000), right?

I want to pass this through an RNN/LSTM, and the output should also be of the same shape (10, 1, 1000). Namely, 10 vectors of 1000 dimensions each.

There are 2 answers

Answer by Daniel Möller (accepted):

The first thing you need is to know "what" you consider a sequence there.

What are the steps? Are they 10 time steps? Or are they 1000 time steps?


I'll initially assume you have 1000 time steps.

Then the next question is: what are the 10 things? Are they 10 different independent examples of the same nature? Or are they 10 parallel things of different nature (features) from the same example?


These questions are the most important part, you need to know if you have:

  • (10, 1000, 1): 10 individual examples, 1000 timesteps per example, measuring a single variable/feature
  • (1, 1000, 10): 1 long sequence of 1000 timesteps, measuring 10 independent vars/features
    • For the two cases above, you will need to divide your data into sliding windows because there are too few examples (there are many examples of how to do this online). If you don't, your model will overfit wildly.
    • Sliding windows turn the data into (more_examples, shorter_length, same_features)
  • (1000, 10, 1): 1000 different sequences of 10 time steps measuring a single var/feature
    • Good data, you're good to go
  • (1, 10, 1000): 1 single sequence of 10 time steps, measuring 1000 independent vars/features
    • I'm not sure this data can be trained; you don't have enough examples even if you try sliding windows
  • (10, 1, 1000): 10 individual examples, 1 timestep, measuring 1000 vars/features
  • (1000, 1, 10): 1000 different sequences of 1 time step, measuring 10 vars/features
    • The two cases above are not sequences for working with LSTM.
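The sliding-window step mentioned above can be sketched in plain NumPy. This is a minimal illustration, assuming the (1, 1000, 10) case with a hypothetical window length of 100 and a stride of 10; both values are assumptions, not from the question:

```python
import numpy as np

# One long sequence: 1000 timesteps, 10 features, shape (1, 1000, 10)
long_sequence = np.random.rand(1, 1000, 10)

window_length = 100  # assumed window size
stride = 10          # assumed step between window starts

# Stack overlapping slices of the single long sequence into many examples
windows = np.stack([
    long_sequence[0, start:start + window_length]
    for start in range(0, long_sequence.shape[1] - window_length + 1, stride)
])

# (more_examples, shorter_length, same_features)
print(windows.shape)  # (91, 100, 10)
```

Each window then counts as an independent training example, which is what gives the model enough samples to avoid the overfitting mentioned above.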

Once you've decided this, it's time to work:

Stack the input data correctly according to your case and start a model.

I'll consider you have data with the shape (samples, timesteps, features); then your model can go like:

inputs = Input((timesteps, features))  # or (None, features) for variable length
outputs = LSTM(any_units, return_sequences=True)(inputs)
# ... you can add more LSTM layers with different units, all with return_sequences=True
# ... you can add Conv1D layers with padding='same', any number of filters
outputs = Dense(desired_features, activation=some_useful_activation)(outputs)

Notice that your output is necessarily (samples, timesteps, desired_features). If you want a different final shape, reshape it outside the model.
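The skeleton above can be made concrete to check that claim about the output shape. A minimal sketch, where timesteps=10, features=8, desired_features=8, and any_units=32 are illustrative values, not taken from the question:

```python
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model

# Illustrative sizes; only the (samples, timesteps, features) layout matters
timesteps, features, desired_features = 10, 8, 8
any_units = 32

inputs = Input((timesteps, features))
x = LSTM(any_units, return_sequences=True)(inputs)
outputs = Dense(desired_features, activation="linear")(x)
model = Model(inputs, outputs)

# The sequence structure is preserved: (samples, timesteps, desired_features)
print(model.output_shape)  # (None, 10, 8)
```

With return_sequences=True on every LSTM layer, the timesteps dimension survives all the way through, and the final Dense layer only changes the last (features) dimension.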

Answer by Thibault Bacqueyrisses:

If you only want an LSTM model that takes an input of shape (nb_seq, 1, 1000) (with nb_seq being your number of sequences, 10 in your case) and outputs the same shape, here is a basic model that you can adapt:

input_x = Input(shape=(1, 1000))
x = LSTM(64, return_sequences=True)(input_x)
x = LSTM(64, return_sequences=True)(x)
x = Dense(1000)(x)

Model(inputs=input_x, outputs=x)

The LSTM layer with return_sequences=True will return a tensor of shape (nb_seq, 1, 64) (with 64 being the number of neurons in your LSTM layer). To recover the original shape, you can either pass this tensor through a Dense layer that gives you a shape of (nb_seq, 1, 1000), or directly use 1000 neurons in your last LSTM layer (I don't recommend this because it will generate many parameters).

You can modify this as you wish.
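To see that this model really maps (nb_seq, 1, 1000) back to the same shape, here is the model above made self-contained and run on a random batch; the batch data is a made-up placeholder:

```python
import numpy as np
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model

# Same architecture as the answer: the 10 sequences ride along as the batch dimension
input_x = Input(shape=(1, 1000))
x = LSTM(64, return_sequences=True)(input_x)
x = LSTM(64, return_sequences=True)(x)
x = Dense(1000)(x)
model = Model(inputs=input_x, outputs=x)

# Placeholder data: 10 sequences of shape (1, 1000)
batch = np.random.rand(10, 1, 1000).astype("float32")
out = model(batch)  # calling the model directly returns a tensor
print(out.shape)  # (10, 1, 1000)
```

Note that with shape=(1, 1000) each "sequence" has only one timestep, so the LSTM cannot relate the 10 vectors to each other; they are processed as independent batch items. The edit below addresses exactly that.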

EDIT after clarifications

As the Keras LSTM takes only 3D input, you can work around this by passing the input through a TimeDistributed Flatten layer at the beginning, like this:

input_x = Input(shape=(10, 1, 1000))
x = TimeDistributed(Flatten())(input_x)
x = LSTM(64, return_sequences=True)(x)
x = LSTM(64, return_sequences=True)(x)
x = Dense(1000)(x)
x = Reshape(target_shape=(10, 1, 1000))(x)

Model(inputs=input_x, outputs=x)

That gives you this summary:

[model summary image: output shape (None, 10, 1, 1000)]