Initializing Hidden State for a GRU RNN Using a Feed-Forward Neural Network

I saw a paper where someone was able to initialize the hidden state of an RNN by using a feed-forward NN. I was trying to figure out how this could be done, but I keep getting error messages while developing the model. I have time series data with at least 100 values per run, for 2000 independent runs. I want the input to be used both to produce the initial hidden state and as the input to the RNN itself.
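
For concreteness, the data would be arranged roughly like this (a sketch; the array names, dtype, and the exact 2000 x 100 x 4 / 10 shapes are just my reading of the description above, not something from the paper):

import numpy as np

# 2000 independent runs, each a sequence of (at least) 100 timesteps,
# with 4 input features per step and 10 target values per step.
X = np.random.rand(2000, 100, 4).astype("float32")   # model inputs
Y = np.random.rand(2000, 100, 10).astype("float32")  # per-timestep targets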

Currently this is how I was trying to create the model:

from tensorflow.keras import layers, Model
from tensorflow.keras.layers import Input

units = 200
N_inputs = 4
N_outputs = 10

inputs = Input(shape=(None, N_inputs))
state_init = layers.Dense(units)(inputs)
GRU_layer = layers.GRU(units=units, return_sequences=True)(inputs, initial_state=state_init)
outputs = layers.Dense(units=N_outputs)(GRU_layer)
model = Model(inputs, outputs)

I am getting this error:

ValueError: An 'initial_state' was passed that is not compatible with 'cell.state_size'. Received 'state_spec'=ListWrapper([InputSpec(shape=(None, None, 200), ndim=3)]); however 'cell.state_size' is [200]
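
As I read the error, the Dense layer is applied across the time axis, so state_init has shape (None, None, 200), while the GRU expects an initial state of shape (batch, 200) to match cell.state_size. Below is a sketch of one way the shapes could be reconciled, assuming the feed-forward initializer only needs a fixed-size summary of each run; the GlobalAveragePooling1D summary and the tanh activation are guesses on my part, not taken from the paper:

from tensorflow.keras import layers, Model
from tensorflow.keras.layers import Input

units = 200
N_inputs = 4
N_outputs = 10

inputs = Input(shape=(None, N_inputs))

# Collapse the time axis so the feed-forward branch emits one vector per run.
summary = layers.GlobalAveragePooling1D()(inputs)

# Feed-forward initializer: output shape (batch, units) matches cell.state_size.
state_init = layers.Dense(units, activation="tanh")(summary)

# The GRU still sees the full sequence and starts from the learned state.
x = layers.GRU(units, return_sequences=True)(inputs, initial_state=state_init)
outputs = layers.Dense(N_outputs)(x)

model = Model(inputs, outputs)

Whether a simple summary over time like this matches what the paper actually feeds to the initializing network is not clear to me.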

Is this even possible, or do I have to write some custom code for this? Any help would be greatly appreciated.

The paper is here: https://ieeexplore.ieee.org/document/7966138
