I was trying to use an RNN (specifically, an LSTM) for sequence prediction. However, I ran into an issue with variable sequence lengths. For example,

sent_1 = "I am flying to Dubai"
sent_2 = "I was traveling from US to Dubai"

I am trying to predict the next word given the current one with a simple RNN, based on this Benchmark for building a PTB LSTM model.

However, the num_steps parameter (the number of steps the network is unrolled over previous hidden states) has to remain the same for every TensorFlow batch. Basically, batching the sentences is not possible, since the sentences vary in length.

    # Unroll the RNN for a fixed number of steps (TensorFlow 0.x API):
    # split the [batch_size, num_steps, size] input along the time axis
    # into num_steps slices and feed them to the statically unrolled RNN.
    inputs = [tf.squeeze(input_, [1])
              for input_ in tf.split(1, num_steps, inputs)]
    outputs, states = rnn.rnn(cell, inputs, initial_state=self._initial_state)

Here, num_steps would need to change for every sentence in my case. I have tried several hacks, but nothing seems to work. One of them was plain padding, sketched below.
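A rough sketch of that padding hack (the pad id of 0 and the word ids are placeholders, not from the real vocabulary):

    import numpy as np

    def pad_batch(sentences, num_steps, pad_id=0):
        """Pad (or truncate) each sentence, a list of word ids, to num_steps."""
        batch = np.full((len(sentences), num_steps), pad_id, dtype=np.int32)
        for i, ids in enumerate(sentences):
            length = min(len(ids), num_steps)
            batch[i, :length] = ids[:length]
        return batch

    sent_1_ids = [4, 12, 7, 9, 21]         # "I am flying to Dubai"
    sent_2_ids = [4, 30, 15, 6, 3, 9, 21]  # "I was traveling from US to Dubai"

    # Both sentences now fit a fixed [batch_size, num_steps] tensor:
    batch = pad_batch([sent_1_ids, sent_2_ids], num_steps=8)
    print(batch)  # shape (2, 8), zeros after each sentence ends

The problem with this is that the model still trains on the meaningless pad positions, which is presumably part of why it did not help.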
