I am trying to use Keras' TimeseriesGenerator class. As stated in the Keras documentation, my data and targets have equal length. I set the `length` argument to 200.
Using fit_generator and predict_generator, I fitted the model and predicted the outcome. The problem starts here: when I tried to compute the MAE between the predictions and the real data, scikit-learn's mean_absolute_error function raised an error about the inputs having different lengths.
The predicted array is shorter than y_test by the lookback amount (i.e. len(predicted_values) = 1000, len(y_test) = 1200).
Therefore, I cannot calculate the mean absolute error. Is there a way to change how Keras handles this case? I assume the algorithm simply skips the first 200 rows.
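For clarity, here is a minimal sketch of the mismatch using stand-in NumPy arrays (the sizes match my case; the variable names are made up). Since the generator's first sample consumes the first `length` rows as input, I believe the predictions line up with y_test starting at index `length`:

```python
import numpy as np

lookback = 200  # the `length` argument I pass to TimeseriesGenerator

# Stand-ins for my real arrays (sizes match the ones above)
predicted_values = np.random.rand(1000)  # shape of predict_generator output
y_test = np.random.rand(1200)            # full test targets

# The generator's first sample uses rows 0..lookback-1 as input,
# so the first target it can ever predict is y_test[lookback].
aligned_y_test = y_test[lookback:]
assert len(aligned_y_test) == len(predicted_values)

# With matching lengths, MAE can be computed directly
mae = np.mean(np.abs(aligned_y_test - predicted_values))
```

Is slicing y_test like this the correct alignment, or does Keras offer a built-in way to handle it?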
(The data is scaled with MinMaxScaler, so I have to inverse-transform it in order to calculate the real MAE value instead of the scaled version. That's why I am not using evaluate_generator for MAE.)
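To show what I mean by inverse-transforming before computing MAE, here is a small self-contained sketch (the arrays and the +0.01 "prediction error" are made up for illustration):

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Fit the scaler on stand-in training targets
scaler = MinMaxScaler()
y_train = np.arange(100, dtype=float).reshape(-1, 1)
scaler.fit(y_train)

# Scaled test targets and pretend predictions (offset by 0.01 in scaled space)
y_test_scaled = scaler.transform(np.array([[10.0], [50.0], [90.0]]))
pred_scaled = y_test_scaled + 0.01

# Inverse-transform back to the original units before computing MAE,
# otherwise the error is reported on the 0-1 scaled range
y_true = scaler.inverse_transform(y_test_scaled)
y_pred = scaler.inverse_transform(pred_scaled)
mae = np.mean(np.abs(y_true - y_pred))
```

This is why evaluate_generator's metric (which sees only scaled values) is not what I want.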