Using Keras for Simple Linear regression: Model not predicting correctly


I am trying to understand Keras by building a simple linear regression model, but the model does not predict correctly. Please help me see why.

import numpy as np
import tensorflow as tf
from tensorflow import keras

# Data
x = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
y = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]

# Define the model
model = keras.Sequential()
model.add(keras.layers.Dense(units=1, input_shape=(1,)))
model.compile(optimizer=tf.keras.optimizers.RMSprop(learning_rate=0.005), loss='mse')

model.fit(x, y, epochs=100, verbose=0)

x1 = [11, 12, 13]
predy = model.predict(x1)

print(predy)

1 Answer

Onyambu On

The issue here is that your loss function has not been minimized, i.e. the model has not converged. You cannot get good predictions from a model whose loss is still far from its minimum. Note that ML thrives on big data; since you have very few points, you need more iterations to minimize the loss. Try implementing gradient descent yourself for linear regression and you will see that it takes many iterations to converge. Give the same time to the Keras code above. Below is the code:

import tensorflow as tf

# Data
x = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
y = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]

# Define the model
model = tf.keras.Sequential(
  [tf.keras.layers.Dense(units=1, input_shape=(1,))]
)

model.compile(optimizer=tf.keras.optimizers.RMSprop(learning_rate=0.01), loss='mse')

model.fit(x, y, epochs=2000, verbose=0)

x1 = [11, 12, 13]
predy = model.predict(x1)

print(predy)

[[110.059975]
 [120.06498 ]
 [130.06998 ]]

You can always determine whether the model has converged by looking at its metrics. If the loss is not close to 0, it has not converged.

model.get_metrics_result()['loss']
<tf.Tensor: shape=(), dtype=float32, numpy=0.0012610962>

Run this check on your original model (100 epochs) and you will see that the loss is still a very large number.
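To see why the small dataset needs so many iterations, here is a minimal hand-rolled sketch of the gradient descent the answer suggests trying, using plain NumPy and MSE loss on the same data (the variable names `w`, `b`, and `lr` are my own, not from the original code). Even for this tiny problem, it takes on the order of a couple of thousand steps for the slope and intercept to settle near their true values:

```python
import numpy as np

# Same data as the question: y = 10 * x
x = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
y = np.array([10, 20, 30, 40, 50, 60, 70, 80, 90, 100], dtype=float)

w, b = 0.0, 0.0   # slope and intercept, initialized at zero
lr = 0.01         # learning rate

for step in range(2000):
    err = w * x + b - y
    # Gradients of the mean squared error with respect to w and b
    grad_w = 2 * np.mean(err * x)
    grad_b = 2 * np.mean(err)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)        # w ends up close to 10, b close to 0
print(w * 11 + b)  # prediction for x = 11, close to 110
```

If you cut the loop to 100 steps (mirroring the original 100 epochs), `w` and `b` are still visibly off and the predictions drift accordingly, which is exactly the symptom in the question.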