Difference between statsmodel OLS and scikit linear regression; different models give different r square


I am new to Python and trying to calculate a simple linear regression. My model has one dependent variable and one independent variable. Using linear_model.LinearRegression() from the sklearn package I got an R squared value of 0.16. Then I used import statsmodels.api as sm and mod = sm.OLS(Y_train, X_train) and got an R squared of 0.61. Below is the code, starting from getting the data from BigQuery.

**Code for linear regression**
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from sklearn import linear_model
    from sklearn.metrics import r2_score

    train_data_df = pd.read_gbq(query, project_id)
    train_data_df.head()

    # reshape each series into an (n, 1) column vector for sklearn
    X_train = train_data_df.revisit_next_day_rate.values[:, np.newaxis]
    Y_train = train_data_df.demand_1yr_per_new_member.values[:, np.newaxis]

    # scikit-learn version to get prediction R squared
    model_sci = linear_model.LinearRegression()
    model_sci.fit(X_train, Y_train)

    print('Intercept:', model_sci.intercept_)
    print('Coefficients:', model_sci.coef_)
    print('Residual sum of squares: %.2f'
          % np.mean((model_sci.predict(X_train) - Y_train) ** 2))
    print('Variance score: %.2f' % model_sci.score(X_train, Y_train))
    Y_train_predict = model_sci.predict(X_train)
    print('R Square:', r2_score(Y_train, Y_train_predict))


**Code for OLS**

    print(Y_train[:3])
    print(X_train[:3])
    mod = sm.OLS(Y_train, X_train)
    res = mod.fit()
    print(res.summary())

I am very new to this and am trying to understand which linear regression package I should use.


1 Answer

Answered by SAM244776

I found out the difference: it was the intercept. sm.OLS does not include one by default, so after adding the code below the answers matched.

    X = sm.add_constant(X)
    sm.OLS(y, X)
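
To make the fix concrete, here is a minimal, self-contained sketch on synthetic data (the BigQuery columns from the question are not available here, so the data and variable names are placeholders). scikit-learn's LinearRegression fits an intercept by default (fit_intercept=True), while sm.OLS only includes one if you add a constant column yourself; without that column, statsmodels also reports an uncentered R squared, which is why the two numbers disagreed.

    import numpy as np
    import statsmodels.api as sm
    from sklearn import linear_model

    # synthetic data with a non-zero intercept (placeholder for the BigQuery columns)
    rng = np.random.RandomState(0)
    X = rng.rand(100, 1)
    y = 2.0 + 3.0 * X[:, 0] + rng.normal(scale=0.5, size=100)

    # scikit-learn fits the intercept by default
    sk_model = linear_model.LinearRegression()
    sk_model.fit(X, y)
    print('sklearn R squared:', sk_model.score(X, y))

    # statsmodels needs the constant column added explicitly
    res = sm.OLS(y, sm.add_constant(X)).fit()
    print('statsmodels R squared:', res.rsquared)

    # without the constant there is no intercept and the (uncentered) R squared differs
    res_no_const = sm.OLS(y, X).fit()
    print('statsmodels R squared, no constant:', res_no_const.rsquared)

With the constant column in place, the two printed R squared values agree up to floating point, which is what the answer above describes.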