Why does step() return unexpected results from backward elimination of the full model using lmerTest?


I am confused about why the results from running step(model) in lmerTest look abnormal.

library(lmerTest)

m0 <- lmer(seed ~ connection * age + (1|unit), data = test)

step(m0)

Note: both "connection" and "age" have been converted to factors with as.factor().


 Random effects:
      Chi.sq Chi.DF elim.num p.value
 unit   0.25      1        1  0.6194

 Fixed effects:
 Analysis of Variance Table

 Response: y
                Df  Sum Sq  Mean Sq F value  Pr(>F)  
 connection      1 0.01746 0.017457  1.5214 0.22142  
 age             1 0.07664 0.076643  6.6794 0.01178 *
 connection:age  1 0.04397 0.043967  3.8317 0.05417 .
 Residuals      72 0.82617 0.011475                  
 ---
 Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

 Least squares means:
      Estimate Standard Error DF t-value Lower CI Upper CI p-value

 Final model:

 Call:
 lm(formula = fo, data = mm, contrasts = l.lmerTest.private.contrast)

 Coefficients:
      (Intercept)       connectionD              ageB  connectionD:ageB  
         -0.84868          -0.07852           0.01281           0.09634 

Why does the output not show the final model in the usual lmerTest format, with least squares means and elimination of non-significant fixed effects?

1 Answer

Accepted answer by alku:

The point here is that the random effect was eliminated as non-significant according to the likelihood ratio test. The fixed-effects model was then refit as an "lm" object, the anova method for "lm" objects was applied, and no elimination of non-significant fixed effects was performed. You are right that the output differs from that for "lmer" objects and that there are no (differences of) least squares means. If you want the latter, you may try the lsmeans package. For backward elimination of non-significant effects from the final model, you may use the stats::step function.
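
A minimal sketch of that follow-up, assuming the data frame test and the variables seed, connection, age from the question; the exact calls below are illustrative and not part of the original answer:

library(lsmeans)   # for least squares means of the fixed-effects model

# Refit the fixed-effects-only model directly with lm(), since the random
# intercept for unit was judged non-significant by the LR test.
m1 <- lm(seed ~ connection * age, data = test)

# Least squares means for the factor combinations -- the table that was
# left empty in the step() output shown in the question.
lsmeans(m1, ~ connection:age)

# Backward elimination of non-significant fixed effects with stats::step().
m1_reduced <- stats::step(m1, direction = "backward")
summary(m1_reduced)

Note that stats::step() selects terms by AIC rather than by p-values, so it may keep or drop terms differently from the p-value-based elimination that lmerTest's step() performs on "lmer" objects.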