Is Gradient Boosting regression more accurate (lower MSE) than random forest?


I just fitted a Gradient Boosting model whose out-of-sample prediction is worse than that of a random forest: the MSE of the GBM is 10% higher. Below is my sample code. I am not sure whether there is anything wrong with it.

gbm1 <- gbm(as.formula(paste0(Y.idx, '~',
                              paste0(colnames(rf.tmp.train)[c(-1, -2)], collapse = '+'))),
            data = rf.tmp.train, distribution = "gaussian", n.trees = 3000,
            shrinkage = 0.001, interaction.depth = 1, bag.fraction = 0.5,
            train.fraction = 1, n.minobsinnode = 10, cv.folds = 10,
            keep.data = TRUE, verbose = FALSE, n.cores = 1)
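For reference, this is a sketch of how the two models could be compared out of sample. The held-out data frame `rf.tmp.test`, the formula object `fml`, and the random forest fit `rf1` are placeholders, not part of the original code:

```r
library(gbm)
library(randomForest)

# Rebuild the model formula from the training columns (same logic as above)
fml <- as.formula(paste0(Y.idx, '~',
                         paste0(colnames(rf.tmp.train)[c(-1, -2)], collapse = '+')))

rf1 <- randomForest(fml, data = rf.tmp.train, ntree = 500)

# Predict with the CV-selected number of trees rather than all 3000
best.iter <- gbm.perf(gbm1, method = "cv", plot.it = FALSE)
pred.gbm  <- predict(gbm1, newdata = rf.tmp.test, n.trees = best.iter)
pred.rf   <- predict(rf1,  newdata = rf.tmp.test)

# Out-of-sample MSE for each model
y.test  <- rf.tmp.test[[Y.idx]]
mse.gbm <- mean((y.test - pred.gbm)^2)
mse.rf  <- mean((y.test - pred.rf)^2)
c(gbm = mse.gbm, rf = mse.rf)
```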

There is 1 answer

yuanhangliu1 answered:

In my working experience, gbm usually performs better than random forest, and random forest usually performs better than other algorithms. In your case, you might want to tune the parameters for both gbm and random forest; with shrinkage = 0.001 and interaction.depth = 1, 3000 trees may simply be too few for the model to fit well. To start, I recommend the caret package, which carries out the tuning process automatically.
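As a rough sketch of that suggestion, both models can be tuned under the same cross-validation scheme with caret. The grid values below and the formula `fml` are illustrative assumptions, not recommendations for your data:

```r
library(caret)

ctrl <- trainControl(method = "cv", number = 5)

# Candidate gbm hyperparameters to search over
gbm.grid <- expand.grid(n.trees = c(1000, 3000, 5000),
                        interaction.depth = c(1, 3, 5),
                        shrinkage = c(0.001, 0.01, 0.1),
                        n.minobsinnode = 10)

gbm.tuned <- train(fml, data = rf.tmp.train, method = "gbm",
                   distribution = "gaussian", trControl = ctrl,
                   tuneGrid = gbm.grid, verbose = FALSE)

# For random forest, caret tunes mtry; tuneLength picks 5 candidate values
rf.tuned <- train(fml, data = rf.tmp.train, method = "rf",
                  trControl = ctrl, tuneLength = 5)

# Compare the best cross-validated RMSE of each model
min(gbm.tuned$results$RMSE)
min(rf.tuned$results$RMSE)
```

Comparing the cross-validated RMSE of the winning configuration on each side is a fairer test than comparing two hand-picked parameter sets.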

Cheers