bagging/boosting using Vowpal Wabbit


How do I use bagging or boosting in Vowpal Wabbit with an SVM (hinge loss)?

My current results are 90% recall and 10% precision.

vw -d train.dat -c --compressed --passes 10 --oaa 3 -f train.model --loss_function hinge

I would like to use bagging/boosting to increase precision.


1 Answer

Answer by Martin Popel (accepted):

For boosting, use --boosting N (added recently, so use VW built from GitHub). For bagging, use --bootstrap M. See also the question Gradient boosting on Vowpal Wabbit.
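For example, starting from the command in the question (the value 10 for --bootstrap and --boosting is illustrative, not tuned, and I have not verified that these reductions stack with --oaa in every VW version):

vw -d train.dat -c --compressed --passes 10 --oaa 3 --bootstrap 10 -f train.model --loss_function hinge

vw -d train.dat -c --compressed --passes 10 --oaa 3 --boosting 10 -f train.model --loss_function hinge

The first command trains 10 bootstrapped models and combines their predictions (bagging); the second combines 10 weak learners with online boosting.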

I don't see how a single recall and precision can be defined for classification into 3 classes. Let's assume for now that you have a standard binary classification task (two classes: positive and negative), that you want to optimize the F1-score (the harmonic mean of precision and recall), and that you have precision=10% and recall=90%. So only 10% of the positively predicted examples are truly positive. (This can be caused by imbalanced data, or by a different proportion of positive examples in the test data compared to the training data.) In that case, I recommend increasing the importance weight (see [Importance] at the VW wiki) of the negative examples (or decreasing the importance of the positive examples).
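In VW's input format the importance weight goes right after the label, so up-weighting negatives needs no code change, only reweighted training data. A minimal sketch (the feature names and the weight 5.0 are made-up illustrative values; tune the weight on held-out data):

-1 5.0 | feature_a:1.2 feature_b:0.7
1 1.0 | feature_a:0.3 feature_c:2.1

Here every negative example counts five times as much as a positive one, which pushes the learner toward higher precision at the cost of some recall.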