Support DART - new regularization: dropping trees during learning · Issue #809 · dmlc/xgboost

There is a nice article about dropout from neural nets, applied to gradient boosting: http://arxiv.org/pdf/1505.01866.pdf. It is about dropping out some of the already-built trees during the learning process and rescaling the weights of the dropped trees and the newly learned tree.
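
For reference, here is a minimal sketch of the idea from the paper, using squared-error loss and sklearn regression trees as weak learners. The function names (`dart_fit`, `dart_predict`), the `drop_rate` parameter, and the tree-level normalization factors 1/(k+1) and k/(k+1) follow my reading of the paper; this is an illustration of the technique, not xgboost code or a proposed API.

```python
# Minimal DART-style boosting sketch (squared-error loss), for illustration only.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def dart_fit(X, y, n_rounds=50, drop_rate=0.1, max_depth=3, rng=None):
    rng = np.random.default_rng(rng)
    trees, weights = [], []  # ensemble members and their per-tree scaling factors
    for _ in range(n_rounds):
        # Drop each existing tree independently with probability drop_rate.
        drop = [i for i in range(len(trees)) if rng.random() < drop_rate]
        keep = [i for i in range(len(trees)) if i not in drop]
        # Prediction of the ensemble WITHOUT the dropped trees.
        pred = np.zeros(len(y))
        for i in keep:
            pred += weights[i] * trees[i].predict(X)
        # Fit the new tree to the residuals of the reduced ensemble.
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, y - pred)
        k = len(drop)
        # Normalization from the paper: scale the new tree by 1/(k+1) and each
        # dropped tree by k/(k+1), so the total contribution stays comparable.
        trees.append(tree)
        weights.append(1.0 / (k + 1))
        for i in drop:
            weights[i] *= k / (k + 1)
    return trees, weights

def dart_predict(trees, weights, X):
    return sum(w * t.predict(X) for t, w in zip(trees, weights))

# Example usage on synthetic data:
X = np.random.rand(200, 3)
y = 2.0 * X[:, 0] + np.sin(X[:, 1])
trees, w = dart_fit(X, y, n_rounds=30, drop_rate=0.1)
print(dart_predict(trees, w, X[:5]))
```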