Gradient Boosting

LGB, the winning Gradient Boosting model

Last time, we tried Kaggle's TalkingData Click Fraud Detection challenge, handling a dataset of roughly 200 million records with limited resources. Although we could perform the classification with a Random Forest model, we still wanted a better score. Looking at the Click Fraud Detection challenge's leaderboard, I find that most of the high-scoring […]
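As a rough idea of what training LightGBM on this kind of click-fraud data looks like, here is a minimal sketch. It is not the post's exact setup: the file name, feature subset, and parameters are illustrative assumptions, although `is_attributed` is the competition's actual target column.

```python
# Minimal LightGBM binary classifier sketch for click-fraud style data.
# File name, feature choice, and parameters are assumptions for illustration.
import lightgbm as lgb
import pandas as pd
from sklearn.model_selection import train_test_split

df = pd.read_csv("train_sample.csv")           # assumed file name
features = ["app", "device", "os", "channel"]  # assumed feature subset
target = "is_attributed"                       # TalkingData's target column

X_train, X_val, y_train, y_val = train_test_split(
    df[features], df[target], test_size=0.2, random_state=42
)

train_set = lgb.Dataset(X_train, label=y_train)
val_set = lgb.Dataset(X_val, label=y_val, reference=train_set)

params = {
    "objective": "binary",
    "metric": "auc",
    "learning_rate": 0.1,
    "num_leaves": 31,
}

# Train with early stopping on the validation AUC.
model = lgb.train(
    params,
    train_set,
    num_boost_round=200,
    valid_sets=[val_set],
    callbacks=[lgb.early_stopping(stopping_rounds=20)],
)
print("Best validation AUC:", model.best_score["valid_0"]["auc"])
```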
