# Features of Major Gradient Boosting Implementations

A selection of features of the major gradient boosting implementations XGBoost, LightGBM, and CatBoost for Python and R.

| Aspect                               | XGBoost | LightGBM | CatBoost |
|--------------------------------------|---------|----------|----------|
| Speed: CPU                           |         | 💕       |          |
| Speed: GPU                           | ❤️      |          | 💕       |
| Standard losses                      | ✔️      | ✔️       | ✔️       |
| Special loss: Poisson/Gamma/Tweedie  | ✔️      | ✔️       | ✔️       |
| Special loss: Survival               | ✔️      |          | ✔️       |
| Special loss: Robust                 | ✔️      | ✔️       | ✔️       |
| Special loss: Quantile               | ✔️      | ✔️       | ✔️       |
| Tree size regularization             | ✔️      | ✔️       | ✔️       |
| Categorical input handling           | Python  | ❤️       | 💕       |
| Constraints: monotonic               | ✔️      | ✔️       | ✔️       |
| Constraints: interaction             | ✔️      | ✔️       |          |
| Case weights                         | ✔️      | ✔️       | ✔️       |
| Missing values                       | ✔️      | ✔️       | ✔️       |
| Interpretation: Importance           | ✔️      | ✔️       | ✔️       |
| Interpretation: SHAP                 | ✔️      | ✔️       | ✔️       |
| Cross-validation                     | ✔️      | ✔️       | Python   |
| Special mode: Random Forest          | ✔️      | ✔️       |          |
| Special mode: Linear booster         | ✔️      | ✔️       |          |
| Installation easy                    | ✔️      | ✔️       | ✔️       |
| Initial public release               | 2014    | 2016     | 2017     |

This compilation, as of Feb 10, 2025, is neither complete nor guaranteed to be correct.
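As a rough, non-authoritative sketch of how a few of the compared features map onto the three Python packages, the example below fits the same Poisson model with a monotonic increasing constraint on the first feature and then extracts SHAP values from each fitted model. The toy data, feature layout, and hyperparameters are made up for illustration.

```python
# Sketch only: same Poisson loss + monotonic constraint in XGBoost, LightGBM, and
# CatBoost, followed by SHAP value extraction. Toy data and settings are illustrative.
import numpy as np
import xgboost as xgb
import lightgbm as lgb
from catboost import CatBoostRegressor, Pool

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = rng.poisson(lam=np.exp(0.5 * X[:, 0]))

# XGBoost: Poisson objective, response increasing in feature 0
xgb_model = xgb.XGBRegressor(
    objective="count:poisson", monotone_constraints=(1, 0, 0), n_estimators=100
)
xgb_model.fit(X, y)

# LightGBM: same loss and constraint
lgb_model = lgb.LGBMRegressor(
    objective="poisson", monotone_constraints=[1, 0, 0], n_estimators=100
)
lgb_model.fit(X, y)

# CatBoost: same loss and constraint
cat_model = CatBoostRegressor(
    loss_function="Poisson", monotone_constraints=[1, 0, 0], iterations=100, verbose=False
)
cat_model.fit(X, y)

# SHAP values (the last column holds the expected value / bias term in each case)
shap_xgb = xgb_model.get_booster().predict(xgb.DMatrix(X), pred_contribs=True)
shap_lgb = lgb_model.predict(X, pred_contrib=True)
shap_cat = cat_model.get_feature_importance(Pool(X, y), type="ShapValues")
```

In all three packages, the loss and the monotonic constraint are ordinary constructor parameters, so rows of the table above often translate almost one-to-one into keyword arguments.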
