A selection of features of the major gradient boosting implementations XGBoost, LightGBM, and CatBoost, for Python and R.
| Aspect | XGBoost | LightGBM | CatBoost |
|---|---|---|---|
| Speed: CPU | 💕 | | |
| Speed: GPU | ❤️ | 💕 | |
| Standard losses | ✔️ | ✔️ | ✔️ |
| Special loss: Poisson/Gamma/Tweedie | ✔️ | ✔️ | ✔️ |
| Special loss: Survival | ✔️ | ✔️ | |
| Special loss: Robust | ✔️ | ✔️ | ✔️ |
| Special loss: Quantile | ✔️ | ✔️ | ✔️ |
| Tree size regularization | ✔️ | ✔️ | ✔️ |
| Categorical input handling | Python | ❤️ | 💕 |
| Constraints: monotonic | ✔️ | ✔️ | ✔️ |
| Constraints: interaction | ✔️ | ✔️ | |
| Case weights | ✔️ | ✔️ | ✔️ |
| Missing values | ✔️ | ✔️ | ✔️ |
| Interpretation: Importance | ✔️ | ✔️ | ✔️ |
| Interpretation: SHAP | ✔️ | ✔️ | ✔️ |
| Cross-validation | ✔️ | ✔️ | Python |
| Special mode: Random Forest | ✔️ | ✔️ | |
| Special mode: Linear booster | ✔️ | ✔️ | |
| Installation easy | ✔️ | ✔️ | ✔️ |
| Initial public release | 2014 | 2016 | 2017 |
This compilation, as of February 10, 2025, is neither complete nor guaranteed to be correct.
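All three libraries implement the same core algorithm that the table compares: gradient boosting, i.e. repeatedly fitting small trees to the residuals (negative gradients) of the current ensemble. The sketch below is a deliberately minimal, stdlib-only illustration of that loop, using depth-1 trees (stumps) and squared error; it is not the API of any of the three libraries, which add regularization, histogram-based splitting, categorical handling, GPU support, and much more.

```python
# Minimal gradient boosting sketch: stumps + squared error, stdlib only.
# Illustrates the shared core loop of XGBoost/LightGBM/CatBoost; all names
# here (fit_stump, boost, predict) are hypothetical, not library APIs.

def fit_stump(x, residuals):
    """Find the split on x minimizing squared error of the residuals."""
    best = None
    for threshold in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= threshold]
        right = [r for xi, r in zip(x, residuals) if xi > threshold]
        if not left or not right:
            continue  # split must leave both sides non-empty
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, threshold, lmean, rmean)
    return best[1:]  # (threshold, left_value, right_value)

def boost(x, y, n_rounds=20, learning_rate=0.3):
    """Fit an ensemble of stumps, each on the current residuals."""
    base = sum(y) / len(y)            # start from the mean prediction
    pred = [base] * len(y)
    stumps = []
    for _ in range(n_rounds):
        # For squared error, the negative gradient is just the residual.
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        t, lv, rv = fit_stump(x, residuals)
        stumps.append((t, lv, rv))
        pred = [p + learning_rate * (lv if xi <= t else rv)
                for xi, p in zip(x, pred)]
    return base, learning_rate, stumps

def predict(model, xi):
    base, lr, stumps = model
    return base + sum(lr * (lv if xi <= t else rv) for t, lv, rv in stumps)
```

On a toy step function (`y` jumping from 0 to 1 at `x = 3`), twenty rounds drive the predictions close to the targets, which is the behavior all three libraries scale up to millions of rows and deep trees.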