= XGBoost =
'''XGBoost''' is a software implementation of [[Statistics/GradientBoosting|gradient boosting]] for estimating [[Statistics/DecisionTrees|decision trees]].
== Usage ==
XGBoost is written in C++ but is generally used through the official bindings for either R or Python.
== Interpretation ==
Features of an XGBoost model are evaluated according to three attributes:

 * '''Gain''': the relative contribution of a feature to the overall model
 * '''Cover''': the proportion of observations related to a feature, i.e. the number of observations for which the feature is the deciding tree split, normalized across all features
 * '''Frequency''': the proportion of tree splits formed by a feature

All three attributes are relative and sum to 1 across features, so features can be compared directly within each attribute.