XGBoost Documentation — xgboost 3.1.0-dev
XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Hadoop, SGE, MPI) and can scale to problems with billions of examples.
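As a quick taste of the library before diving into the contents below, here is a minimal sketch of training a boosted-tree classifier with the scikit-learn-style Python API; the synthetic data and hyperparameter values are illustrative only, not recommendations:

```python
import numpy as np
import xgboost as xgb

# Illustrative synthetic data: 1000 samples, 10 features,
# with a label that depends on the first two features.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Fit a gradient boosted tree ensemble (GBDT).
model = xgb.XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
model.fit(X, y)

# Predict class labels for a few samples.
preds = model.predict(X[:5])
print(preds)
```

The Get Started and Installation Guide pages linked below cover installation and the native `DMatrix`/`xgb.train` interface in more depth.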
Contents
- Installation Guide
- Building From Source
- Get Started with XGBoost
- XGBoost Tutorials
- Introduction to Boosted Trees
- Introduction to Model IO
- Slicing Models
- Learning to Rank
- DART booster
- Monotonic Constraints
- Feature Interaction Constraints
- Survival Analysis with Accelerated Failure Time
- Categorical Data
- Multiple Outputs
- Random Forests(TM) in XGBoost
- Distributed XGBoost on Kubernetes
- Distributed XGBoost with XGBoost4J-Spark
- Distributed XGBoost with XGBoost4J-Spark-GPU
- Distributed XGBoost with Dask
- Distributed XGBoost with PySpark
- Distributed XGBoost with Ray
- Using XGBoost External Memory Version
- C API Tutorial
- Text Input Format of DMatrix
- Notes on Parameter Tuning
- Custom Objective and Evaluation Metric
- Advanced Usage of Custom Objectives
- Intercept
- Privacy Preserving Inference with Concrete ML
- Frequently Asked Questions
- GPU Support
- XGBoost Parameters
- Prediction
- Tree Methods
- Python Package
- R Package
- JVM Package
- Ruby Package
- Swift Package
- Julia Package
- C Package
- C++ Interface
- CLI Interface
- Contribute to XGBoost
- Release Notes