XGBoost
XGBoost[2] is an open-source software library that provides a gradient boosting framework for C++, Java, Python,[3] R,[4] Julia,[5] Perl,[6] and Scala. It works on Linux, Windows,[7] and macOS.[8] From the project description, it aims to provide a "Scalable, Portable and Distributed Gradient Boosting (GBM, GBRT, GBDT) Library". It runs on a single machine as well as on the distributed processing frameworks Apache Hadoop, Apache Spark, and Apache Flink. It gained much popularity and attention in the mid-2010s as the algorithm of choice for many winning teams of machine learning competitions.[9]
Developer(s) | The XGBoost Contributors |
---|---|
Initial release | March 27, 2014 |
Stable release | 1.1.1[1] / July 7, 2019 |
Repository | |
Written in | C++, Python, Java, R |
Operating system | Linux, macOS, Windows |
Type | Machine learning |
License | Apache License 2.0 |
Website | xgboost |
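As a brief illustration of the Python package cited above,[3] the following minimal sketch trains a gradient-boosted regression model through the library's native DMatrix/train interface; the synthetic data and the particular parameter values are arbitrary choices made for this example.

```python
import numpy as np
import xgboost as xgb

# Toy regression data (synthetic, for illustration only)
rng = np.random.default_rng(0)
X = rng.random((200, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.standard_normal(200)

# Wrap the data in XGBoost's DMatrix container
dtrain = xgb.DMatrix(X, label=y)

# Boosting parameters: squared-error objective, shallow trees, small learning rate
params = {"objective": "reg:squarederror", "max_depth": 3, "eta": 0.1}

# Fit 50 boosting rounds and predict on the training data
booster = xgb.train(params, dtrain, num_boost_round=50)
preds = booster.predict(xgb.DMatrix(X))
```

The DMatrix container is the library's internal data structure for training data, designed for memory efficiency and sparse input; the resulting Booster object can then be used for prediction on further DMatrix objects.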
History
XGBoost started as a research project by Tianqi Chen[10] as part of the Distributed (Deep) Machine Learning Community (DMLC) group. It began as a terminal application that could be configured using a libsvm configuration file. It became well known in ML competition circles after its use in the winning solution of the Higgs Machine Learning Challenge. Soon after, Python and R packages were built, and XGBoost now has package implementations for Java, Scala, Julia, Perl, and other languages. This brought the library to more developers and contributed to its popularity in the Kaggle community, where it has been used in a large number of competitions.[9]
It was soon integrated with a number of other packages, making it easier to use in their respective communities. It has now been integrated with scikit-learn for Python users and with the caret package for R users. It can also be integrated into data-flow frameworks such as Apache Spark, Apache Hadoop, and Apache Flink using the abstracted Rabit[11] and XGBoost4J[12] libraries. XGBoost is also available on OpenCL for FPGAs.[13] An efficient, scalable implementation of XGBoost has been published by Tianqi Chen and Carlos Guestrin.[14]
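As a sketch of the scikit-learn integration mentioned above, the following example uses the XGBClassifier wrapper, which implements scikit-learn's estimator interface; the choice of dataset and hyperparameter values here is illustrative only.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Example dataset (any scikit-learn-style (X, y) pair would work)
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The wrapper exposes the familiar fit/predict/score API
clf = XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

Because the wrapper behaves like any other scikit-learn estimator, it can be combined with tools such as pipelines and cross-validated grid search without XGBoost-specific glue code.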
Features
Salient features of XGBoost that distinguish it from other gradient boosting algorithms (sketched more formally below) include:[15][16][17]
- Clever penalization of trees, via an explicit regularization term on tree complexity
- A proportional shrinking of leaf nodes, in which new leaf weights are scaled down by a learning rate
- Newton boosting, which fits each tree to a second-order approximation of the loss rather than to the gradient alone
- An extra randomization parameter, which reduces the correlation between individual trees
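The first three features can be made concrete with the regularized objective from Chen and Guestrin's paper,[14] reproduced here as a brief sketch. The notation follows that paper: T is the number of leaves of a tree, w its vector of leaf weights, and I_j the set of training instances routed to leaf j.

```latex
% Regularized learning objective: training loss plus a complexity penalty for every tree f_k
\mathcal{L}(\phi) = \sum_{i} l\left(\hat{y}_i, y_i\right) + \sum_{k} \Omega(f_k),
\qquad
\Omega(f) = \gamma T + \tfrac{1}{2}\lambda \lVert w \rVert^{2}

% Newton boosting: at step t the loss is expanded to second order, using
%   g_i = \partial_{\hat{y}^{(t-1)}} l(\hat{y}_i^{(t-1)}, y_i)  and
%   h_i = \partial^{2}_{\hat{y}^{(t-1)}} l(\hat{y}_i^{(t-1)}, y_i),
% which gives the optimal weight of leaf j as
w_j^{*} = -\frac{\sum_{i \in I_j} g_i}{\sum_{i \in I_j} h_i + \lambda}
```

The γ and λ terms implement the penalization of trees, the fitted leaf weights are further scaled by a shrinkage factor (the learning rate η), and the extra randomization typically refers to column (feature) subsampling when growing each tree.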
Awards
- John Chambers Award (2016)[18]
- High Energy Physics meets Machine Learning award (HEP meets ML) (2016)[19]
References
- "Release 1.1.1 Patch Release · dmlc/xgboost". GitHub. Retrieved 2020-08-08.
- "GitHub project webpage".
- "Python Package Index PYPI: xgboost". Retrieved 2016-08-01.
- "CRAN package xgboost". Retrieved 2016-08-01.
- "Julia package listing xgboost". Retrieved 2016-08-01.
- "CPAN module AI::XGBoost". Retrieved 2020-02-09.
- "Installing XGBoost for Anaconda in Windows". Retrieved 2016-08-01.
- "Installing XGBoost on Mac OSX". Retrieved 2016-08-01.
- "XGBoost - ML winning solutions (incomplete list)". Retrieved 2016-08-01.
- "Story and Lessons behind the evolution of XGBoost". Retrieved 2016-08-01.
- "Rabit - Reliable Allreduce and Broadcast Interface". Retrieved 2016-08-01.
- "XGBoost4J". Retrieved 2016-08-01.
- "XGBoost on FPGAs". Retrieved 2019-08-01.
- Chen, Tianqi; Guestrin, Carlos (2016). "XGBoost: A Scalable Tree Boosting System". In Krishnapuram, Balaji; Shah, Mohak; Smola, Alexander J.; Aggarwal, Charu C.; Shen, Dou; Rastogi, Rajeev (eds.). Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, August 13-17, 2016. ACM. pp. 785–794. arXiv:1603.02754. doi:10.1145/2939672.2939785.
- Gandhi, Rohith (2019-05-24). "Gradient Boosting and XGBoost". Medium. Retrieved 2020-01-04.
- "Boosting algorithm: XGBoost". Towards Data Science. 2017-05-14. Retrieved 2020-01-04.
- "Tree Boosting With XGBoost – Why Does XGBoost Win "Every" Machine Learning Competition?". Synced. 2017-10-22. Retrieved 2020-01-04.
- "John Chambers Award Previous Winners". Retrieved 2016-08-01.
- "HEP meets ML Award". Retrieved 2016-08-01.