C++ gbrt Projects
xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow
Boosting is not a separate ML model but a technique that combines multiple weak learners into a single model that can generate highly accurate predictions. XGBoost is a widely used boosting library that supports distributed training, which shortens training time. According to research by Intel, XGBoost can be more effective than a neural-network-based approach on tabular data. In addition, XGBoost is faster to train and doesn't require as much data as neural networks do.
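As a rough illustration of the workflow described above, the sketch below trains a small XGBoost binary classifier on synthetic tabular data using the Python bindings. The dataset, random seeds, and parameter values are arbitrary choices for demonstration, not recommendations.

```python
# Minimal XGBoost sketch: boosting decision-tree weak learners on tabular data.
# The synthetic dataset and parameter values below are illustrative assumptions.
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split

# Synthetic tabular data: 1,000 rows, 10 numeric features, binary label.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 10))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# DMatrix is XGBoost's internal data structure for training/evaluation data.
dtrain = xgb.DMatrix(X_train, label=y_train)
dtest = xgb.DMatrix(X_test, label=y_test)

params = {
    "objective": "binary:logistic",  # binary classification
    "max_depth": 3,                  # depth of each weak learner (tree)
    "eta": 0.1,                      # learning rate applied to each boosting round
    "eval_metric": "logloss",
}

# Each boosting round fits a new tree to the errors of the ensemble built so far.
booster = xgb.train(params, dtrain, num_boost_round=100,
                    evals=[(dtest, "test")], verbose_eval=False)

preds = booster.predict(dtest)  # predicted probabilities
print(f"test accuracy: {np.mean((preds > 0.5) == y_test):.3f}")
```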
LightGBM
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.
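For comparison, here is a similar sketch of a classification task with LightGBM's native Python API. Again, the synthetic data and parameter values are assumptions made for illustration rather than tuned settings.

```python
# Minimal LightGBM sketch: gradient boosting via the Dataset/train API.
# Dataset and parameter values are illustrative assumptions.
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = (X[:, 0] - X[:, 2] > 0).astype(int)

# Native Dataset objects; the validation set references the training set's bins.
train_set = lgb.Dataset(X[:800], label=y[:800])
valid_set = lgb.Dataset(X[800:], label=y[800:], reference=train_set)

params = {
    "objective": "binary",   # binary classification
    "num_leaves": 31,        # leaf-wise tree growth, limited by leaf count
    "learning_rate": 0.1,
    "metric": "binary_logloss",
}

booster = lgb.train(params, train_set, num_boost_round=100, valid_sets=[valid_set])

probs = booster.predict(X[800:])  # predicted probabilities
print("validation accuracy:", np.mean((probs > 0.5) == y[800:]))
```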
C++ gbrt discussion
C++ gbrt related posts
- XGBoost: The Scalable and Distributed Gradient Boosting Library
- stackgbm VS xgboost - a user suggested alternative | 2 projects | 5 May 2024
- XGBoost 2.0
- XGBoost2.0
- Xgboost: Banding continuous variables vs keeping raw data
- XGBoost Save and Load Error
- For XGBoost (in Amazon SageMaker), one of the hyper parameters is num_round, for number of rounds to train. Does this mean cross validation?