C++ GBM

Open-source C++ projects categorized as GBM (gradient boosting machine)

Top 4 C++ GBM Projects

  1. xgboost

    Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow

    Project mention: Predicting Tomorrow's Tremors: A Machine Learning Approach to Earthquake Nowcasting in California | dev.to | 2025-07-03

    XGBoost Documentation: https://xgboost.readthedocs.io/
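
    A minimal sketch of what using XGBoost from C++ can look like, via its C API (xgboost/c_api.h). This is illustrative only and not taken from the article above: the toy data, parameter choices and error-handling macro are assumptions, and it presumes libxgboost is already built and linked.

    // Train a tiny binary classifier through the XGBoost C API (illustrative sketch).
    #include <xgboost/c_api.h>
    #include <cstdio>

    // Abort on any non-zero return code and print XGBoost's last error message.
    #define SAFE(call) \
        if ((call) != 0) { std::fprintf(stderr, "%s\n", XGBGetLastError()); return 1; }

    int main() {
        const float X[8] = {1, 2, 3, 4, 5, 6, 7, 8};  // 4 rows x 2 feature columns
        const float y[4] = {1, 0, 1, 0};              // binary labels

        DMatrixHandle dtrain;
        SAFE(XGDMatrixCreateFromMat(X, 4, 2, /*missing=*/-1.0f, &dtrain));
        SAFE(XGDMatrixSetFloatInfo(dtrain, "label", y, 4));

        BoosterHandle booster;
        SAFE(XGBoosterCreate(&dtrain, 1, &booster));
        SAFE(XGBoosterSetParam(booster, "objective", "binary:logistic"));
        SAFE(XGBoosterSetParam(booster, "max_depth", "2"));

        for (int iter = 0; iter < 10; ++iter) {       // 10 boosting rounds
            SAFE(XGBoosterUpdateOneIter(booster, iter, dtrain));
        }

        SAFE(XGBoosterFree(booster));
        SAFE(XGDMatrixFree(dtrain));
        return 0;
    }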

  2. LightGBM

    A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.

    Project mention: The Grug Brained Developer | news.ycombinator.com | 2025-06-17

    That's a tough bouncing ball to follow as it seems the resolution was to upgrade to a newer version of a dependency, but if we look at that dependency, the fix seems to be found somewhere in this https://github.com/microsoft/LightGBM/compare/v2.2.1...v2.2....

    But, although there is admittedly a lot in there and I may have missed it, I don't see any updates to tests to denote that a problem was discovered. Which, while not knowing much about the project, makes me think that there really isn't any meaningful testing going on, which is interesting for what it is, but not in the vein of the discussion here.
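
    To tie the description above back to code: a minimal sketch of training through LightGBM's C API (LightGBM/c_api.h). It is unrelated to the issue discussed in the comment; the toy data and parameter string are assumptions, and it presumes lib_lightgbm is already built and linked.

    // Train a small binary classifier through the LightGBM C API (illustrative sketch).
    #include <LightGBM/c_api.h>
    #include <cstdio>

    // Abort on any non-zero return code and print LightGBM's last error message.
    #define SAFE(call) \
        if ((call) != 0) { std::fprintf(stderr, "%s\n", LGBM_GetLastError()); return 1; }

    int main() {
        const float X[8] = {1, 2, 3, 4, 5, 6, 7, 8};  // 4 rows x 2 columns, row-major
        const float y[4] = {1, 0, 1, 0};              // binary labels

        DatasetHandle dtrain;
        SAFE(LGBM_DatasetCreateFromMat(X, C_API_DTYPE_FLOAT32, 4, 2,
                                       /*is_row_major=*/1, "", nullptr, &dtrain));
        SAFE(LGBM_DatasetSetField(dtrain, "label", y, 4, C_API_DTYPE_FLOAT32));

        BoosterHandle booster;
        SAFE(LGBM_BoosterCreate(dtrain,
                                "objective=binary num_leaves=7 min_data_in_leaf=1",
                                &booster));

        int is_finished = 0;
        for (int iter = 0; iter < 10 && !is_finished; ++iter) {  // up to 10 boosting rounds
            SAFE(LGBM_BoosterUpdateOneIter(booster, &is_finished));
        }

        SAFE(LGBM_BoosterFree(booster));
        SAFE(LGBM_DatasetFree(dtrain));
        return 0;
    }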

  3. catboost

    A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for ranking, classification, regression and other machine learning tasks for Python, R, Java, C++. Supports computation on CPU and GPU.

    Project mention: 🚀 Why Your ML Service Needs Rust + CatBoost: A Setup Guide That Actually Works | dev.to | 2025-01-19

    [package]
    name = "MLApp"
    version = "0.1.0"
    edition = "2021"

    [dependencies]
    catboost = { git = "https://github.com/catboost/catboost", rev = "0bfdc35" }
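
    The article pairs Rust with CatBoost, but the library's evaluation API can also be called directly from C++. A rough sketch follows, assuming the libcatboostmodel evaluation library with its C header (shipped in the repo as model_calcer_wrapper.h) and a model already trained and saved as model.cbm; the file name, feature values and exact signatures are assumptions and may differ between CatBoost versions.

    // Score one row with a trained CatBoost model from C++ (illustrative sketch).
    #include <model_calcer_wrapper.h>  // C evaluation API from the catboost repo
    #include <cstdio>

    int main() {
        ModelCalcerHandle* model = ModelCalcerCreate();
        if (!LoadFullModelFromFile(model, "model.cbm")) {     // example path
            std::fprintf(stderr, "load failed: %s\n", GetErrorString());
            return 1;
        }

        const float floatFeatures[3] = {0.5f, 1.2f, -3.0f};   // numeric features, one row
        const char* catFeatures[1] = {"category_a"};          // categorical features, one row
        double prediction = 0.0;

        if (!CalcModelPredictionSingle(model, floatFeatures, 3,
                                       catFeatures, 1, &prediction, 1)) {
            std::fprintf(stderr, "predict failed: %s\n", GetErrorString());
            return 1;
        }
        std::printf("raw prediction: %f\n", prediction);

        ModelCalcerDelete(model);
        return 0;
    }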

  4. fairgbm

    Train Gradient Boosting models that are both high-performance *and* Fair!

NOTE: The open source projects on this list are ordered by number of GitHub stars. The number of mentions indicates repo mentions in the last 12 months or since we started tracking (Dec 2020).

C++ GBM related posts

  • XGBoost: The Scalable and Distributed Gradient Boosting Library

    1 project | news.ycombinator.com | 14 Aug 2024
  • XGBoost 2.0

    1 project | news.ycombinator.com | 13 Oct 2023
  • XGBoost2.0

    1 project | news.ycombinator.com | 9 Oct 2023
  • SIRUS.jl: Interpretable Machine Learning via Rule Extraction

    2 projects | /r/Julia | 29 Jun 2023
  • [D] RAM speeds for tabular machine learning algorithms

    1 project | /r/MachineLearning | 9 Jun 2023
  • Xgboost: Banding continuous variables vs keeping raw data

    1 project | /r/datascience | 1 Jun 2023
  • [P] LightGBM but lighter in another language?

    1 project | /r/MachineLearning | 4 May 2023

Index

What are some of the best open-source GBM projects in C++? This list will help you:

# Project Stars
1 xgboost 27,093
2 LightGBM 17,372
3 catboost 8,464
4 fairgbm 105

Did you know that C++ is the 7th most popular programming language based on number of references?