awesome-cold-showers VS lleaves

Compare awesome-cold-showers vs lleaves and see what their differences are.

awesome-cold-showers

For when people get too hyped up about things (by hwayne)

lleaves

Compiler for LightGBM gradient-boosted trees, based on LLVM. Speeds up prediction by ≥10x. (by siboehm)
             awesome-cold-showers                       lleaves
Mentions     5                                          4
Stars        7,245                                      297
Growth       -                                          -
Activity     3.8                                        6.7
Last Commit  4 months ago                               about 1 month ago
Language     -                                          Python
License      GNU General Public License v3.0 or later   MIT License
Mentions - the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

awesome-cold-showers

Posts with mentions or reviews of awesome-cold-showers. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-06-18.

lleaves

Posts with mentions or reviews of lleaves. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-06-18.
  • LLeaves: A LLVM-based compiler for LightGBM decision trees
    1 project | news.ycombinator.com | 8 Jul 2023
  • Cold Showers
    4 projects | news.ycombinator.com | 18 Jun 2022
    I built this decision tree (LightGBM) compiler last summer: https://github.com/siboehm/lleaves

    It gets you ~10x speedups for batch predictions, more if your model is big. It's not complicated; it ended up being <1K lines of Python code. I've heard a couple of stories like yours, where people had multi-node Spark clusters running LightGBM, and it always amused me, because if you compiled the trees instead you could get rid of the whole cluster.

  • Tree compiler that speeds up LightGBM model inference by ~30x
    2 projects | /r/dataengineering | 22 Aug 2021
    In a near-future version I'll expose some of the compilation parameters; I was somewhat afraid that an API that's too complicated would deter people who just want a no-fuss drop-in replacement for LightGBM. But as long as I keep sane defaults and make the parameters optional it should be fine. Relevant parameters are definitely the block size (which needs to be tuned to the L1i cache size and the tree size) as well as the LLVM code model (a smaller address space increases single-batch prediction speed but doesn't work for large models). Compiling for a specific thread count is something I'm still looking into; it makes the API more complicated and so might not be worth it.
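
Taken together, the comments above describe the basic lleaves workflow: train a model with LightGBM, compile the trees once, then run batch predictions against the compiled model. The sketch below illustrates that workflow. The lleaves calls (Model, compile, predict) follow the project's documented API, while the training data, parameters, and the "model.txt" filename are illustrative assumptions, and the ~10x speedup is the author's claim rather than something this snippet measures.

    import numpy as np
    import lightgbm as lgb
    import lleaves

    # Train and save a small LightGBM model so there is something to compile.
    # (Illustrative data and parameters; any saved LightGBM model file works.)
    X = np.random.rand(1_000, 10)
    y = np.random.rand(1_000)
    booster = lgb.train({"objective": "regression", "verbosity": -1}, lgb.Dataset(X, y))
    booster.save_model("model.txt")

    # Compile the trees to native machine code via LLVM, then predict in batch.
    # Compilation is a one-time cost and can take a while for large models.
    llvm_model = lleaves.Model(model_file="model.txt")
    llvm_model.compile()

    preds_compiled = llvm_model.predict(X)
    preds_reference = booster.predict(X)
    # The compiled model is meant as a drop-in replacement, so the outputs
    # should match LightGBM's own predictions (up to floating-point noise).
    assert np.allclose(preds_compiled, preds_reference)

The compile-time knobs discussed in the second comment (cache block size, LLVM code model, thread count) may or may not be exposed as arguments to compile() depending on the lleaves version; treat the plain compile() call above as the lowest-common-denominator form and check the lleaves documentation for the available tuning options.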

What are some alternatives?

When comparing awesome-cold-showers and lleaves you can also consider the following projects:

p1xt-guides - Programming curricula

mljar-supervised - Python package for AutoML on Tabular Data with Feature Engineering, Hyper-Parameters Tuning, Explanations and Automatic Documentation

ngboost - Natural Gradient Boosting for Probabilistic Prediction

m2cgen - Transform ML models into native code (Java, C, Python, Go, JavaScript, Visual Basic, C#, R, PowerShell, PHP, Dart, Haskell, Ruby, F#, Rust) with zero dependencies

miceforest - Multiple Imputation with LightGBM in Python

catboost - A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for ranking, classification, regression and other machine learning tasks for Python, R, Java, C++. Supports computation on CPU and GPU.