Spearmint VS yggdrasil-decision-forests

Compare Spearmint vs yggdrasil-decision-forests and see how they differ.

Spearmint

Spearmint Bayesian optimization codebase (by HIPS)

yggdrasil-decision-forests

A library to train, evaluate, interpret, and productionize decision forest models such as Random Forest and Gradient Boosted Decision Trees. (by Google)
                 Spearmint                                    yggdrasil-decision-forests
Mentions         2                                            4
Stars            1,529                                        423
Growth           0.1%                                         4.0%
Activity         0.0                                          9.5
Latest commit    over 4 years ago                             6 days ago
Language         Python                                       C++
License          GNU General Public License v3.0 or later     Apache License 2.0
  • Mentions - the total number of mentions we have tracked plus the number of user-suggested alternatives.
  • Stars - the number of stars a project has on GitHub.
  • Growth - month-over-month growth in stars.
  • Activity - a relative measure of how actively a project is being developed; recent commits carry more weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

Spearmint

Posts with mentions or reviews of Spearmint. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-08-03.
  • Why do tree-based models still outperform deep learning on tabular data?
    5 projects | news.ycombinator.com | 3 Aug 2022
    It occurs to me that a system, trained on peer-reviewed applied-machine-learning literature and Kaggle winners, that generates candidates for structured feature-engineering specifications, based on plaintext descriptions of columns' real-world meaning, should be considered a requisite part of the "meta" here.

    Ah, and then you could iterate within the resulting feature-engineering-suggestion space as a hyper-parameter between experiments, which could be optimized with e.g. https://github.com/HIPS/Spearmint . The papers write themselves! (See the Spearmint experiment sketch after this list.)

  • [D] What kind of Hyperparameter Optimisation do you use?
    3 projects | /r/MachineLearning | 30 Aug 2021
    This was some time ago, but I had some promising results with Bayesian optimization using a Gaussian Process prior. The method was developed by the people who wrote Spearmint. That library doesn't support parallelization, but I implemented the same technique in Scala without too much difficulty. (See the GP-based optimization sketch after this list.)
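The first post's idea of treating a feature-engineering choice as a hyper-parameter maps onto Spearmint's experiment layout: a config.json describing the search variables plus a Python file exposing main(job_id, params). Below is a minimal sketch assuming the HIPS/Spearmint layout; the variable n_bins and the train_and_validate helper are hypothetical stand-ins, not part of Spearmint itself.

    # objective.py -- Spearmint imports this file and calls main() once per
    # suggested configuration; by default it minimizes the returned value.
    #
    # The experiment directory would also contain a config.json along these lines
    # (hedged sketch of the HIPS/Spearmint format):
    # {
    #   "language": "PYTHON",
    #   "main-file": "objective.py",
    #   "experiment-name": "feature-engineering-search",
    #   "variables": {
    #     "n_bins": {"type": "INT", "size": 1, "min": 2, "max": 64}
    #   }
    # }

    def train_and_validate(n_bins):
        # Stand-in for an expensive train-and-score run; replace with a real pipeline.
        return (n_bins - 17) ** 2 / 100.0

    def main(job_id, params):
        n_bins = params['n_bins'][0]        # parameter values arrive as arrays
        return train_and_validate(n_bins)   # Spearmint minimizes this score

Spearmint is then launched by pointing its driver script at the experiment directory; each job it schedules invokes main() with a new suggested configuration.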
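The second post's approach, Bayesian optimization under a Gaussian-process prior, fits in a short loop. This is an illustrative sketch using scikit-learn and an expected-improvement acquisition function, not Spearmint's actual implementation; the objective and search bounds are made up for demonstration.

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def objective(x):
        # Stand-in for an expensive model-training run.
        return np.sin(3 * x) + 0.1 * x ** 2

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(4, 1))            # small initial design
    y = objective(X).ravel()
    candidates = np.linspace(-3, 3, 500).reshape(-1, 1)

    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(20):
        gp.fit(X, y)                                # posterior over the objective
        mu, sigma = gp.predict(candidates, return_std=True)
        best = y.min()
        # Expected improvement for minimization.
        z = (best - mu) / np.maximum(sigma, 1e-9)
        ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
        x_next = candidates[np.argmax(ei)].reshape(1, -1)
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next).ravel())

    print("best x:", X[y.argmin()], "best value:", y.min())

The GP surrogate makes each evaluation of the real objective count, which is why the approach works well when a single training run is expensive.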

yggdrasil-decision-forests

Posts with mentions or reviews of yggdrasil-decision-forests. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-03-05.
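For a sense of the train/evaluate/interpret workflow the description above refers to, here is a minimal sketch using the ydf Python package; the CSV file names and the "label" column are illustrative assumptions.

    # Minimal sketch assuming the ydf Python package (pip install ydf)
    # and pandas DataFrames; dataset and column names are hypothetical.
    import pandas as pd
    import ydf

    train_df = pd.read_csv("train.csv")   # training data with a "label" column
    test_df = pd.read_csv("test.csv")

    learner = ydf.GradientBoostedTreesLearner(label="label")
    model = learner.train(train_df)

    print(model.evaluate(test_df))        # accuracy, AUC, and similar metrics
    print(model.variable_importances())   # a first look at interpretation
    predictions = model.predict(test_df)

The same models can also be trained from the C++ API or the command-line tools, and served without a Python dependency, which is what the "productionize" part of the description refers to.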

What are some alternatives?

When comparing Spearmint and yggdrasil-decision-forests you can also consider the following projects:

optuna - A hyperparameter optimization framework

LightGBM - A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.

srbench - A living benchmark framework for symbolic regression

tensorflow - An Open Source Machine Learning Framework for Everyone

axe-testcafe - The helper for using Axe in TestCafe tests

decision-tree-classifier - Decision Tree Classifier and Boosted Random Forest

youtube-react - A YouTube clone built with React, Redux, and Redux-Saga

flashlight - A C++ standalone library for machine learning [Moved to: https://github.com/flashlight/flashlight]

decision-forests - A collection of state-of-the-art algorithms for the training, serving and interpretation of Decision Forest models in Keras.

interpret - Fit interpretable models. Explain blackbox machine learning.

spaceopt - Hyperparameter optimization via gradient boosting regression