mljar-supervised VS PySR

Compare mljar-supervised vs PySR and see how they differ.

                 mljar-supervised      PySR
Mentions         51                    7
Stars            2,927                 1,850
Growth           1.2%                  -
Activity         8.5                   9.6
Latest commit    8 days ago            8 days ago
Language         Python                Python
License          MIT License           Apache License 2.0
Mentions - the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
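
The exact weighting behind the activity score isn't published here, so the following is a rough illustration only: a minimal Python sketch of a recency-weighted score, where the exponential decay and the 30-day half-life are assumptions rather than the site's actual formula.

    from datetime import datetime, timezone

    def activity_score(commit_dates, half_life_days=30.0):
        # Toy recency weighting over timezone-aware commit datetimes:
        # each commit contributes 0.5 ** (age_in_days / half_life_days),
        # so recent commits count more than older ones (illustrative only).
        now = datetime.now(timezone.utc)
        return sum(
            0.5 ** ((now - d).total_seconds() / 86400.0 / half_life_days)
            for d in commit_dates
        )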

mljar-supervised

Posts with mentions or reviews of mljar-supervised. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-08-24.

PySR

Posts with mentions or reviews of PySR. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-04.
  • Potential of the Julia programming language for high energy physics computing
    10 projects | news.ycombinator.com | 4 Dec 2023
    > Yes, julia can be called from other languages rather easily

    This seems false to me. StaticCompiler.jl [1] lists among its limitations that "GC-tracked allocations and global variables do not work with compile_executable or compile_shlib. This has some interesting consequences, including that all functions within the function you want to compile must either be inlined or return only native types (otherwise Julia would have to allocate a place to put the results, which will fail)." PackageCompiler.jl [2] has the same limitations, if I'm not mistaken. So you have to fall back to distributing the Julia "binary" with a full Julia runtime, which is pretty heavy. Some packages do this; PySR [3], for example (see the usage sketch after this list).

    Word is going around, though, that an even better static compiler is in the making, but as long as it is not publicly available, I'd say that Julia cannot easily be called from other languages.

    [1]: https://github.com/tshort/StaticCompiler.jl

    [2]: https://github.com/JuliaLang/PackageCompiler.jl

    [3]: https://github.com/MilesCranmer/PySR

  • SymbolicRegression.jl – High-Performance Symbolic Regression in Julia and Python
    2 projects | news.ycombinator.com | 15 Jul 2023
  • [D] Is there any research into using neural networks to discover classical algorithms?
    2 projects | /r/MachineLearning | 1 Jan 2023
    I first learned about it with PySR https://github.com/MilesCranmer/PySR, they have an arxiv paper with some use cases as well.
  • Symbolic Regression is NP-hard
    1 project | news.ycombinator.com | 13 Nov 2022
    I encourage everyone to read this paper. It's well written and easy to follow. For the uninitiated, SR is the problem of finding a mathematical (symbolic) expression that most accurately describes a dataset of input-output examples (regression). The most naive implementation of SR is basically a breadth-first search starting from the simplest program tree: x -> sin(x) -> cos(x) ... sin(cos(tan(x))) until timeout. However, we can prune out equivalent expressions and, in general, the problem is embarrassingly parallel, which offers some hope that we can solve it pretty fast (check out PySR[1] for a modern implementation, and see the enumeration sketch after this list). I find SR fascinating because it can be used for model distillation: learn a DNN approximation and "distill" it to a symbolic program.

    Note that the paper talks about the decision version of the SR problem, i.e., can we discover the globally optimal expression? I think this proof is important for the SR community but not particularly surprising (to me). However, I'm excited by the potential future work following this paper! A couple of discussion points:

    * First, SR is technically a bottom-up program synthesis problem where the DSL (math) has an equivalence operator. Can we use this proof to impose stronger guarantees on the "hyperparameters" for bottom-up synthesis? Conversely, does the theoretical foundation of the inductive synthesis literature [2] help us define tighter bounds?

    * Second, while SR itself is NP-hard, can we say anything about approximate algorithms (e.g., distilling a deep neural network to find a solution [3])? Specifically, what does the proof tell us about the PAC learnability of SR?

    Anyhow, pretty cool seeing such work getting more attention!

    [1] https://github.com/MilesCranmer/PySR

    [2] https://susmitjha.github.io/papers/togis17.pdf

    [3] https://astroautomata.com/paper/symbolic-neural-nets/

  • ‘Machine Scientists’ Distill the Laws of Physics from Raw Data
    8 projects | news.ycombinator.com | 10 May 2022
    I found it curious that one of the implementations of symbolic regression (the "machine scientist" referenced in the article) is a Python wrapper on Julia: https://github.com/MilesCranmer/PySR

    I don't think I've seen a Python wrapper on Julia code before.

  • Is it possible to create a Python package with Julia and publish it on PyPi?
    6 projects | /r/Julia | 23 Apr 2022
  • [D] Inferring general physical laws from observations in 300 lines of code
    1 project | /r/MachineLearning | 2 Aug 2021
    This is really neat! Since you're interested in this subject, you may also appreciate PySR and the corresponding paper, which uses graph neural networks to perform symbolic regression.
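
To make the "Python wrapper on Julia" pattern discussed in several of the posts above concrete, here is a minimal PySR usage sketch based on the project's documented scikit-learn-style API; the dataset and operator choices are illustrative, and fitting launches the bundled Julia backend under the hood.

    import numpy as np
    from pysr import PySRRegressor

    # Toy dataset whose ground truth is 2.5 * cos(x3) + x0^2 - 0.5.
    X = np.random.randn(100, 5)
    y = 2.5 * np.cos(X[:, 3]) + X[:, 0] ** 2 - 0.5

    model = PySRRegressor(
        niterations=40,                    # search budget
        binary_operators=["+", "-", "*"],  # allowed binary operators
        unary_operators=["cos"],           # allowed unary operators
    )
    model.fit(X, y)       # the search itself runs in Julia (SymbolicRegression.jl)
    print(model.sympy())  # best discovered expression as a SymPy object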
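
The naive breadth-first enumeration described in the NP-hardness post can also be made concrete. Below is a minimal sketch that grows nested unary expressions of a single variable x and keeps the candidate with the lowest squared error; the operator set, depth limit, and scoring are simplifying assumptions, and a real system would add binary operators, constants, and equivalence-based pruning.

    import math
    from collections import deque

    # Candidate unary operators (deliberately tiny for this sketch).
    OPS = {"sin": math.sin, "cos": math.cos, "tan": math.tan}

    def bfs_symbolic_regression(xs, ys, max_depth=3):
        # Breadth-first search over x -> sin(x) -> cos(x) -> ...,
        # returning the expression string with the lowest squared error.
        best_name, best_err = "x", float("inf")
        queue = deque([("x", lambda x: x, 0)])
        while queue:
            name, fn, depth = queue.popleft()
            err = sum((fn(x) - y) ** 2 for x, y in zip(xs, ys))
            if err < best_err:
                best_name, best_err = name, err
            if depth < max_depth:
                for op_name, op in OPS.items():
                    # Default arguments freeze op and fn for this closure.
                    queue.append((op_name + "(" + name + ")",
                                  lambda x, op=op, fn=fn: op(fn(x)),
                                  depth + 1))
        return best_name, best_err

    xs = [0.1 * i for i in range(20)]
    ys = [math.sin(math.cos(x)) for x in xs]
    print(bfs_symbolic_regression(xs, ys))  # recovers sin(cos(x)) at depth 2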

What are some alternatives?

When comparing mljar-supervised and PySR you can also consider the following projects:

optuna - A hyperparameter optimization framework

GeneticAlgorithmPython - Source code of PyGAD, a Python 3 library for building genetic algorithms and training machine learning algorithms (Keras & PyTorch).

autokeras - AutoML library for deep learning

TorchGA - Train PyTorch Models using the Genetic Algorithm with PyGAD

LightGBM - A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.

nni - An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.

AutoViz - Automatically visualize any dataset of any size with a single line of code. Created by Ram Seshadri. Collaborators welcome; permission granted upon request.

diffeqpy - Solving differential equations in Python using DifferentialEquations.jl and the SciML Scientific Machine Learning organization

mljar-examples - Examples how MLJAR can be used

python-bigsimr

Auto_ViML - Automatically build multiple ML models with a single line of code. Created by Ram Seshadri. Collaborators welcome; permission granted upon request.

ModelingToolkitStandardLibrary.jl - A standard library of components to model the world and beyond