diffeqpy
PySR
| | diffeqpy | PySR |
|---|---|---|
| Mentions | 4 | 7 |
| Stars | 494 | 1,882 |
| Growth | 3.8% | - |
| Activity | 7.7 | 9.6 |
| Latest commit | about 1 month ago | 7 days ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
diffeqpy
- How Julia ODE Solve Compile Time Was Reduced From 30 Seconds to 0.1
With Python you have to write packages in some other language anyway, so you might as well do that with Julia. One of the reasons for getting all of this precompilation going is to eventually ship precompiled system images with things like https://github.com/SciML/diffeqpy, effectively using Julia as a replacement for where C/Fortran is traditionally used. If I can make that pipeline smooth, then I think Julia as a Python package building source will be a good option for a lot of folks. Right now it's very manual, but it could easily improve with a bit of tooling.
- ‘Machine Scientists’ Distill the Laws of Physics from Raw Data
- Is it possible to create a Python package with Julia and publish it on PyPi?
- Julia vs R/Python
A 10-100x speed increase was not an exaggeration for me. With Julia I was able to run things quickly on my own machine that I had previously been running on a compute cluster. I agree that Numba could be just as fast as Julia. I also just saw that you can call that DE library I like so much from Python using this package: https://github.com/SciML/diffeqpy
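For a sense of what that looks like from the Python side, here is a minimal sketch of calling the DifferentialEquations.jl solvers through diffeqpy, following its README-style usage (setup details may differ by version):

```python
from diffeqpy import de  # requires `pip install diffeqpy` and a Julia install;
                         # run `import diffeqpy; diffeqpy.install()` once to
                         # set up the Julia dependencies

# Right-hand side of the ODE du/dt = -u
def f(u, p, t):
    return -u

u0 = 0.5            # initial condition
tspan = (0.0, 1.0)  # integrate from t=0 to t=1

prob = de.ODEProblem(f, u0, tspan)
sol = de.solve(prob)   # sol.t and sol.u hold the solution trajectory
print(sol.u[-1])       # u at t=1, close to 0.5 * exp(-1)
```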
PySR
- Potential of the Julia programming language for high energy physics computing
> Yes, julia can be called from other languages rather easily
This seems false to me. StaticCompiler.jl [1] lists among its limitations that "GC-tracked allocations and global variables do not work with compile_executable or compile_shlib. This has some interesting consequences, including that all functions within the function you want to compile must either be inlined or return only native types (otherwise Julia would have to allocate a place to put the results, which will fail)." PackageCompiler.jl [2] has the same limitations, if I'm not mistaken. So then you have to fall back to distributing the Julia "binary" together with a full Julia runtime, which is pretty heavy. Some packages do this; for example, PySR [3].
There is word going around that an even better static compiler is in the making, but as long as that one is not publicly available, I'd say that Julia cannot easily be called from other languages.
[1]: https://github.com/tshort/StaticCompiler.jl
[2]: https://github.com/JuliaLang/PackageCompiler.jl
[3]: https://github.com/MilesCranmer/PySR
- SymbolicRegression.jl – High-Performance Symbolic Regression in Julia and Python
- [D] Is there any research into using neural networks to discover classical algorithms?
I first learned about it through PySR (https://github.com/MilesCranmer/PySR); they have an arXiv paper with some use cases as well.
- Symbolic Regression is NP-hard
I encourage everyone to read this paper. It's well written and easy to follow. For the uninitiated, SR is the problem of finding a mathematical (symbolic) expression that most accurately describes a dataset of input-output examples (regression). The most naive implementation of SR is basically a breadth-first search starting from the simplest program tree: x -> sin(x) -> cos(x) ... sin(cos(tan(x))), until timeout. However, we can prune equivalent expressions, and in general the problem is embarrassingly parallel, which gives some hope that we can solve it pretty fast (check out PySR [1] for a modern implementation). I find SR fascinating because it can be used for model distillation: learn a DNN approximation and "distill" it into a symbolic program.
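To make the naive search concrete, here is a hypothetical toy sketch of that breadth-first enumeration; the operator set, depth limit, and scoring are illustrative assumptions, and real systems like PySR prune equivalent expressions and search a much richer grammar in parallel:

```python
import numpy as np

# Toy unary-operator grammar; real SR grammars also include binary ops and constants.
UNARY = {"sin": np.sin, "cos": np.cos, "tan": np.tan}

def enumerate_expressions(max_depth):
    """Yield (name, callable) pairs in breadth-first (shallowest-first) order."""
    frontier = [("x", lambda x: x)]
    yield from frontier
    for _ in range(max_depth):
        # Wrap every frontier expression in every unary operator.
        frontier = [
            (f"{name}({iname})", lambda x, f=fn, g=ifn: f(g(x)))
            for name, fn in UNARY.items()
            for iname, ifn in frontier
        ]
        yield from frontier

def naive_sr(xs, ys, max_depth=3):
    """Return the enumerated expression with the lowest mean squared error."""
    best_name, best_mse = None, np.inf
    for name, expr in enumerate_expressions(max_depth):
        mse = float(np.mean((expr(xs) - ys) ** 2))
        if mse < best_mse:
            best_name, best_mse = name, mse
    return best_name, best_mse

xs = np.linspace(0.0, 1.0, 100)
print(naive_sr(xs, np.sin(np.cos(xs))))  # recovers ('sin(cos(x))', 0.0)
```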
Note that the paper talks about the decision version of the SR problem, i.e., whether we can discover the globally optimal expression. I think this proof is important for the SR community, but not particularly surprising (to me). However, I'm excited by the potential future work following this paper! A couple of discussion points:
* First, SR is technically a bottom-up program synthesis problem where the DSL (math) has an equivalence operator. Can we use this proof to impose stronger guarantees on the "hyperparameters" for bottom-up synthesis? Conversely, does the theoretical foundation of the inductive synthesis literature [2] help us define tighter bounds?
* Second, while SR itself is NP-hard, can we say anything about approximate algorithms (e.g., distilling a deep neural network to find a solution [3])? Specifically, what does this proof tell us about the PAC learnability of SR?
Anyhow, pretty cool seeing such work getting more attention!
[1] https://github.com/MilesCranmer/PySR
[2] https://susmitjha.github.io/papers/togis17.pdf
[3] https://astroautomata.com/paper/symbolic-neural-nets/
- ‘Machine Scientists’ Distill the Laws of Physics from Raw Data
I found it curious that one of the implementations of symbolic regression (the "machine scientist" referenced in the article) is a Python wrapper on Julia: https://github.com/MilesCranmer/PySR
I don't think I've seen a Python wrapper on Julia code before.
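For context on what such a wrapper looks like from the Python side, here is a minimal sketch along the lines of PySR's README example; the dataset and hyperparameters are illustrative:

```python
import numpy as np
from pysr import PySRRegressor  # the Julia backend is installed on first use

# Illustrative toy dataset: y = 2.54 * cos(x_3) + x_0^2 - 0.5
rng = np.random.default_rng(0)
X = 2.0 * rng.standard_normal((100, 5))
y = 2.54 * np.cos(X[:, 3]) + X[:, 0] ** 2 - 0.5

model = PySRRegressor(
    niterations=40,                         # search budget
    binary_operators=["+", "-", "*", "/"],  # grammar for the search
    unary_operators=["cos"],
)
model.fit(X, y)       # SymbolicRegression.jl does the actual search in Julia
print(model.sympy())  # best discovered expression as a SymPy object
```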
- Is it possible to create a Python package with Julia and publish it on PyPi?
- [D] Inferring general physical laws from observations in 300 lines of code
This is really neat! Since you're interested in this subject, you may also appreciate PySR and the corresponding paper, which uses graph neural networks to perform symbolic regression.
What are some alternatives?
DifferentialEquations.jl - Multi-language suite for high-performance solvers of differential equations and scientific machine learning (SciML) components. Ordinary differential equations (ODEs), stochastic differential equations (SDEs), delay differential equations (DDEs), differential-algebraic equations (DAEs), and more in Julia.
GeneticAlgorithmPython - Source code of PyGAD, a Python 3 library for building genetic algorithms and training machine learning models (Keras & PyTorch).
ModelingToolkit.jl - An acausal modeling framework for automatically parallelized scientific machine learning (SciML) in Julia. A computer algebra system for integrated symbolics for physics-informed machine learning and automated transformations of differential equations
TorchGA - Train PyTorch Models using the Genetic Algorithm with PyGAD
DiffEqBase.jl - The lightweight Base library for shared types and functionality for defining differential equation and scientific machine learning (SciML) problems
mljar-supervised - Python package for AutoML on Tabular Data with Feature Engineering, Hyper-Parameters Tuning, Explanations and Automatic Documentation
DiffEqSensitivity.jl - A component of the DiffEq ecosystem for enabling sensitivity analysis for scientific machine learning (SciML). Optimize-then-discretize, discretize-then-optimize, and more for ODEs, SDEs, DDEs, DAEs, etc. [Moved to: https://github.com/SciML/SciMLSensitivity.jl]
nni - An open source AutoML toolkit to automate the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyper-parameter tuning.
csvzip - A standalone CLI tool to reduce CSV file size by converting categorical columns into lists of unique integers.
python-bigsimr
ModelingToolkitStandardLibrary.jl - A standard library of components to model the world and beyond