| | PySR | DaemonMode.jl |
|---|---|---|
| Mentions | 7 | 22 |
| Stars | 1,911 | 269 |
| Growth | - | - |
| Activity | 9.6 | 4.7 |
| Last commit | 4 days ago | 5 months ago |
| Language | Python | Julia |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
PySR
-
Potential of the Julia programming language for high energy physics computing
> Yes, julia can be called from other languages rather easily
This seems false to me. StaticCompiler.jl [1] puts in their limitations that "GC-tracked allocations and global variables do not work with compile_executable or compile_shlib. This has some interesting consequences, including that all functions within the function you want to compile must either be inlined or return only native types (otherwise Julia would have to allocate a place to put the results, which will fail)." PackageCompiler.jl [2] has the same limitations if I'm not mistaken. So then you have to fall back to distributing the Julia "binary" with a full Julia runtime, which is pretty heavy. There are some packages which do this. For example, PySR [3] does this.
There is some word going around though that there is an even better static compiler in the making, but as long as that one is not publicly available I'd say that Julia cannot easily be called from other languages.
[1]: https://github.com/tshort/StaticCompiler.jl
[2]: https://github.com/JuliaLang/PackageCompiler.jl
[3]: https://github.com/MilesCranmer/PySR
- SymbolicRegression.jl – High-Performance Symbolic Regression in Julia and Python
-
[D] Is there any research into using neural networks to discover classical algorithms?
I first learned about it with PySR https://github.com/MilesCranmer/PySR; they have an arXiv paper with some use cases as well.
-
Symbolic Regression is NP-hard
I encourage everyone to read this paper. It's well written and easy to follow. To the uninitiated, SR is the problem of finding a mathematical (symbolic) expression that most accurately describes a dataset of input-output examples (regression). The most naive implementation of SR is basically a breadth-first search starting from the simplest program tree: x -> sin(x) -> cos(x) ... sin(cos(tan(x))) until timeout. However, we can prune out equivalent expressions and, in general, the problem is embarrassingly parallel, which offers some hope that we can solve it pretty fast (check out PySR[1] for a modern implementation). I find SR fascinating because it can be used for model distillation: learn a DNN approximation and "distill" it into a symbolic program.
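The naive breadth-first enumeration with equivalence pruning can be sketched in a few lines. This is a toy illustration, not PySR's actual algorithm; the function names, probe points, and unary-only operator set are my own assumptions:

```python
import math

# Toy breadth-first symbolic-regression search: grow expression trees outward
# from `x`, pruning candidates that are numerically indistinguishable on a
# fixed set of probe points (a cheap stand-in for expression equivalence).
UNARY = {"sin": math.sin, "cos": math.cos, "tan": math.tan}
PROBES = [0.3, 0.7, 1.1, 1.9]  # sample inputs used to fingerprint expressions

def signature(f):
    """Fingerprint an expression by its (rounded) values on the probe points."""
    return tuple(round(f(p), 9) for p in PROBES)

def breadth_first_sr(target, max_depth=3):
    """Return the shallowest expression matching `target` on PROBES, or None."""
    goal = tuple(round(target(p), 9) for p in PROBES)
    level = [("x", lambda x: x)]
    seen = set()
    for _ in range(max_depth + 1):
        for name, f in level:
            if signature(f) == goal:
                return name
        nxt = []
        for op, g in UNARY.items():
            for name, f in level:
                h = (lambda g, f: lambda x: g(f(x)))(g, f)  # compose g∘f
                sig = signature(h)
                if sig not in seen:  # prune numerically equivalent expressions
                    seen.add(sig)
                    nxt.append((f"{op}({name})", h))
        level = nxt
    return None
```

Each level is independent work over the previous frontier, which is where the "embarrassingly parallel" observation comes from.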
Note that the paper talks about the decision version of the SR problem, i.e., can we discover the globally optimal expression? I think this proof is important for the SR community but not particularly surprising (to me). However, I'm excited by the potential future work following this paper! A couple of discussion points:
* First, SR is technically a bottom-up program synthesis problem where the DSL (math) has an equivalence operator. Can we use this proof to impose stronger guarantees on the "hyperparameters" for bottom-up synthesis? Conversely, does the theoretical foundation of the inductive synthesis literature [2] help us define tighter bounds?
* Second, while SR itself is NP-hard, can we say anything about approximate algorithms (e.g., distilling a deep neural network to find a solution[3])? Specifically, what does the proof tell us about the PAC learnability of SR?
Anyhow, pretty cool seeing such work getting more attention!
[1] https://github.com/MilesCranmer/PySR
[2] https://susmitjha.github.io/papers/togis17.pdf
[3] https://astroautomata.com/paper/symbolic-neural-nets/
-
‘Machine Scientists’ Distill the Laws of Physics from Raw Data
I found it curious that one of the implementations of symbolic regression (the "machine scientist" referenced in the article) is a Python wrapper on Julia: https://github.com/MilesCranmer/PySR
I don't think I've seen a Python wrapper on Julia code before.
- Is it possible to create a Python package with Julia and publish it on PyPi?
-
[D] Inferring general physical laws from observations in 300 lines of code
This is really neat! Since you're interested in this subject, you may also appreciate PySR and the corresponding paper which uses Graph Neural Networks to perform symbolic regression.
DaemonMode.jl
-
Potential of the Julia programming language for high energy physics computing
That's for an entry point; you can search `Base.@main` for a short summary of it. Later it will be callable with `juliax` and `juliac`, e.g. `juliax test.jl` in a shell.
DynamicalSystems looks like a heavy project; I don't think you can do much more on your own. Julia 1.10 added features that let you load just the portion you need (as a weak dependency), and there is PrecompileTools.jl, but those are on your side.
You can also look into https://github.com/dmolina/DaemonMode.jl for running a Julia process in the background and do your stuff in the shell without startup time until the standalone binaries are there.
-
Julia 1.9.0 lives up to its promise
> If I were to use e.g. Rust with polars, load time would be virtually none.
Because you're compiling...
And if you need to do the same in Julia, you should also pre-compile, or use some other method like https://github.com/dmolina/DaemonMode.jl (their demo loads a database, with subsequent loads taking roughly 0.2% of the time of the first).
- Administrative Scripting with Julia
- GNU Octave 8.1
-
Ask HN: Why is Julia so underrated?
Well, not nicely certainly, but:
https://github.com/dmolina/DaemonMode.jl
> portable
Neither is Python - it just relies on universal availability. Over time…
-
Is Julia suitable today as a scripting language?
You can get around a lot of these problems with DaemonMode.jl though.
-
Julia performance, startup.jl, and sysimages
You might want DaemonMode.jl
-
Can I execute code in Julia REPL if I'm connected to a remote server?
https://github.com/dmolina/DaemonMode.jl can possibly help in the future. Leaving it here so that people know this is planned.
- Ask HN: Why hasn't the Deep Learning community embraced Julia yet?
-
Compile for faster execution?
If you strongly prefer to run scripts though, then you can use the package https://github.com/dmolina/DaemonMode.jl in order to re-use a Julia session between multiple scripts, saving you recompilation time.
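Concretely, DaemonMode's README workflow is one long-lived `serve()` process plus `runargs()` clients. A minimal sketch of the two commands, wrapped in Python for illustration (the helper name, and the choice to only build the argument lists rather than spawn processes, are my own; `julia` is assumed to be on PATH):

```python
# DaemonMode.jl pattern: one long-lived Julia process compiles packages once,
# then each script is sent to it, skipping per-run startup/compilation cost.
# `serve()` and `runargs()` come from DaemonMode's README.

# Started once, in the background:
SERVER_CMD = ["julia", "--startup-file=no", "-e", "using DaemonMode; serve()"]

def client_cmd(script, *args):
    """Build the command that runs `script` on the daemon, not a fresh Julia."""
    return ["julia", "-e", "using DaemonMode; runargs()", script, *args]

# e.g.: subprocess.run(client_cmd("analysis.jl", "data.csv"))
```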
What are some alternatives?
GeneticAlgorithmPython - Source code of PyGAD, a Python 3 library for building the genetic algorithm and training machine learning algorithms (Keras & PyTorch).
julia - The Julia Programming Language
TorchGA - Train PyTorch Models using the Genetic Algorithm with PyGAD
Makie.jl - Interactive data visualizations and plotting in Julia
mljar-supervised - Python package for AutoML on Tabular Data with Feature Engineering, Hyper-Parameters Tuning, Explanations and Automatic Documentation
HTTP.jl - HTTP for Julia
nni - An open source AutoML toolkit to automate the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyperparameter tuning.
FromFile.jl - Julia enhancement proposal (Julep) for implicit per-file modules in Julia
diffeqpy - Solving differential equations in Python using DifferentialEquations.jl and the SciML Scientific Machine Learning organization
julia-numpy-fortran-test - Comparing Julia vs Numpy vs Fortran for performance and code simplicity
python-bigsimr
DataFramesMeta.jl - Metaprogramming tools for DataFrames