FromFile.jl vs SymbolicRegression.jl

| | FromFile.jl | SymbolicRegression.jl |
|---|---|---|
| Mentions | 6 | 3 |
| Stars | 131 | 535 |
| Growth | - | - |
| Activity | 1.5 | 9.7 |
| Last Commit | about 1 year ago | 8 days ago |
| Language | Julia | Julia |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
FromFile.jl
- A Programming language ideal for Scientific Sustainability and Reproducibility?
  On `include`: you might like FromFile.jl as an alternative.
- Modules in Julia
- How to import an own module from the current directory?
  For this and other oddities with Julia's include/import system (especially since you're coming from Python), I'd recommend FromFile as a readable way to approach things.
- Why not Julia?
  You might like FromFile.jl.
- Problems with nested `include`s and solutions?
  However, if you prefer a Python-like experience, check out FromFile.jl.
- Julia 1.6: what has changed since Julia 1.0?
I'm not using modules. I usually start with one file containing a demo (or a similarly named function) that is called when the file is run as the entry point (like `if __name__ == '__main__'`, except Julia makes it even more awkward).
I tend to refactor code out of there into separate files and then somehow import it. An ugly way is `include`, and I've tried Revise.jl with `includet`.
But I think the least ugly approach is the `@from` macro from here: https://github.com/Roger-luo/FromFile.jl. Judging from some opinions in bug trackers, this will probably remain shunned by the core devs, who will keep bikeshedding the import story forever.
With this setup I have about 400 lines of code in three files. It compiles for 15 seconds after every single change, and even without any changes.
Performance-wise, I think this should be equivalent to using modules, while saving some pointless ceremony.
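The `@from` workflow described in the quotes above can be sketched roughly as follows; the file names and the `greet` function are illustrative, not taken from the original threads.

```julia
# utils.jl -- a plain source file; no module boilerplate needed
greet(name) = "Hello, $name!"

# main.jl -- instead of `include("utils.jl")`, import just what you need:
using FromFile                    # provides the @from macro
@from "utils.jl" import greet     # loads utils.jl into an implicit module

println(greet("Julia"))
```

Unlike a bare `include`, `@from` wraps each file in a module and caches it, so importing the same file from several places does not redefine its contents.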
SymbolicRegression.jl
- SymbolicRegression.jl – High-Performance Symbolic Regression in Julia and Python
- Do Simpler Machine Learning Models Exist and How Can We Find Them?
- Modules in Julia
This is an example of a package that relies on it heavily: https://github.com/MilesCranmer/SymbolicRegression.jl
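A minimal usage sketch of SymbolicRegression.jl, assuming the current `Options`/`equation_search` API (older releases call it `EquationSearch`); the toy dataset is illustrative.

```julia
using SymbolicRegression

# Toy dataset: 2 features x 100 samples; target is 2*cos(x2) + x1^2 - 2
X = randn(Float32, 2, 100)
y = 2 .* cos.(X[2, :]) .+ X[1, :] .^ 2 .- 2

# Restrict the search space to a handful of operators
options = Options(
    binary_operators = [+, -, *, /],
    unary_operators  = [cos],
)

# Run the evolutionary search; returns Pareto-optimal candidate equations
hall_of_fame = equation_search(X, y; options = options, niterations = 40)
```

Note the column-major convention: `X` is features-by-samples, which is idiomatic for this package.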
What are some alternatives?
julia - The Julia Programming Language
symreg - A Symbolic Regression engine
DaemonMode.jl - Client-Daemon workflow to run faster scripts in Julia
hlb-CIFAR10 - Train CIFAR-10 in <7 seconds on an A100, the current world record.
JET.jl - An experimental code analyzer for Julia. No need for additional type annotations.
Metatheory.jl - General purpose algebraic metaprogramming and symbolic computation library for the Julia programming language: E-Graphs & equality saturation, term rewriting and more.
DataFramesMeta.jl - Metaprogramming tools for DataFrames
SymbolicNumericIntegration.jl - Symbolic-Numerics for Solving Integrals
TwoBasedIndexing.jl - Two-based indexing
ModelingToolkit.jl - An acausal modeling framework for automatically parallelized scientific machine learning (SciML) in Julia. A computer algebra system for integrated symbolics for physics-informed machine learning and automated transformations of differential equations
HTTP.jl - HTTP for Julia
PySR - High-Performance Symbolic Regression in Python and Julia