MacroTools.jl vs Metatheory.jl

| | MacroTools.jl | Metatheory.jl |
|---|---|---|
| Mentions | 2 | 5 |
| Stars | 302 | 334 |
| Growth (monthly) | 0.3% | 1.2% |
| Activity | 6.8 | 8.1 |
| Latest commit | 9 days ago | 7 days ago |
| Language | Julia | Julia |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
MacroTools.jl
- Split up Julia expression into sub-expressions
  "Is MacroTools.jl a package that could be helpful?"
- From Julia to Rust
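The posts above ask about splitting a Julia expression into sub-expressions, which is MacroTools.jl's core use case. A minimal sketch using its `@capture` and `splitdef` utilities (the expressions and variable names here are illustrative, not from the posts):

```julia
using MacroTools

ex = :(foo(1, 2x, bar(y)))

# @capture destructures an expression against a pattern:
# f_ binds a single sub-expression, args__ slurps the rest into a vector.
if @capture(ex, f_(args__))
    @show f     # the call's function name
    @show args  # its argument sub-expressions
end

# splitdef breaks a whole function definition into labeled parts
# (:name, :args, :body, ...); combinedef reassembles them.
def = splitdef(:(function g(a, b=1); a + b; end))
@show def[:name] def[:args]
```

`@capture` returns `false` when the pattern does not match, so it composes naturally with `if`/`elseif` chains when walking syntax trees.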
Metatheory.jl
- [ANN] E-graphs and equality saturation: hegg 0.1
  "I'd love to see something along the lines of Julia's https://juliasymbolics.github.io/Metatheory.jl/dev/"
- Twitter Thread: Symbolic Computing for Compiler Optimizations in Julia
  "From that example you can see how some rather difficult compiler questions are all subsumed in the e-graph saturation solve. That solve itself isn't easy: it's an NP-hard problem that requires good heuristics, and that's what Metatheory.jl, and chunks of the thesis, are about. But given a good enough solver, writing such transformation passes becomes rather trivial, and you get an optimal solution in the sense of the chosen cost function. So a problem like enabling automatic FMA on specific code is rather simple with this tool: just declare a*b + c = fma(a,b,c), give the former a cost of 2 and the latter a cost of 1, and let it rip."
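The FMA rewrite described in that quote can be sketched with Metatheory.jl's e-graph API. This is a sketch assuming the Metatheory.jl 2.x interface (`@theory`, `EGraph`, `saturate!`, `extract!`); the expression is illustrative, and instead of a custom cost function it uses the built-in `astsize` node count, which already prefers the single `fma` call (4 nodes) over `x*y + z` (5 nodes):

```julia
using Metatheory

# One equational rewrite rule: a*b + c is equal to fma(a, b, c).
t = @theory a b c begin
    a * b + c == fma(a, b, c)
end

# Build an e-graph from the expression, saturate it with the theory,
# then extract the cheapest equivalent term under the astsize cost.
g = EGraph(:(x * y + z))
saturate!(g, t)
best = extract!(g, astsize)  # fma form: 4 nodes vs 5 for x*y + z
```

Because the rule is stated with `==` rather than a one-way `-->`, the e-graph records both forms as equal and the cost function alone decides which one is extracted.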
- Show HN: prometeo – a Python-to-C transpiler for high-performance computing
  "Well IMO it can definitely be rewritten in Julia, and to an easier degree than Python, since Julia allows hooking into the compiler pipeline at many points in the stack. It's lispy and built from the ground up for codegen, with libraries like https://github.com/JuliaSymbolics/Metatheory.jl that provide high-level pattern matching with e-graphs. The question is whether it's worth your time to learn Julia to do so.
  You could also do it at the LLVM level: https://github.com/JuliaComputingOSS/llvm-cbe
  For interesting takes on that, see https://github.com/JuliaLinearAlgebra/Octavian.jl, which relies on LoopVectorization.jl to do transforms on the Julia AST beyond what LLVM does. Because of that, Octavian.jl beats OpenBLAS on many linear-algebra benchmarks."
- From Julia to Rust
- Algebraic Metaprogramming in Julia with Metatheory.jl
What are some alternatives?
Catlab.jl - A framework for applied category theory in the Julia language
JET.jl - An experimental code analyzer for Julia. No need for additional type annotations.
StaticArrays.jl - Statically sized arrays for Julia
Dagger.jl - A framework for out-of-core and parallel execution
julia - The Julia Programming Language
acados - Fast and embedded solvers for nonlinear optimal control
glow - Compiler for Neural Network hardware accelerators
Octavian.jl - Multi-threaded BLAS-like library that provides pure Julia matrix multiplication
egg - A flexible, high-performance e-graph library written in Rust
SciMLBenchmarks.jl - Scientific machine learning (SciML) benchmarks, AI for science, and (differential) equation solvers. Covers Julia, Python (PyTorch, Jax), MATLAB, R
Symbolics.jl - Symbolic programming for the next generation of numerical software