| | Metatheory.jl | IRTools.jl |
|---|---|---|
| Mentions | 5 | 2 |
| Stars | 334 | 107 |
| Growth | 1.2% | 0.9% |
| Activity | 8.1 | 5.4 |
| Latest commit | 7 days ago | 8 days ago |
| Language | Julia | Julia |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Metatheory.jl
- [ANN] E-graphs and equality saturation: hegg 0.1
I'd love to see something in the lines of Julia's https://juliasymbolics.github.io/Metatheory.jl/dev/
- Twitter Thread: Symbolic Computing for Compiler Optimizations in Julia
From that example you can see how this makes some rather difficult compiler questions all be subsumed in the e-graph saturation solve. That solve itself isn't easy: it's an NP-hard problem that requires good heuristics, and that's what Metatheory.jl is about, and what chunks of the thesis are about. But given a good enough solver, the ability to write such transformation passes becomes rather trivial, and you get an optimal solution in the sense of the chosen cost function. So problems like enabling automatic FMA on specific codes become rather simple with this tool: just declare a*b + c = fma(a,b,c), where the former has a cost of 2 and the latter a cost of 1, and let it rip.
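The FMA rewrite described above can be sketched with Metatheory.jl's e-graph API. This is a minimal sketch, not the quoted author's code: the `@theory`/`saturate!`/`extract!` names follow my reading of the Metatheory.jl 2.x docs, and the built-in `astsize` cost (term size) stands in for the explicit 2-vs-1 operation costs mentioned in the comment.

```julia
using Metatheory

# Bidirectional rule: a*b + c is equal to fma(a, b, c),
# so the e-graph holds both forms in one equivalence class.
t = @theory a b c begin
    a * b + c == fma(a, b, c)
end

g = EGraph(:(x * y + z))
saturate!(g, t)

# `astsize` counts term nodes: fma(x, y, z) has 4, x*y + z has 5,
# so extraction picks the fused form (standing in for cost 1 vs 2).
best = extract!(g, astsize)
```

A custom cost function that charges exactly 2 for `+`/`*` pairs and 1 for `fma` could replace `astsize`; here the node count already orders the two forms the same way.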
- Show HN: prometeo – a Python-to-C transpiler for high-performance computing
Well IMO it can definitely be rewritten in Julia, and more easily than in Python, since Julia allows hooking into the compiler pipeline at many levels of the stack. It's lispy and built from the ground up for codegen, with libraries like (https://github.com/JuliaSymbolics/Metatheory.jl) that provide high-level pattern matching with e-graphs. The question is whether it's worth your time to learn Julia to do so.
You could also do it at the LLVM level: https://github.com/JuliaComputingOSS/llvm-cbe
For interesting takes on that, you can see https://github.com/JuliaLinearAlgebra/Octavian.jl, which relies on LoopVectorization.jl to do transforms on Julia AST beyond what LLVM does. Because of that, Octavian.jl beats OpenBLAS on many linear algebra benchmarks.
- From Julia to Rust
- Algebraic Metaprogramming in Julia with Metatheory.jl
IRTools.jl
- From Julia to Rust
- Ask HN: Show me your Half Baked project
Which is the concept behind Cassette.jl (https://github.com/jrevels/Cassette.jl) and IRTools.jl (https://github.com/MikeInnes/IRTools.jl).
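The concept mentioned in that comment can be seen in a few lines. This is a sketch based on the IRTools.jl README: `@code_ir` grabs a method's lowered SSA-form IR as a data structure, which tools built on IRTools (e.g. Zygote) then rewrite before compilation. The `relu` function is just an illustrative example.

```julia
using IRTools

relu(x) = x > 0 ? x : zero(x)

# Lowered SSA-form IR for this specific method instance.
ir = IRTools.@code_ir relu(1.0)

# The IR is iterable: each entry is a (variable, statement) pair
# that a transformation pass can inspect or replace.
for (var, stmt) in ir
    println(var, " = ", stmt.expr)
end
```

Cassette.jl takes a different route to the same end, overdubbing calls through a contextual dispatch mechanism rather than editing the IR directly.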
What are some alternatives?
JET.jl - An experimental code analyzer for Julia. No need for additional type annotations.
Dagger.jl - A framework for out-of-core and parallel execution
MacroTools.jl - MacroTools provides a library of tools for working with Julia code and expressions.
Juleps - Julia Enhancement Proposals
acados - Fast and embedded solvers for nonlinear optimal control
pyodide - Pyodide is a Python distribution for the browser and Node.js based on WebAssembly
Octavian.jl - Multi-threaded BLAS-like library that provides pure Julia matrix multiplication
StaticArrays.jl - Statically sized arrays for Julia
SciMLBenchmarks.jl - Scientific machine learning (SciML) benchmarks, AI for science, and (differential) equation solvers. Covers Julia, Python (PyTorch, Jax), MATLAB, R
glow - Compiler for Neural Network hardware accelerators