| | Metatheory.jl | Dagger.jl |
|---|---|---|
| Mentions | 5 | 4 |
| Stars | 334 | 578 |
| Growth | 1.2% | 1.2% |
| Activity | 8.1 | 8.9 |
| Latest Commit | 6 days ago | 9 days ago |
| Language | Julia | Julia |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Metatheory.jl
- [ANN] E-graphs and equality saturation: hegg 0.1
I'd love to see something in the lines of Julia's https://juliasymbolics.github.io/Metatheory.jl/dev/
- Twitter Thread: Symbolic Computing for Compiler Optimizations in Julia
From that example you can see how some rather difficult compiler questions are all subsumed in the e-graph saturation solve. That solve itself isn't easy: it's an NP-hard problem that requires good heuristics and such, and that's what Metatheory.jl is about, and that's what chunks of the thesis are about. But given a good enough solver, writing such transformation passes becomes rather trivial, and you get an optimal solution in the sense of the chosen cost function. So problems like enabling automatic FMA on specific code become rather simple with this tool: just declare a*b + c = fma(a,b,c), where the former has a cost of 2 and the latter a cost of 1, and let it rip.
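The FMA rewrite described in that quote can be sketched with Metatheory.jl's e-graph API. This is a minimal sketch, assuming the Metatheory 2.x rule syntax; the built-in `astsize` cost stands in for the custom per-operation costs mentioned in the quote (it also favors the single `fma` call, since `fma(a, b, c)` is a smaller expression tree than `(a * b) + c`):

```julia
using Metatheory

# A one-rule theory: rewrite a multiply-add into a fused multiply-add.
fma_theory = @theory a b c begin
    (a * b) + c --> fma(a, b, c)
end

# Build an e-graph from the input expression, saturate it with the
# theory, then extract the cheapest equivalent expression.
g = EGraph(:((x * y) + z))
saturate!(g, fma_theory)
extract!(g, astsize)  # expected to yield the fma form, e.g. :(fma(x, y, z))
```

Under `astsize` every node and argument costs 1, so `fma(x, y, z)` (cost 4) beats `(x * y) + z` (cost 5); a custom cost function could encode the 2-vs-1 instruction costs from the quote directly.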
- Show HN: prometeo – a Python-to-C transpiler for high-performance computing
Well, IMO it can definitely be rewritten in Julia, and to an easier degree than Python, since Julia allows hooking into the compiler pipeline at many points in the stack. It's lispy and built from the ground up for codegen, with libraries like https://github.com/JuliaSymbolics/Metatheory.jl that provide high-level pattern matching with e-graphs. The question is whether it's worth your time to learn Julia to do so.
You could also do it at the LLVM level: https://github.com/JuliaComputingOSS/llvm-cbe
For interesting takes on that, see https://github.com/JuliaLinearAlgebra/Octavian.jl, which relies on LoopVectorization.jl to do transforms on the Julia AST beyond what LLVM does. Because of that, Octavian.jl beats OpenBLAS on many linear-algebra benchmarks.
- From Julia to Rust
- Algebraic Metaprogramming in Julia with Metatheory.jl
Dagger.jl
- Dagger: a new way to build CI/CD pipelines
- DTable: a new distributed table implementation in Julia using Dagger.jl
Firstly, I'll say that we already have work started to implement out-of-core directly in Dagger: https://github.com/JuliaParallel/Dagger.jl/pull/289.
With that PR in place, it should be possible to define a "storage device" which is backed by a database. I haven't had a chance to actually try this, since the PR still needs quite some work and testing, but it's definitely something on my radar!
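For context on what Dagger provides underneath features like the DTable, its core task-graph API can be sketched as follows. This is a minimal sketch assuming Dagger's documented `Dagger.@spawn`/`fetch` interface:

```julia
using Distributed, Dagger

# Each @spawn returns a lazy task handle; Dagger builds the resulting
# DAG and schedules it across available workers and threads.
a = Dagger.@spawn 1 + 2
b = Dagger.@spawn a * 3   # depends on a, so it runs after a completes
fetch(b)                  # materializes the result: 9
```

The same dependency-tracking scheduler is what lets a storage layer like the one in the PR above decide where task results live (memory, disk, or, hypothetically, a database-backed device).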
- From Julia to Rust
- Cerebras’ New Monster AI Chip Adds 1.4T Transistors
I'm not sure that's necessarily the domain of a low-level package like CUDA.jl though (which I assume you're referring to). That kind of interface is more the domain of higher-level packages like https://github.com/JuliaParallel/Dagger.jl/ and to a lesser extent https://juliagpu.github.io/KernelAbstractions.jl/stable/. Moreover, the jury is still out on whether the built-in Distributed module is an ideal abstraction for every use-case (clusters, heterogeneous compute, etc.)
WRT Nx, my biggest question is how they'll crack the problem of still needing big balls of C++ and shims everywhere to get acceleration. Creating a compiler that generates efficient GPU or other accelerator code is a massive research project with no clear winners, never mind the challenge of reconciling the very mutation-heavy needs of GPU compute with a mostly immutable language model.
What are some alternatives?
JET.jl - An experimental code analyzer for Julia. No need for additional type annotations.
earthly - Super simple build framework with fast, repeatable builds and an instantly familiar syntax – like Dockerfile and Makefile had a baby.
MacroTools.jl - MacroTools provides a library of tools for working with Julia code and expressions.
julia - The Julia Programming Language
acados - Fast and embedded solvers for nonlinear optimal control
DuckDB.jl - Julia client package for DuckDB, an in-process analytical SQL database
Octavian.jl - Multi-threaded BLAS-like library that provides pure Julia matrix multiplication
determined - Determined is an open-source machine learning platform that simplifies distributed training, hyperparameter tuning, experiment tracking, and resource management. Works with PyTorch and TensorFlow.
SciMLBenchmarks.jl - Scientific machine learning (SciML) benchmarks, AI for science, and (differential) equation solvers. Covers Julia, Python (PyTorch, Jax), MATLAB, R
Symbolics.jl - Symbolic programming for the next generation of numerical software
dagger-for-github - GitHub Action for Dagger