JET.jl
Metatheory.jl
| | JET.jl | Metatheory.jl |
|---|---|---|
| Mentions | 13 | 5 |
| Stars | 688 | 334 |
| Growth | - | 1.2% |
| Activity | 9.1 | 8.1 |
| Last commit | 5 days ago | about 21 hours ago |
| Language | Julia | Julia |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
JET.jl
-
Prospects of utilising Rust in scientific computation?
An informative discussion on the Julia forum. Have you tried using https://github.com/aviatesk/JET.jl to minimize type instabilities?
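To make the suggestion above concrete, here is a minimal sketch of using JET.jl's documented entry points (`@report_call` for error analysis, `@report_opt` for optimization analysis) on a deliberately type-unstable function; the `unstable`/`compute` names are illustrative, and the exact report format varies by JET version:

```julia
using JET

# A deliberately type-unstable function: the return type depends on a runtime value.
unstable(flag) = flag ? 1 : "one"

compute(flag) = unstable(flag) + 1   # `+` fails if a String reaches it

# Error analysis: reports the potential MethodError without running the code.
@report_call compute(false)

# Optimization analysis: flags the runtime dispatch caused by the instability.
@report_opt compute(true)
```

Both macros analyze the call via abstract interpretation on Julia's own type inference, so they catch problems on code paths a test suite might never execute.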
-
Julia v1.9.0 has been released
For instance, https://github.com/aviatesk/JET.jl is still in its relative infancy, but it's played a big role in detecting quite a few potential bugs that had never been reported to us by users or caught in our testing infrastructure. There have also been a lot of developments like interfaces to RR, the time-travelling debugger (https://rr-project.org/), which helps us better understand and catch some very hard-to-debug non-deterministic bugs.
-
Julia Computing Raises $24M Series A
Have you seen Shuhei Kadowaki's work on JET.jl?
If you're curious: https://github.com/aviatesk/JET.jl
This may seem more about performance than IDE development, but Shuhei is one of the driving contributors behind using compiler capabilities for IDE integration, and indeed JET.jl contains the kernel of a number of these capabilities.
-
I Hate Programming Language Advocacy (2000)
This is sort of being done right now, as dynamic languages have begun to adopt gradual typing... at least Python and Julia, that I know of.
If something like [JET.jl](https://github.com/aviatesk/JET.jl) becomes ubiquitous in Julia, one could add a function that pointed out all the places in the code where types are not fully inferred by the compiler.
It'll never be quite the same level of safety as a static language, however.
-
From Julia to Rust
- Pattern matching (sometimes you don't want the overhead of a method lookup)
[1]: https://github.com/aviatesk/JET.jl
-
Julia is the best language to extend Python for scientific computing
You can use the `@code_warntype` macro to check for type stability, which is very helpful for detecting such performance pitfalls at the single-function level. In the future, https://github.com/aviatesk/JET.jl may give a more powerful way to do it.
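A minimal sketch of the `@code_warntype` check mentioned above (the function name `flaky` is illustrative): its return type depends on a runtime value, so inference can only narrow it to a `Union`, which the macro highlights in the REPL.

```julia
# Type-unstable: returns Int on one branch, Float64 on the other.
function flaky(x)
    if x > 0
        return 1      # Int
    else
        return 1.0    # Float64
    end
end

# `@code_warntype` prints the lowered, type-annotated code; the inferred
# return type Union{Float64, Int64} is flagged (shown in red in the REPL).
@code_warntype flaky(2)
```

Unlike JET, `@code_warntype` inspects one concrete call signature at a time, so it is best for spot-checking hot functions rather than whole-program analysis.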
- JET.jl: experimental type checker for Julia
- JET.jl: A WIP compile-time type checker for Julia
Metatheory.jl
-
[ANN] E-graphs and equality saturation: hegg 0.1
I'd love to see something in the lines of Julia's https://juliasymbolics.github.io/Metatheory.jl/dev/
-
Twitter Thread: Symbolic Computing for Compiler Optimizations in Julia
From that example you can see how this makes some rather difficult compiler questions all be subsumed in the e-graph saturation solve. That solve itself isn't easy: it's an NP-hard problem that requires good heuristics, and that is what Metatheory.jl, and chunks of the thesis, are about. But given a good enough solver, writing such transformation passes becomes rather trivial, and you get an optimal solution in the sense of the chosen cost function. So problems like enabling automatic FMA on specific code become rather simple with this tool: just declare a*b + c = fma(a,b,c), where the former has a cost of 2 and the latter a cost of 1, and let it rip.
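The FMA rewrite described above can be sketched with Metatheory.jl's e-graph API. This is a hedged sketch: the names below (`@theory`, `EGraph`, `saturate!`, `extract!`, `astsize`) follow the Metatheory.jl 2.x documentation and may differ between releases, and the `fma` definition is a stand-in so the extracted expression stays callable.

```julia
using Metatheory

fma(a, b, c) = muladd(a, b, c)  # stand-in fused multiply-add

# Directed rewrite rule: a*b + c rewrites to fma(a, b, c).
t = @theory a b c begin
    a * b + c --> fma(a, b, c)
end

g = EGraph(:(x * y + z))   # build an e-graph from the expression
saturate!(g, t)            # apply the theory until saturation

# `astsize` is Metatheory's built-in cost function counting expression nodes:
# fma(x, y, z) has 4 nodes vs. 5 for x*y + z, so extraction prefers it.
best = extract!(g, astsize)
```

The key point matches the quote: the rule author only states the equality and the cost function; the saturation solver, not a hand-written pass, decides where the rewrite pays off.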
-
Show HN: prometeo – a Python-to-C transpiler for high-performance computing
Well, IMO it can definitely be rewritten in Julia, and to an easier degree than Python, since Julia allows hooking into the compiler pipeline at many areas of the stack. It's lispy and built from the ground up for codegen, with libraries like https://github.com/JuliaSymbolics/Metatheory.jl that provide high-level pattern matching with e-graphs. The question is whether it's worth your time to learn Julia to do so.
You could also do it at the LLVM level: https://github.com/JuliaComputingOSS/llvm-cbe
For interesting takes on that, you can see https://github.com/JuliaLinearAlgebra/Octavian.jl, which relies on LoopVectorization.jl to do transforms on Julia ASTs beyond what LLVM does. Because of that, Octavian.jl beats OpenBLAS on many linalg benchmarks.
- From Julia to Rust
- Algebraic Metaprogramming in Julia with Metatheory.jl
What are some alternatives?
julia - The Julia Programming Language
Dagger.jl - A framework for out-of-core and parallel execution
Enzyme.jl - Julia bindings for the Enzyme automatic differentiator
MacroTools.jl - MacroTools provides a library of tools for working with Julia code and expressions.
StaticArrays.jl - Statically sized arrays for Julia
acados - Fast and embedded solvers for nonlinear optimal control
HTTP.jl - HTTP for Julia
Octavian.jl - Multi-threaded BLAS-like library that provides pure Julia matrix multiplication
FromFile.jl - Julia enhancement proposal (Julep) for implicit per file module in Julia
SciMLBenchmarks.jl - Scientific machine learning (SciML) benchmarks, AI for science, and (differential) equation solvers. Covers Julia, Python (PyTorch, Jax), MATLAB, R
IRTools.jl - Mike's Little Intermediate Representation
Symbolics.jl - Symbolic programming for the next generation of numerical software