Clang.jl vs Lux.jl

| | Clang.jl | Lux.jl |
|---|---|---|
| Mentions | 2 | 4 |
| Stars | 229 | 537 |
| Growth | 1.3% | 4.5% |
| Activity | 7.5 | 9.9 |
| Latest commit | 27 days ago | 2 days ago |
| Language | Julia | Julia |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Clang.jl
- Julia 1.10 Released
Are there solid C interfaces that can be used?
A large part of why I started using Julia is that calling into other languages through the C FFI is easy and efficient. Most of the wrappers are a single line. If there is no existing driver support, I would run the C headers through Clang.jl, which automatically generates Julia wrappers for the C API declared in the headers.
https://github.com/JuliaInterop/Clang.jl
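Those one-line wrappers are usually just `@ccall` declarations. A minimal hand-written sketch (the `cos` symbol is used purely for illustration; it resolves because the C math library is already loaded in the Julia process):

```julia
# Hand-wrapping a single C function with @ccall.
# cos lives in libm/libc, which Julia links against, so no
# explicit library name is needed here.
my_cos(x::Real) = @ccall cos(Float64(x)::Cdouble)::Cdouble

my_cos(0.0)  # → 1.0
```

For a library that isn't already loaded, the same call takes a library prefix, e.g. `@ccall libtiff.TIFFGetVersion()::Cstring`.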
I most recently did this with libtiff. Here is the Clang.jl code to generate the bindings. It's less than 30 lines of boilerplate code.
https://github.com/mkitti/LibTIFF.jl/tree/main/gen
The generated bindings, with a few tweaks, are here:
https://github.com/mkitti/LibTIFF.jl/blob/main/src/LibTIFF.j...
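For context, a Clang.jl generator script typically looks something like the following sketch, using the `Clang.Generators` API. The header path and `generator.toml` filename are illustrative placeholders, not the actual LibTIFF.jl script:

```julia
# gen.jl — sketch of a typical Clang.jl binding generator.
using Clang.Generators

# Generator options are conventionally kept in a TOML file
# next to this script; "generator.toml" is a placeholder name.
options = load_options(joinpath(@__DIR__, "generator.toml"))

# Default compiler arguments (include paths, target triple, ...).
args = get_default_args()

# The C headers to wrap — this path is illustrative.
headers = ["/usr/include/tiffio.h"]

# Parse the headers and emit the Julia bindings.
ctx = create_context(headers, args, options)
build!(ctx)
```

Running the script once writes out a Julia module of `@ccall`-based wrappers, which can then be tweaked by hand as the post describes.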
- A new C++ <-> Julia Wrapper: jluna
If you are interested in C++ interop, you can also have a look at Clang.jl and CxxWrap.jl (the usual Julia package chaos applies: the package mentioned in old talks and docs that you find on Google has often been superseded by others...)
Lux.jl
- Julia 1.10 Released
- [R] Easiest way to train RNN's in MATLAB or Julia?
There is also the less known Lux.jl package: https://github.com/avik-pal/Lux.jl
- “Why I still recommend Julia”
- The Julia language has a number of correctness flaws
Lots of things are being rewritten. Remember we just released a new neural network library the other day, SimpleChains.jl, and showed that it gave about a 10x speed improvement on modern CPUs with multithreading enabled vs JAX's Equinox (and 22x when AVX-512 is enabled) for smaller neural network and matrix-vector types of cases (https://julialang.org/blog/2022/04/simple-chains/). Then there's Lux.jl, which fixes some major issues of Flux.jl (https://github.com/avik-pal/Lux.jl). Pretty much everything is switching to Enzyme, which improves performance quite a bit over Zygote and allows full mutation support (https://github.com/EnzymeAD/Enzyme.jl). So parts of an entire machine learning stack are already being released.
Right now we're in a bit of an uncomfortable spot where we have to use Zygote for a few things and then Enzyme for everything else, but the custom rules system is rather close and that's the piece that's needed to make the full transition.
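Lux's main departure from Flux is that models are stateless: parameters and state live outside the layer objects and are passed explicitly on every call, which makes the models easier to differentiate with tools like Enzyme. A minimal sketch of that API (the layer sizes and batch size are arbitrary):

```julia
using Lux, Random

# A small MLP; the Chain itself holds no parameters.
model = Chain(Dense(2 => 16, relu), Dense(16 => 1))

# Parameters and state are created explicitly...
rng = Random.default_rng()
ps, st = Lux.setup(rng, model)

# ...and passed in on each forward call, which returns the
# output together with the (possibly updated) state.
x = rand(rng, Float32, 2, 8)   # batch of 8 two-feature inputs
y, st = model(x, ps, st)
size(y)                        # (1, 8)
```

Because `ps` is a plain nested structure rather than hidden layer fields, it can be handed directly to an AD engine or optimizer without the implicit-global-state issues the post attributes to Flux.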
What are some alternatives?
threads - Threads for Lua and LuaJIT. Transparent exchange of data between threads is allowed thanks to torch serialization.
Flux.jl - Relax! Flux is the ML library that doesn't make you tensor
jluna - Julia Wrapper for C++ with Focus on Safety, Elegance, and Ease of Use
Enzyme - High-performance automatic differentiation of LLVM and MLIR.
CxxWrap.jl - Package to make C++ libraries available in Julia
StatsBase.jl - Basic statistics for Julia
Torch.jl - Sensible extensions for exposing torch in Julia.
Optimization.jl - Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
LibTIFF.jl - Clang.jl generated wrapper around Libtiff_jll.jl
Enzyme.jl - Julia bindings for the Enzyme automatic differentiator
oorb - An open-source orbit-computation package for Solar System objects.
BetaML.jl - Beta Machine Learning Toolkit

