dex-lang
equinox
| | dex-lang | equinox |
|---|---|---|
| Mentions | 25 | 31 |
| Stars | 1,534 | 1,789 |
| Growth | 0.1% | - |
| Activity | 8.8 | 9.3 |
| Latest commit | 18 days ago | 5 days ago |
| Language | Haskell | Python |
| License | BSD 3-clause "New" or "Revised" License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
dex-lang
-
Thinking in an Array Language
A really nice approach to this I've seen recently is Google's research on [Dex](https://github.com/google-research/dex-lang).
- Function Composition in Programming Languages – Conor Hoekstra – CppNorth 2023 [video]
- Dex Lang: Research language for array processing in the Haskell/ML family
-
[D] Have there been any attempts to create a programming language specifically for machine learning?
Dex
-
[D] PyTorch 2.0 Announcement
Have you tried Dex? https://github.com/google-research/dex-lang It is in a relatively early stage, but it is exploring some interesting parts of the design space.
- Mangle, a programming language for deductive database programming
-
Looking for languages that combine algebraic effects with parallel execution
I think [Dex](https://github.com/google-research/dex-lang) might be along the lines of what you're looking for, although its focus is on SIMD GPU-style parallelism rather than thread-level parallelism.
-
“Why I still recommend Julia”
Dex proves indexing correctness, including inside loops, without needing a full dependent type system.
See: https://github.com/google-research/dex-lang/pull/969
-
Haskell for Artificial Intelligence?
In case you want to see one research direction that's combining practical machine learning and functional programming, one of the authors of JAX (and the main author of its predecessor, Autograd) is writing Dex (https://github.com/google-research/dex-lang), a functional language for array processing. The compiler itself is written in Haskell. JAX is one of the most popular libraries for doing a lot of machine learning these days, along with Tensorflow and PyTorch. You might also want to see the issue in the JAX repo about adding Haskell support, for some context: https://github.com/google/jax/issues/185
equinox
-
Ask HN: What side projects landed you a job?
I wrote a JAX-based neural network library (Equinox [1]) and numerical differential equation solving library (Diffrax [2]).
At the time I was just exploring some new research ideas in numerics -- and frankly, procrastinating from writing up my PhD thesis!
But then one of the teams at Google started using them, so they offered me a job to keep developing them for their needs. Plus I'd get to work in biotech, which was a big interest of mine. This was a clear dream job offer, so I accepted.
Since then both have grown steadily in popularity (~2.6k GitHub stars) and now see pretty widespread use! I've since started writing several other JAX libraries and we now have a bit of an ecosystem going.
[1] https://github.com/patrick-kidger/equinox
-
[P] Optimistix, nonlinear optimisation in JAX+Equinox!
The elevator pitch is that Optimistix is really fast, especially to compile. It plays nicely with Optax for first-order gradient-based methods, and it takes a lot of design inspiration from Equinox, representing the state of all the solvers as standard JAX PyTrees.
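The "solver state as a pytree" design can be sketched in plain Python. This is a hypothetical stand-in, not Optimistix's actual API: the point is just that the entire solver state is a plain nested container of leaves, updated purely functionally, which is exactly the shape JAX transformations can map over.

```python
import math

def init_state(y0):
    # The solver's whole state is a plain dict of leaves -- a "pytree".
    return {"y": y0, "step": 0, "error": float("inf")}

def fixed_point_step(f, state):
    # Purely functional update: build a new state pytree, never mutate.
    y_new = f(state["y"])
    return {"y": y_new,
            "step": state["step"] + 1,
            "error": abs(y_new - state["y"])}

def solve(f, y0, tol=1e-10, max_steps=100):
    state = init_state(y0)
    while state["error"] > tol and state["step"] < max_steps:
        state = fixed_point_step(f, state)
    return state

# Fixed point of cos(x) is the Dottie number, ~0.739085.
result = solve(math.cos, 1.0)
```

Because the state is just a container of leaves, a framework can `jit` the step function or `vmap` a batch of solves over it without the solver needing any framework-specific hooks.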
-
JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation
If you like PyTorch then you might like Equinox, by the way. (https://github.com/patrick-kidger/equinox ; 1.4k GitHub stars now!)
- Equinox: Elegant easy-to-use neural networks in Jax
- Show HN: Equinox (1.3k stars), a JAX library for neural networks and sciML
-
Pytrees
You're thinking of `jax.closure_convert`. :)
(Although technically that works by tracing and extracting all constants from the jaxpr, rather than introspecting the function's closure cells -- it sounds like your trick is the latter.)
When you discuss dynamic allocation, I'm guessing you're mainly referring to not being able to backprop through `jax.lax.while_loop`. If so, you might find `equinox.internal.while_loop` interesting, which is an unbounded while loop that you can backprop through! The secret sauce is to use a treeverse-style checkpointing scheme.
https://github.com/patrick-kidger/equinox/blob/f95a8ba13fb35...
-
Writing Python like it’s Rust
I'm a big fan of using ABCs to declare interfaces -- so much so that I have an improved `abc.ABCMeta` that also handles abstract instance variables and abstract class variables: https://github.com/patrick-kidger/equinox/blob/main/equinox/_better_abstract.py
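The idea of "abstract class variables" beyond what the standard library's `abc` supports can be sketched with a small metaclass. This is a hedged illustration of the concept, not the code in the linked `_better_abstract.py`; the `__abstract_vars__` name here is made up for the example.

```python
import abc

class AbstractVarMeta(abc.ABCMeta):
    """ABCMeta extension: names in __abstract_vars__ must be defined
    somewhere on the class before it can be instantiated."""
    def __call__(cls, *args, **kwargs):
        missing = [name for name in getattr(cls, "__abstract_vars__", ())
                   if not hasattr(cls, name)]
        if missing:
            raise TypeError(f"Can't instantiate {cls.__name__}; "
                            f"missing class variables: {missing}")
        return super().__call__(*args, **kwargs)

class Layer(metaclass=AbstractVarMeta):
    # Declares the interface: every concrete Layer must set n_params.
    __abstract_vars__ = ("n_params",)

class Dense(Layer):
    n_params = 2  # satisfies the abstract class variable
```

Plain `abc.abstractmethod` only covers methods and properties; checking in `__call__` is what lets a bare class variable count toward the interface.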
-
[D] JAX vs PyTorch in 2023
For the daily research, I use Equinox (https://github.com/patrick-kidger/equinox) as a DL library in JAX.
- [Machinelearning] [D] Current state of JAX vs PyTorch?
-
Training Deep Networks with Data Parallelism in Jax
It sounds like you're concerned about how downstream libraries tend to wrap JAX transformations to handle their own thing? (E.g. `haiku.grad`.)
If so, then allow me to make my usual advert here for Equinox:
https://github.com/patrick-kidger/equinox
This actually works with JAX's native transformations. (There's no `equinox.vmap` for example.)
On higher-order functions more generally, Equinox offers a way to control these quite carefully, by making ubiquitous use of callables that are also pytrees. E.g. a neural network is both a callable in that it has a forward pass, and a pytree in that it records its parameters in its tree structure.
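The "callable that is also a pytree" pattern described above can be sketched stdlib-only. A real Equinox module is registered with JAX's pytree machinery; the `tree_flatten`/`tree_unflatten` pair below is a hypothetical stand-in showing the same shape.

```python
class Linear:
    def __init__(self, weight, bias):
        self.weight = weight
        self.bias = bias

    # Callable: the forward pass.
    def __call__(self, x):
        return self.weight * x + self.bias

    # Pytree: the parameters are the leaves, the class is the structure.
    def tree_flatten(self):
        return (self.weight, self.bias)

    @classmethod
    def tree_unflatten(cls, leaves):
        return cls(*leaves)

model = Linear(2.0, 1.0)

# A "transformation" (here: scaling every parameter) just maps over the
# leaves and rebuilds the module -- no framework-specific grad/apply needed.
scaled = Linear.tree_unflatten([0.5 * leaf for leaf in model.tree_flatten()])
```

Because the model is simultaneously the function and the parameter container, native transformations like `jax.grad(loss)(model, data)` can differentiate with respect to the model directly, with no wrapper layer.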
What are some alternatives?
jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
flax - Flax is a neural network library for JAX that is designed for flexibility.
futhark - :boom::computer::boom: A data-parallel functional programming language
dm-haiku - JAX-based neural network library
julia - The Julia Programming Language
torchtyping - Type annotations and dynamic checking for a tensor's shape, dtype, names, etc.
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
treex - A Pytree Module system for Deep Learning in JAX
hasktorch - Tensors and neural networks in Haskell
diffrax - Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/
extending-jax - Extending JAX with custom C++ and CUDA code