torchsde vs equinox

| | torchsde | equinox |
|---|---|---|
| Mentions | 5 | 31 |
| Stars | 1,473 | 1,809 |
| Growth | 2.0% | - |
| Activity | 4.8 | 9.2 |
| Latest commit | 7 months ago | 12 days ago |
| Language | Python | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
torchsde
-
Google Research • Differentiable SDE solvers with GPU support and efficient sensitivity analysis in PyTorch, for stochastic differential equations in your deep learning models.
GitHub: https://github.com/google-research/torchsde
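For a flavour of the interface, here is a minimal sketch along the lines of the torchsde README: an SDE is a module exposing drift `f` and diffusion `g`, integrated with `torchsde.sdeint`. The geometric-Brownian-motion dynamics and shapes below are illustrative, not prescriptive.

```python
import torch
import torchsde

class GeometricBrownianMotion(torch.nn.Module):
    # torchsde reads these class attributes to choose solvers / noise handling.
    noise_type = "diagonal"
    sde_type = "ito"

    def __init__(self, mu=0.1, sigma=0.2):
        super().__init__()
        self.mu, self.sigma = mu, sigma

    def f(self, t, y):  # drift, shape (batch, state)
        return self.mu * y

    def g(self, t, y):  # diffusion, shape (batch, state) for diagonal noise
        return self.sigma * y

sde = GeometricBrownianMotion()
y0 = torch.full((16, 1), 1.0)        # (batch, state)
ts = torch.linspace(0.0, 1.0, 100)
ys = torchsde.sdeint(sde, y0, ts)    # (len(ts), batch, state), differentiable
```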
-
[D] Ideal deep learning library
So not just that paper, but also our follow-up papers on the same topic, "Neural SDEs as Infinite-Dimensional GANs" and "Efficient and Accurate Gradients for Neural SDEs", are in fact implemented in PyTorch, specifically in the torchsde library. (Disclaimer: of which I am a developer.)
-
[D] Is there any way for GAN to generate arbitrary length of time series signal?
Code: SDE-GAN example in torchsde.
-
[P] Final Year Computer Science Project Suggestions
If you're interested in finance then I'd recommend Neural SDEs: https://arxiv.org/abs/2102.03657 https://arxiv.org/abs/2105.13493 https://github.com/google-research/torchsde/blob/master/examples/sde_gan.py
-
Simple & Fast GAN Training [D]
This may or may not fit what you're after.
equinox
-
Ask HN: What side projects landed you a job?
I wrote a JAX-based neural network library (Equinox [1]) and numerical differential equation solving library (Diffrax [2]).
At the time I was just exploring some new research ideas in numerics -- and frankly, procrastinating from writing up my PhD thesis!
But then one of the teams at Google started using them, so they offered me a job to keep developing them for their needs. Plus I'd get to work in biotech, which was a big interest of mine. This was a clear dream-job offer, so I accepted.
Since then both have grown steadily in popularity (~2.6k GitHub stars) and now see pretty widespread use! I've since started writing several other JAX libraries and we now have a bit of an ecosystem going.
[1] https://github.com/patrick-kidger/equinox
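For context, a minimal sketch in the spirit of the Equinox README: models are dataclass-like `eqx.Module`s, so a model is simultaneously a callable and a pytree of its own parameters. The `Linear` layer below is illustrative.

```python
import equinox as eqx
import jax
import jax.numpy as jnp

class Linear(eqx.Module):
    weight: jax.Array
    bias: jax.Array

    def __init__(self, in_size, out_size, key):
        wkey, bkey = jax.random.split(key)
        self.weight = jax.random.normal(wkey, (out_size, in_size))
        self.bias = jax.random.normal(bkey, (out_size,))

    def __call__(self, x):
        return self.weight @ x + self.bias

# The model is both a callable and a pytree of its parameters.
model = Linear(2, 3, key=jax.random.PRNGKey(0))
y = model(jnp.ones(2))
```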
-
[P] Optimistix, nonlinear optimisation in JAX+Equinox!
The elevator pitch is that Optimistix is really fast, especially to compile. It plays nicely with Optax for first-order gradient-based methods, and takes a lot of design inspiration from Equinox, representing the state of all the solvers as standard JAX PyTrees.
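As a hedged sketch of that pitch (the solver choice and tolerances below are illustrative, not prescriptive; see the Optimistix docs for the current API):

```python
import jax.numpy as jnp
import optimistix as optx

def rosenbrock(y, args):
    x1, x2 = y
    return (1.0 - x1) ** 2 + 100.0 * (x2 - x1**2) ** 2

solver = optx.BFGS(rtol=1e-8, atol=1e-8)  # solver state is a plain JAX pytree
sol = optx.minimise(rosenbrock, solver, jnp.array([2.0, 2.0]))
print(sol.value)  # approximately [1., 1.]
```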
-
JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation
If you like PyTorch then you might like Equinox, by the way. (https://github.com/patrick-kidger/equinox ; 1.4k GitHub stars now!)
- Equinox: Elegant easy-to-use neural networks in Jax
- Show HN: Equinox (1.3k stars), a JAX library for neural networks and sciML
-
Pytrees
You're thinking of `jax.closure_convert`. :)
(Although technically that works by tracing and extracting all constants from the jaxpr, rather than introspecting the function's closure cells -- it sounds like your trick is the latter.)
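For readers following along, a small illustration of that tracing behaviour; the closed-over constant `a` below is a made-up example:

```python
import jax
import jax.numpy as jnp

a = jnp.array(3.0)

def f(x):
    return a * x  # `a` is captured from the enclosing scope

# Tracing extracts `a` from the jaxpr as an explicit constant, rather than
# inspecting f's closure cells.
converted, consts = jax.closure_convert(f, 1.0)
print(converted(1.0, *consts))  # 3.0, with `a` now passed explicitly
```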
When you discuss dynamic allocation, I'm guessing you're mainly referring to not being able to backprop through `jax.lax.while_loop`. If so, you might find `equinox.internal.while_loop` interesting, which is an unbounded while loop that you can backprop through! The secret sauce is to use a treeverse-style checkpointing scheme.
https://github.com/patrick-kidger/equinox/blob/f95a8ba13fb35...
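A rough sketch of the idea; note that `equinox.internal` is an unstable API, so the exact keyword names here are assumptions based on recent versions:

```python
import equinox.internal as eqxi
import jax.numpy as jnp

def cond_fun(carry):
    _, x = carry
    return x < 100.0  # data-dependent bound: jax.lax.while_loop can't backprop this

def body_fun(carry):
    step, x = carry
    return step + 1, x * 1.5

init = (jnp.array(0), jnp.array(1.0))
# kind="checkpointed" selects the treeverse-style checkpointing scheme that
# makes reverse-mode autodiff through the (unbounded) loop possible.
final_step, final_x = eqxi.while_loop(cond_fun, body_fun, init, kind="checkpointed")
```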
-
Writing Python like it’s Rust
I'm a big fan of using ABCs to declare interfaces -- so much so that I have an improved abc.ABCMeta that also handles abstract instance variables and abstract class variables: https://github.com/patrick-kidger/equinox/blob/main/equinox/_better_abstract.py
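Equinox exposes this publicly as `equinox.AbstractVar` (and `AbstractClassVar`); a small sketch of the pattern, with a made-up `AbstractSolver` class:

```python
import equinox as eqx

class AbstractSolver(eqx.Module):
    # Subclasses must supply `order` as an instance attribute; leaving it
    # unfilled makes the class abstract, much like an abstract method.
    order: eqx.AbstractVar[int]

    def describe(self):
        return f"solver of order {self.order}"

class Euler(AbstractSolver):
    order: int = 1

print(Euler().describe())  # "solver of order 1"
```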
-
[D] JAX vs PyTorch in 2023
For daily research, I use Equinox (https://github.com/patrick-kidger/equinox) as a DL library in JAX.
- [Machinelearning] [D] Current state of JAX vs PyTorch?
-
Training Deep Networks with Data Parallelism in Jax
It sounds like you're concerned about how downstream libraries tend to wrap JAX transformations to handle their own thing? (E.g. `haiku.grad`.)
If so, then allow me to make my usual advert here for Equinox:
https://github.com/patrick-kidger/equinox
This actually works with JAX's native transformations. (There's no `equinox.vmap` for example.)
On higher-order functions more generally, Equinox offers a way to control these quite carefully, by making ubiquitous use of callables that are also pytrees. E.g. a neural network is both a callable in that it has a forward pass, and a pytree in that it records its parameters in its tree structure.
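Concretely, a model built this way passes straight through stock `jax.vmap` and friends; a minimal sketch (the MLP hyperparameters are arbitrary), using Equinox's filter utilities only to skip non-array fields when differentiating:

```python
import equinox as eqx
import jax
import jax.numpy as jnp

model = eqx.nn.MLP(in_size=2, out_size=1, width_size=8, depth=2,
                   key=jax.random.PRNGKey(0))

def loss(model, x, y):
    pred = jax.vmap(model)(x)  # plain jax.vmap: the model is just a pytree
    return jnp.mean((pred - y) ** 2)

# filter_grad differentiates w.r.t. the array leaves of the model pytree.
grads = eqx.filter_grad(loss)(model, jnp.ones((16, 2)), jnp.zeros((16, 1)))
```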
What are some alternatives?
torchdyn - A PyTorch library entirely dedicated to neural differential equations, implicit models and related numerical methods
flax - Flax is a neural network library for JAX that is designed for flexibility.
pysindy - A package for the sparse identification of nonlinear dynamical systems from data
dm-haiku - JAX-based neural network library
tabnet - PyTorch implementation of the TabNet paper: https://arxiv.org/pdf/1908.07442.pdf
torchtyping - Type annotations and dynamic checking for a tensor's shape, dtype, names, etc.
SSD-pytorch - SSD: Single Shot MultiBox Detector pytorch implementation focusing on simplicity
treex - A Pytree Module system for Deep Learning in JAX
NeuralCDE - Code for "Neural Controlled Differential Equations for Irregular Time Series" (Neurips 2020 Spotlight)
extending-jax - Extending JAX with custom C++ and CUDA code
functorch - functorch is JAX-like composable function transforms for PyTorch.
diffrax - Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/