diffrax vs vectorflow
| | diffrax | vectorflow |
|---|---|---|
| Mentions | 21 | 12 |
| Stars | 1,230 | 1,289 |
| Growth | - | 0.3% |
| Activity | 8.3 | 0.0 |
| Latest commit | 4 days ago | 10 months ago |
| Language | Python | D |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
diffrax
- Ask HN: What side projects landed you a job?
-
[P] Optimistix, nonlinear optimisation in JAX+Equinox!
Optimistix has high-level APIs for minimisation, least-squares, root-finding, and fixed-point iteration and was written to take care of these kinds of subroutines in Diffrax.
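For a flavour of those high-level APIs, here is a minimal root-finding sketch. It follows Optimistix's documented `optx.root_find` / `optx.Newton` interface; the toy function, starting point, and tolerances are made up for illustration.
```python
import jax.numpy as jnp
import optimistix as optx

# Illustrative function whose root we want: f(y) = y - tanh(y) - 0.5.
def fn(y, args):
    return y - jnp.tanh(y) - 0.5

solver = optx.Newton(rtol=1e-8, atol=1e-8)  # Newton's method for root-finding
y0 = jnp.array([1.0])                       # arbitrary initial guess
sol = optx.root_find(fn, solver, y0)
print(sol.value)                            # the root located by the solver
```
The same pattern applies to the other entry points (minimisation, least-squares, fixed points): pick a solver, pass it with the function and initial value to the corresponding top-level call.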
-
Show HN: Optimistix: Nonlinear Optimisation in Jax+Equinox
Diffrax (https://github.com/patrick-kidger/diffrax).
Here is the GitHub: https://github.com/patrick-kidger/optimistix
The elevator pitch is that Optimistix is really fast, especially to compile. It
-
Scientific computing in JAX
Sure. So I've got some PyTorch benchmarks here. The main take-away so far has been that for a neural ODE, the backward pass takes about 50% longer in PyTorch, and the forward (inference) pass takes an incredible 100x longer.
-
[D] JAX vs PyTorch in 2023
FWIW this worked for me. :D My full-time job is now writing JAX libraries at Google. Equinox for neural networks, Diffrax for differential equation solvers, etc.
-
Returning to snake's nest after a long journey, any major advances in python for science ?
It's relatively early days yet, but JAX is in the process of developing its nascent scientific computing / scientific machine learning ecosystem. Mostly because of its strong autodifferentiation capabilities, excellent JIT compiler etc. (E.g. to show off one of my own projects, Diffrax is the library of diffeq solvers for JAX.)
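The two capabilities mentioned there, autodifferentiation and JIT compilation, compose directly in JAX. A minimal, self-contained sketch (the objective function is made up purely for illustration):
```python
import jax
import jax.numpy as jnp

@jax.jit                       # compile the whole function with XLA
def loss(theta, x):
    # A tiny scalar objective, illustrative only.
    return jnp.sum((jnp.sin(theta * x) - x) ** 2)

grad_loss = jax.jit(jax.grad(loss))   # gradient w.r.t. theta, also compiled

x = jnp.linspace(0.0, 1.0, 100)
print(loss(0.3, x), grad_loss(0.3, x))
```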
-
What's the best thing/library you learned this year ?
Diffrax - solving ODEs with JAX and computing its derivatives automatically; functools - love partial and lru_cache; fastprogress - simpler progress bar than tqdm
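For completeness, the two functools utilities named above are in the Python standard library; a tiny usage sketch (the `solve` function is a made-up placeholder):
```python
from functools import partial, lru_cache

# partial: pre-bind arguments to make a specialised function.
def solve(method, rtol, problem):
    return f"{method} on {problem} (rtol={rtol})"

fast_solve = partial(solve, "dopri5", 1e-3)
print(fast_solve("my_ode"))          # "dopri5 on my_ode (rtol=0.001)"

# lru_cache: memoise a pure function so repeated calls are free.
@lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(80))
```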
-
PyTorch 2.0
At least prior to this announcement: JAX was much faster than PyTorch for differentiable physics. (Better JIT compiler; reduced Python-level overhead.)
E.g. for numerical ODE simulation, I've found that Diffrax (https://github.com/patrick-kidger/diffrax) is ~100 times faster than torchdiffeq on the forward pass. The backward pass is much closer, and for this Diffrax is about 1.5 times faster.
It remains to be seen how PyTorch 2.0 will compare, of course!
Right now my job is actually building out the scientific computing ecosystem in JAX, so feel free to ping me with any other questions.
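As a concrete illustration of the kind of workload being benchmarked above, here is a minimal Diffrax solve that can be differentiated end-to-end with `jax.grad`. The calls follow Diffrax's documented `diffeqsolve` API; the vector field and parameter value are made up for the example.
```python
import jax
import jax.numpy as jnp
import diffrax

def vector_field(t, y, args):
    # Simple linear decay dy/dt = -theta * y (illustrative).
    theta = args
    return -theta * y

def final_state(theta):
    sol = diffrax.diffeqsolve(
        diffrax.ODETerm(vector_field),
        diffrax.Dopri5(),          # an explicit Runge-Kutta solver
        t0=0.0, t1=1.0, dt0=0.01,
        y0=jnp.array([1.0]),
        args=theta,
    )
    return sol.ys[-1, 0]           # state at t1

# Differentiate the ODE solution with respect to the parameter theta.
print(jax.grad(final_state)(2.0))
```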
-
Python 3.11 is much faster than 3.8
https://github.com/patrick-kidger/equinox
https://github.com/patrick-kidger/diffrax
Which are neural network and differential equation libraries for JAX.
[Obligatory I-am-a-googler-my-opinions-do-not-represent-my-employer...]
-
Ask HN: What's your favorite programmer niche?
Autodifferentiable programming!
Neural networks are the famous example of this, of course -- but this can be extended to all of scientific computing. ODE/SDE solvers, root-finding algorithms, LQP, molecular dynamics, ...
These days I'm doing all my work in JAX. (E.g. see Equinox or Diffrax: https://github.com/patrick-kidger/equinox, https://github.com/patrick-kidger/diffrax). A lot of modern work is now based around hybridising such techniques with neural networks.
I'd really encourage anyone interested to learn how JAX works under-the-hood as well. (Look up "autodidax") Lots of clever/novel ideas in its design.
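To make the "hybridising with neural networks" point concrete, here is a small sketch with Equinox: a multilayer perceptron whose parameters can be differentiated like any other JAX value. It follows Equinox's documented `eqx.nn.MLP` / `eqx.filter_grad` API; the sizes and data are arbitrary.
```python
import jax
import jax.numpy as jnp
import equinox as eqx

key = jax.random.PRNGKey(0)
# A small MLP: 2 inputs -> scalar output (sizes chosen arbitrarily).
model = eqx.nn.MLP(in_size=2, out_size="scalar", width_size=32, depth=2, key=key)

def loss(model, xs, ys):
    preds = jax.vmap(model)(xs)   # Equinox modules act on single samples; vmap over the batch
    return jnp.mean((preds - ys) ** 2)

xs = jnp.ones((8, 2))
ys = jnp.zeros((8,))
grads = eqx.filter_grad(loss)(model, xs, ys)   # gradients w.r.t. the model's arrays
```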
vectorflow
-
Programming languages endorsed for server-side use at Meta
>> Mozilla (of course)
Mozilla is a C++ and JavaScript shop. What do they ship in Rust? How much of Firefox is written in Rust, for example?
>> Microsoft, Meta, Google/Alphabet, Amazon
Large firms have lots of devs and consequently lots of toy projects. Is their usage of Rust more significant than their use of D? I mean, Meta was churning out projects in D a while back (warp, flint, etc.) and looked like it might be going all in at one point (they even hired one of the leads on the D language).
>> That's practically all of FAANG
Who were we missing? Netflix? They've dabbled with D too: https://github.com/Netflix/vectorflow
Don't misunderstand my point - it's not that D is more popular than Rust, it's that Rust is not used for real work in any significant capacity yet.
Where's the big project written in Rust? Servo and the Rust compiler are the only two large Rust projects on GitHub.
-
Cloud TPU VMs are generally available
Thanks Zak, already applied.
Just wondering, does the TPU VM support Vectorflow?
https://github.com/Netflix/vectorflow
- Vectorflow is a minimalist neural network library optimized for sparse data and single machine environments open sourced by Netflix (r/MachineLearning)
- [P] Vectorflow is a minimalist neural network library optimized for sparse data and single machine environments open sourced by Netflix
- Vectorflow is a minimalist neural network library optimized for sparse data and single machine environments open sourced by Netflix
- Vectorflow: Minimalist neural network library faster than TensorFlow in D
-
Small Neural networks in Julia 5x faster than PyTorch
A library I designed a few years ago (https://github.com/Netflix/vectorflow) is also much faster than PyTorch/TensorFlow in these cases.
In "small" or "very sparse" setups, you're memory bound, not compute bound. TF and PyTorch are bad at that because they assume memory movements are worth it and do very few in-place operations.
Different tools for different jobs.
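A tiny NumPy sketch of the memory-traffic point being made above. This is not vectorflow code, just an illustration of in-place versus out-of-place updates in a memory-bound setting.
```python
import numpy as np

a = np.ones(10_000_000, dtype=np.float32)
b = np.ones(10_000_000, dtype=np.float32)

# Out-of-place: allocates a new 40 MB array and streams a, b, and the result
# through memory - three arrays' worth of traffic.
c = a + b

# In-place: writes the result back into a, so only two arrays are touched.
np.add(a, b, out=a)
```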
What are some alternatives?
deepxde - A library for scientific machine learning and physics-informed learning
tiny-cuda-nn - Lightning fast C++/CUDA neural network framework
dcompute - DCompute: Native execution of D on GPUs and other Accelerators
flax - Flax is a neural network library for JAX that is designed for flexibility.
LeNetTorch - PyTorch implementation of LeNet for fitting MNIST for benchmarking.
juliaup - Julia installer and version multiplexer
equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
blis - BLAS-like Library Instantiation Software Framework
dm-haiku - JAX-based neural network library
ugrep - NEW ugrep 5.1: an ultra fast, user-friendly, compatible grep. Ugrep combines the best features of other grep, adds new features, and searches fast. Includes a TUI and adds Google-like search, fuzzy search, hexdumps, searches nested archives (zip, 7z, tar, pax, cpio), compressed files (gz, Z, bz2, lzma, xz, lz4, zstd, brotli), pdfs, docs, and more