jax-md vs jax-experiments

| | jax-md | jax-experiments |
|---|---|---|
| Mentions | 2 | 1 |
| Stars | 1,093 | 2 |
| Growth | - | - |
| Activity | 7.5 | 3.5 |
| Last commit | 17 days ago | 8 months ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
jax-md
- JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation
- PyTorch 2.0

  On the other hand, there is just no MD implemented with PyTorch. [1]

  [1]: https://github.com/jax-md/jax-md
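Picking up on that comment: for context, here is what an MD loop looks like in JAX via jax-md. This is a minimal sketch following the pattern of the library's quickstart (free space, a soft-sphere pair potential, an NVE integrator); treat the exact signatures, in particular `init_fn`'s `kT` argument and the step size, as assumptions rather than a verified example.

```python
import jax.numpy as jnp
from jax import random, jit
from jax_md import space, energy, simulate

# Displacement/shift functions for free (non-periodic) space.
displacement_fn, shift_fn = space.free()

# Purely repulsive soft-sphere pair potential over all particle pairs.
energy_fn = energy.soft_sphere_pair(displacement_fn)

# NVE (constant-energy) integrator; dt is an illustrative choice.
init_fn, apply_fn = simulate.nve(energy_fn, shift_fn, dt=1e-3)

key = random.PRNGKey(0)
positions = random.uniform(key, (64, 2)) * 8.0  # 64 particles in 2D

state = init_fn(key, positions, kT=1e-3)  # kT sets the initial momenta (assumed keyword)
step = jit(apply_fn)
for _ in range(1000):
    state = step(state)
```

Because the energy function is ordinary JAX code, the same `energy_fn` can be differentiated to get forces and jit-compiled to GPU/TPU, which is the point the comment is making about PyTorch lacking an equivalent.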
jax-experiments
- JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation

  JAX is super useful for scientific computing, although n-body sims might not be the best application. A naive n-body sim is very easy to implement and accelerate in JAX (here's my version: https://github.com/PWhiddy/jax-experiments/blob/main/nbody.i...), but it can be tricky to scale. Efficient n-body sims usually rely on either trees or spatial hashing/sorting, which are tricky to implement efficiently in JAX.
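To make that comment concrete, here is a minimal sketch of the naive O(N²) approach in plain JAX. It is not the linked notebook (whose URL is truncated above); the softening length, step size, and particle count are illustrative assumptions.

```python
import jax.numpy as jnp
from jax import jit, random

def accelerations(pos, mass, eps=1e-3):
    """Pairwise gravitational accelerations (G = 1), O(N^2) in particles."""
    # rij[i, j] = pos[j] - pos[i]: displacement from particle i to particle j.
    rij = pos[None, :, :] - pos[:, None, :]
    # Softened inverse-cube distances; eps avoids the i == j singularity.
    inv_r3 = (jnp.sum(rij**2, axis=-1) + eps**2) ** -1.5
    # a_i = sum_j m_j * r_ij / |r_ij|^3
    return jnp.einsum('ij,j,ijk->ik', inv_r3, mass, rij)

@jit
def leapfrog_step(pos, vel, mass, dt=1e-2):
    """One kick-drift-kick leapfrog step."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    return pos, vel

key = random.PRNGKey(0)
n = 256
pos = random.normal(key, (n, 3))
vel = jnp.zeros((n, 3))
mass = jnp.ones((n,))

for _ in range(100):
    pos, vel = leapfrog_step(pos, vel, mass)
```

Every step is a fixed-shape dense computation, which is exactly why `jit` accelerates it so easily; by contrast, Barnes-Hut trees or cell lists require data-dependent indexing and ragged structures, which is the scaling difficulty the comment points to.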
What are some alternatives?
torchmd - End-To-End Molecular Dynamics (MD) Engine using PyTorch
jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
jaxonnxruntime - A user-friendly tool chain that enables the seamless execution of ONNX models using JAX as the backend.
thinc - 🔮 A refreshing functional take on deep learning, compatible with your favorite libraries
tinygrad - You like pytorch? You like micrograd? You love tinygrad! ❤️ [Moved to: https://github.com/tinygrad/tinygrad]
equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
diffrax - Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/
autograd - Efficiently computes derivatives of numpy code.