jax-md vs jaxonnxruntime

| | jax-md | jaxonnxruntime |
|---|---|---|
| Mentions | 2 | 1 |
| Stars | 1,093 | 71 |
| Growth | - | - |
| Activity | 7.5 | 8.0 |
| Last commit | 17 days ago | 24 days ago |
| Language | Jupyter Notebook | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - the month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
jax-md

Posts mentioning jax-md:

- JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation
- PyTorch 2.0: "On the other hand, there is just no MD implemented with PyTorch."

[1]: https://github.com/jax-md/jax-md
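The appeal of building a molecular dynamics engine on JAX, rather than on PyTorch, comes from getting forces "for free" as the gradient of an energy function. A minimal sketch of that idea in plain JAX (the toy harmonic-chain potential and positions here are made up for illustration; jax-md's own API is not used):

```python
import jax
import jax.numpy as jnp


def harmonic_energy(positions):
    # Toy potential: springs of rest length 1.0 between consecutive particles.
    bonds = positions[1:] - positions[:-1]
    lengths = jnp.sqrt(jnp.sum(bonds**2, axis=-1))
    return 0.5 * jnp.sum((lengths - 1.0) ** 2)


# In MD, forces are the negative gradient of the energy w.r.t. positions;
# jax.grad computes this automatically from the energy function alone.
forces = jax.grad(lambda r: -harmonic_energy(r))

positions = jnp.array([[0.0, 0.0], [1.5, 0.0], [3.0, 0.0]])
print(harmonic_energy(positions))  # both bonds are stretched by 0.5
print(forces(positions))           # end particles pulled inward, middle one balanced
```

This is only a sketch of the autodiff pattern jax-md builds on; the library itself layers spaces, neighbor lists, and integrators on top of energy functions like this one.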
jaxonnxruntime
What are some alternatives?
- torchmd - End-to-end molecular dynamics (MD) engine using PyTorch
- autograd - Efficiently computes derivatives of numpy code.
- tinygrad - You like pytorch? You like micrograd? You love tinygrad! ❤️ [Moved to: https://github.com/tinygrad/tinygrad]
- equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
- jax-experiments
- thinc - 🔮 A refreshing functional take on deep learning, compatible with your favorite libraries
- diffrax - Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/
- jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
- faster-cpython - How to make CPython faster.