- diffrax: numerical differential equation solvers in JAX; autodifferentiable and GPU-capable (https://docs.kidger.site/diffrax/);
- sympy2jax: SymPy->JAX conversion;
- jaxtyping: rich shape & dtype annotations for arrays and tensors (also supports PyTorch/TensorFlow/NumPy);
- Eqxvision: computer vision.
Sure. So I've got some PyTorch benchmarks here. The main takeaway so far has been that for a neural ODE, the backward pass takes about 50% longer in PyTorch, and the forward (inference) pass takes an incredible 100x longer.