| | jax | dex-lang |
|---|---|---|
| Mentions | 89 | 26 |
| Stars | 31,945 | 1,611 |
| Growth | 1.6% | 0.4% |
| Activity | 10.0 | 0.0 |
| Latest commit | 2 days ago | 3 months ago |
| Language | Python | Haskell |
| License | Apache License 2.0 | BSD 3-clause "New" or "Revised" License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
jax
- I want a good parallel computer
- Show HN: Localscope–Limit scope of Python functions for reproducible execution
localscope is a small Python package that disassembles functions to check whether they access global variables they shouldn't. I wrote it a few years ago to catch scope bugs, which are common in Jupyter notebooks. It's recently come in handy for writing jax code (https://github.com/jax-ml/jax) because JAX requires pure functions. Thought I'd share.
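For illustration (not from the linked post), a minimal sketch of the idea, assuming localscope's `@localscope` decorator rejects functions that read non-local variables as its README describes; the exact error type and defaults may differ:

```python
import jax
import jax.numpy as jnp
from localscope import localscope

scale = 3.0  # global state that would silently leak into a jitted function

try:
    @localscope
    def bad_fn(x):
        return scale * x  # reads the global `scale`
except Exception as err:  # localscope flags the stray global when the function is defined
    print("rejected:", err)

@jax.jit
def good_fn(x, scale):
    return scale * x  # pure: everything comes in through the arguments

print(good_fn(jnp.arange(3.0), 3.0))
```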
- Zest
- KlongPy: High-Performance Array Programming in Python
If you like high-performance array programming a la "numpy with a JIT", I suggest looking at JAX. It's very suitable for general numeric computing (not just ML) and has a very mature ecosystem.
https://github.com/jax-ml/jax
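To make the "numpy with a JIT" point concrete, here is a small example (not taken from the comment above) that uses jax.numpy, jax.jit, and jax.grad for ordinary numerics rather than ML; the integration task is just an illustrative choice:

```python
import jax
import jax.numpy as jnp

# Plain numerics, not ML: compiled trapezoidal integration of samples y with spacing dx.
@jax.jit
def trapezoid(y, dx):
    return dx * (jnp.sum(y) - 0.5 * (y[0] + y[-1]))

x = jnp.linspace(0.0, jnp.pi, 10_001)
print(trapezoid(jnp.sin(x), x[1] - x[0]))   # ~2.0

# The same machinery differentiates ordinary numeric code.
f = lambda a: trapezoid(jnp.sin(a * x), x[1] - x[0])
print(jax.grad(f)(1.0))
```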
- PyTorch is dead. Long live Jax
Nope, changing graph shape requires recompilation: https://github.com/google/jax/discussions/17191
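A quick way to see the recompilation behaviour the comment refers to: a Python-level side effect inside a jitted function runs only while JAX traces it, so each print below marks a fresh compilation. A minimal sketch:

```python
import jax
import jax.numpy as jnp

@jax.jit
def f(x):
    # This print runs only during tracing/compilation, not on cached calls.
    print("tracing for shape", x.shape)
    return (x * x).sum()

f(jnp.ones(3))   # tracing for shape (3,)
f(jnp.ones(3))   # cached: no new trace
f(jnp.ones(4))   # tracing for shape (4,)  -> new shape, new compilation
```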
- cuDF – GPU DataFrame Library
- Rebuilding TensorFlow 2.8.4 on Ubuntu 22.04 to patch vulnerabilities
I found a GitHub issue that seemed similar (missing ptxas) and saw a suggestion to install nvidia-cuda-toolkit. Alright, but that exploded the container size from 6.5 GB to 12.13 GB … unacceptable 😤 (Incidentally, this is too large for Cloud Shell to build on its limited persistent disk.)
- The Elements of Differentiable Programming
The dual numbers exist just as surely as the real numbers and have been used for well over 100 years.
https://en.m.wikipedia.org/wiki/Dual_number
PyTorch has had them for many years.
https://pytorch.org/docs/stable/generated/torch.autograd.for...
JAX implements them and uses them exactly as stated in this thread.
https://github.com/google/jax/discussions/10157#discussionco...
As you so eloquently stated, "you shouldn't be proclaiming things you don't actually know on a public forum," and doubly so when your claimed "corrections" are so demonstrably and totally incorrect.
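For readers following the thread: JAX's forward mode (`jax.jvp`) is dual-number arithmetic in practice; it pushes a (primal, tangent) pair through the function. A small sketch (the function `f` here is just an illustrative choice):

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) * x**2

# Forward-mode AD: push a (primal, tangent) pair through f,
# i.e. dual-number arithmetic with x + eps*v where eps**2 = 0.
x, v = 1.5, 1.0
primal, tangent = jax.jvp(f, (x,), (v,))

# Hand-derived check for this particular f:
# d/dx [sin(x) * x^2] = cos(x) * x^2 + 2x * sin(x)
manual = jnp.cos(x) * x**2 + 2 * x * jnp.sin(x)
print(primal, tangent, manual)   # tangent == manual (up to float error)
```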
- Julia GPU-based ODE solver 20x-100x faster than those in Jax and PyTorch
On your last point, as long as you jit the topmost level, it doesn't matter whether or not you have inner jitted functions. The end result should be the same.
Source: https://github.com/google/jax/discussions/5199#discussioncom...
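A minimal sketch of the nesting point: an inner `jax.jit` call is traced into the enclosing jitted computation, so jitting the top level is what determines the end result (function names here are illustrative):

```python
import jax
import jax.numpy as jnp

@jax.jit
def inner(x):
    return jnp.tanh(x) + 1.0

@jax.jit
def outer_with_inner_jit(x):
    return inner(x).sum()   # inner jit is traced into this top-level jit

@jax.jit
def outer_plain(x):
    return (jnp.tanh(x) + 1.0).sum()

x = jnp.arange(8.0)
# Both top-level jits compile the whole computation, so the results match
# and performance is essentially the same.
print(outer_with_inner_jit(x), outer_plain(x))
```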
- Apple releases MLX for Apple Silicon
The design of MLX is inspired by frameworks like NumPy, PyTorch, Jax, and ArrayFire.
dex-lang
- Bayesian Neural Networks
I'm still excited by Dex (https://github.com/google-research/dex-lang/) and still write code in it! I have a bunch of demos and fixes written, and am just waiting for Dougal to finish his latest rewrite before I can merge them.
- Thinking in an Array Language
A really nice approach to this I've seen recently is Google's research on [Dex](https://github.com/google-research/dex-lang).
- Function Composition in Programming Languages – Conor Hoekstra – CppNorth 2023 [video]
- Dex Lang: Research language for array processing in the Haskell/ML family
- [D] Have there been any attempts to create a programming language specifically for machine learning?
Dex
- [D] PyTorch 2.0 Announcement
Have you tried Dex? https://github.com/google-research/dex-lang It is in a relatively early stage, but it is exploring some interesting parts of the design space.
- Mangle, a programming language for deductive database programming
- Looking for languages that combine algebraic effects with parallel execution
I think [Dex](https://github.com/google-research/dex-lang) might be along the lines of what you're looking for, although its focus is on SIMD GPU-style parallelism rather than thread-level parallelism.
- “Why I still recommend Julia”
Dex proves indexing correctness, including inside loops, without a full dependent type system.
See: https://github.com/google-research/dex-lang/pull/969
What are some alternatives?
Numba - NumPy aware dynamic Python compiler using LLVM
Co-dfns - High-performance, Reliable, and Parallel APL
julia - The Julia Programming Language
CIPs - Cardano Improvement Proposals (CIPs)
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
futhark - A data-parallel functional programming language