The Dex Programming Language: "Getting to the Point. Index Sets and Parallelism-Preserving Autodiff for Pointful Array Programming"
1 project | reddit.com/r/ProgrammingLanguages | 25 Nov 2021
Should I learn Haskell or C first in my situation?
2 projects | reddit.com/r/haskell | 9 Nov 2021
Is C the only right option for implementing an array language?
3 projects | reddit.com/r/apljk | 24 Aug 2021
Not sure if you consider Futhark or Dex to be array languages, but both are written in Haskell. Dex compiles to LLVM.
Questions for Charles Hoskinson - post from Lex Fridman
4 projects | reddit.com/r/cardano | 10 Jun 2021
Good luck parsing mentions of Dex to distinguish them from Decentralized Exchanges (DEX).
So now we are seeing ML research tools like Dex, a research language implemented in Haskell and designed to give array programming the benefits of a functional language.
Matrix Multiplication Inches Closer to Mythic Goal
3 projects | news.ycombinator.com | 23 Mar 2021
Not that it has bindings to other tools, but it sounds like Dex would be relevant to your interests!
Dex: A research language for array processing in the Haskell/ML family
1 project | reddit.com/r/apljk | 11 Mar 2021
Looking to get into Parallel Computing and System Programming Research
3 projects | reddit.com/r/Compilers | 11 Mar 2021
Dex is a research language for typed, functional array processing
2 projects | news.ycombinator.com | 11 Mar 2021
[N] PyTorch 1.8 Release, including Compiler and Distributed Training updates, New Mobile Tutorials
7 projects | reddit.com/r/MachineLearning | 4 Mar 2021
When I've interacted with this kind of thing before -- for example the PyTorch JIT (which I believe also parses an AST to produce an IR? Is this the same parser/IR as fx or different?), the JAX JIT, or something like Zygote in Julia -- I've always hit these kinds of issues. I think the fundamental problem is choosing to parse an AST for something inherently more flexible, rather than building a graph-based DSL a la TensorFlow v1 (despite its flaws), or, though I've not tried it, maybe building something like Dex.
PyTorch: Where we are headed and why it looks a lot like Julia (but not exactly)
19 projects | news.ycombinator.com | 26 Nov 2021
JAX on WSL2 - The "Couldn't read CUDA driver version." problem.
1 project | reddit.com/r/JAX | 18 Nov 2021
As noted here, the path (file):
Show HN: How does Jax allocate memory on a TPU? An interactive C++ walkthrough
4 projects | news.ycombinator.com | 6 Nov 2021
> The downside of Jax is it’s not easy to debug. PyTorch, for better or for worse, will actually run your Python code as you wrote it.
Hmm. Jax's ease of debugging was the very first thing that caught my attention: https://blog.gpt4.org/jaxtpu#:~:text=pdb.set_trace()
> I ran it on the TPU VM, saw the loss curve go down, and it was like an electric shock. "Wow! That actually... worked? Huh. that's weird. Things never work on the first try. I'm impressed."
> Then I plopped `import pdb; pdb.set_trace()` in the middle of the `loss` function and ran it again. It dropped me into the Python debugger.
> There was a tensor named `X_bt`. I typed `X_bt`. The debugger printed the value of `X_bt`.
> I was able to print out all the values of every variable, just like you'd expect Python to be able to do.
> There was a tensor named `Y_bt`. I typed `X_bt + Y_bt`. I was now staring at exactly what I expected: the sum of those two tensors.
> I could write `x + y`, or create new variables, or anything else I wanted.
> Now I was real impressed.
> If it sounds weird that I'm so easily impressed, it's because, you gotta understand: until now, TPUs were a complete pain in the ass to use. I kept my feelings to myself, because I understood that the Cloud TPU team were working hard to improve TPUs, and the TFRC support team was wonderful, and I had so many TPUs to play with. But holy moly, if you were expecting any of the above examples to just work on the first try when using Tensorflow V1 on TPUs, you were in for a rude awakening. And if you thought "Well, Tensorflow v2 is supposedly a lot better, right? Surely I'll be able to do basic things without worrying...."
> ... no. Not even close. Not until Jax + TPU VMs.
In the subsequent year, it's been nothing but joy.
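The debugging workflow quoted above is just ordinary Python: outside of `jit`, JAX arrays are concrete values, so a standard `pdb` breakpoint can inspect and combine them. A minimal sketch, with the tensor names (`X_bt`, `Y_bt`) borrowed from the quote:

```python
import jax.numpy as jnp

# In eager (non-jit) JAX, arrays are concrete: a debugger stopped here
# can print X_bt, evaluate X_bt + Y_bt, or create new variables.
X_bt = jnp.ones((2, 3))
Y_bt = jnp.arange(6.0).reshape(2, 3)

# import pdb; pdb.set_trace()  # typing `X_bt + Y_bt` at the pdb prompt
#                              # prints the actual sum of the two tensors
Z = X_bt + Y_bt
print(Z)  # [[1. 2. 3.] [4. 5. 6.]]
```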
If the problem is that you want to see tensor values in a JIT'ed function, use a host callback. You can run actual Python wherever you want: https://jax.readthedocs.io/en/latest/jax.experimental.host_c...
> This module introduces the host callback functions call(), id_tap(), and id_print(), that send their arguments from the device to the host and invoke user-defined Python functions on the host, optionally returning results back to the device computation.
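Following the host_callback docs quoted above, a minimal sketch of `id_print` inside a `jit`'ed function (API as of late-2021 JAX; newer releases expose the same idea as `jax.debug.print`):

```python
import jax
import jax.numpy as jnp
from jax.experimental import host_callback as hcb

@jax.jit
def loss(x):
    # id_print sends x from the device to the host, prints it there,
    # and returns it unchanged to the device computation.
    x = hcb.id_print(x, what="x inside jit")
    return jnp.sum(x ** 2)

loss(jnp.arange(3.0))  # host prints the tensor even though loss is jit'ed
```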
The nice part is, there's no "magic" under the hood. If you get a chance, I highly recommend reading through Autodidax: https://jax.readthedocs.io/en/latest/autodidax.html
Autodidax is a pure-python implementation of jax. (Literally in one file, on that page.) It walks you through how every aspect of jax works.
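One quick way to see the machinery Autodidax rebuilds from scratch is `jax.make_jaxpr`, which traces a function and prints its jaxpr intermediate representation — a small sketch:

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) * 2.0

# make_jaxpr traces f and returns its jaxpr, the core IR that
# Autodidax implements in one file of pure Python.
print(jax.make_jaxpr(f)(1.0))
```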
Delightfully, I found a secret branch where autodidax also implements host callbacks: https://github.com/google/jax/blob/effect-types/docs/autodid...
If you scroll to the very bottom of that file, you'll see an example of compiling your own XLA JIT'ed code which subsequently calls back into Python. TPUs do precisely the same thing.
> PyTorch, for better or for worse, will actually run your Python code as you wrote it.
... is also true of jax, to within a rounding error less than "I personally don't mind writing id_print(x) instead of print(x)." :)
[N] Jax now Supports Apple Silicon [CPU ONLY]
1 project | reddit.com/r/MachineLearning | 30 Oct 2021
Check this thread to install jaxlib: https://github.com/google/jax/issues/5501
An Introduction to Probabilistic Programming
2 projects | news.ycombinator.com | 22 Oct 2021
Note that these are not exclusive. You could divide ML into a traditional statistical approach and a probabilistic one that is concerned with deriving the underlying probability distribution. Probabilistic programming is kind of like a domain-specific language for achieving this. There is also differentiable programming, which works on the same principle. There are certainly industrial usages of this paradigm. Look up Pyro (http://pyro.ai/examples/intro_part_i.html) for PPL and JAX (https://github.com/google/jax) for differentiable programming.
[R] Google AI Open Sources ‘FedJAX’, A JAX-based Python Library for Federated Learning Simulations
2 projects | reddit.com/r/MachineLearning | 5 Oct 2021
A new Google study introduces FedJAX, a JAX-based open-source library for federated learning simulations that emphasizes ease of use in research. FedJAX aims to make it faster and easier for researchers to build and evaluate federated algorithms by providing basic building blocks for implementing them, preloaded datasets, models, and algorithms, and fast simulation speed.
[P] Training a spiking NN to produce images
2 projects | reddit.com/r/MachineLearning | 10 Sep 2021
We used the spiking neuron modules in Rockpool to build the network. These particular modules are based on Jax, which gives us compilation to GPU/TPU/CPU as well as automatic differentiation "for free". There are similar modules in Rockpool based on Torch, if you prefer to use a torch training pipeline.
Running AlphaFold on an IBM Power9 cluster?
1 project | reddit.com/r/bioinformatics | 7 Sep 2021
Much of AlphaFold2 is implemented in JAX, and JAX does not have Power9 builds. See this related issue, https://github.com/google/jax/issues/4493, which is still open at the time of writing.
1 project | reddit.com/r/CryptocurrencyICO | 5 Sep 2021
JAX is a numerical computing library that combines NumPy, automatic differentiation, and first-class GPU/TPU support.
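That one-sentence description fits in a few lines — a minimal sketch combining the NumPy-style API, automatic differentiation, and JIT compilation:

```python
import jax
import jax.numpy as jnp

# NumPy-style API
x = jnp.linspace(0.0, 1.0, 5)

# Automatic differentiation: grad of a scalar-valued function
f = lambda v: jnp.sum(v ** 2)
df = jax.grad(f)          # df(v) == 2 * v

# JIT compilation via XLA, targeting CPU/GPU/TPU
fast_f = jax.jit(f)
print(df(x), fast_f(x))
```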
Jax and Haskell
1 project | news.ycombinator.com | 27 Aug 2021
What are some alternatives?
mesh-transformer-jax - Model parallel transformers in JAX and Haiku
futhark - :boom::computer::boom: A data-parallel functional programming language
julia - The Julia Programming Language
hasktorch - Tensors and neural networks in Haskell
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
tensorflow - An Open Source Machine Learning Framework for Everyone
functorch - functorch is a prototype of JAX-like composable function transforms for PyTorch.