flax vs deepmind-research

| | flax | deepmind-research |
|---|---|---|
| Mentions | 10 | 29 |
| Stars | 5,520 | 12,802 |
| Growth | 1.9% | 0.7% |
| Activity | 9.7 | 0.6 |
| Last commit | 6 days ago | 11 days ago |
| Language | Python | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
flax
-
Maxtext: A simple, performant and scalable Jax LLM
Is t5x an encoder/decoder architecture?
Some more general options.
The Flax ecosystem (https://github.com/google/flax?tab=readme-ov-file) or dm-haiku (https://github.com/google-deepmind/dm-haiku) were some of the best-developed communities in the JAX AI field.
Perhaps the “trax” repo? https://github.com/google/trax
Some HF examples https://github.com/huggingface/transformers/tree/main/exampl...
Sadly it seems much of the work is proprietary these days, but one example could be Grok-1, if you customize the details. https://github.com/xai-org/grok-1/blob/main/run.py
-
What is the JAX/Flax equivalent of torch.nn.Parameter?
https://github.com/google/flax/discussions/919 https://flax.readthedocs.io/en/latest/_modules/flax/linen/attention.html
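A minimal sketch of the usual answer (assumed example, not taken from the linked thread): in Flax Linen, the rough counterpart of torch.nn.Parameter is declaring a learnable array with self.param inside a Module.

```python
# Minimal sketch: self.param registers a learnable array in the module's
# "params" collection, analogous to assigning a torch.nn.Parameter attribute.
import jax
import jax.numpy as jnp
import flax.linen as nn

class Scale(nn.Module):
    features: int

    @nn.compact
    def __call__(self, x):
        w = self.param("w", nn.initializers.ones, (self.features,))
        return x * w

variables = Scale(features=3).init(jax.random.PRNGKey(0), jnp.ones((1, 3)))
print(variables["params"]["w"])  # the learnable parameter array
```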
-
Announcing flax 0.2 - A fully featured ECS
Just as an FYI, you might be competing against another big open source project with the same name https://github.com/google/flax
-
Flax: How to use one linen module inside another for training?
I have asked the same question on the Flax discussion page on Github as well.
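Not the thread's accepted answer, but a minimal sketch of the usual pattern: a Linen module used inside another (here inline under @nn.compact) is initialized and trained together with the parent, so all parameters end up in one tree.

```python
# Assumed example: nesting Linen modules; both Dense layers' params live in
# the parent MLP's params tree and are updated by the same optimizer step.
import jax
import jax.numpy as jnp
import flax.linen as nn

class MLP(nn.Module):
    hidden: int
    out: int

    @nn.compact
    def __call__(self, x):
        x = nn.Dense(self.hidden)(x)   # inner module created and called inline
        x = nn.relu(x)
        return nn.Dense(self.out)(x)   # second inner module

params = MLP(hidden=16, out=1).init(jax.random.PRNGKey(0), jnp.ones((1, 4)))["params"]
# params contains Dense_0 and Dense_1 as nested sub-dicts.
```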
-
[D] Should We Be Using JAX in 2022?
What's your favorite Deep Learning API for JAX - Flax, Haiku, Elegy, something else?
-
PyTorch vs. TensorFlow in 2022
As a researcher in RL & ML in a big industry lab, I would say most of my colleagues are moving to JAX [https://github.com/google/jax], which this article kind of ignores. JAX is XLA-accelerated NumPy; it's cool beyond just machine learning, but it only provides low-level linear algebra abstractions. However, you can put something like Haiku [https://github.com/deepmind/dm-haiku] or Flax [https://github.com/google/flax] on top of it and get what the cool kids are using :)
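To make the "accelerated NumPy plus transformations" point concrete, a tiny sketch (toy function, not from the article):

```python
# jax.numpy mirrors NumPy, and jit/grad add XLA compilation and autodiff
# on top of plain array code.
import jax
import jax.numpy as jnp

def loss(w, x, y):
    return jnp.mean((x @ w - y) ** 2)

grad_fn = jax.jit(jax.grad(loss))        # compile and differentiate the same function
w = jnp.zeros(3)
g = grad_fn(w, jnp.ones((8, 3)), jnp.ones(8))
print(g.shape)  # (3,)
```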
- [D] Getting Started with Deep Learning in JAX with Treex in 16 lines
-
[D] JAX learning resources?
- https://github.com/google/flax/tree/main/examples
- Why would I want to develop yet another deep learning framework?
-
[D] Why is tensorflow so hated on and pytorch is the cool kids framework?
Any thoughts on Flax?
deepmind-research
- This A.I. Subculture's Motto: Go, Go, Go. The eccentric pro-tech movement known as "Effective Accelerationism" wants to unshackle powerful A.I., and party along the way.
-
How worried are you about AI taking over music?
Deepmind 63
-
Are there Notebooks of AlphaFold 1?
Found some here and here.
-
Trying to port this non-standard Tensorflow model to Pytorch and not sure if I'm missing anything
I am trying to make a physics-simulation model based on DeepMind's research, with its source code found here https://github.com/deepmind/deepmind-research/tree/master/learning_to_simulate . The thing that mainly confuses me is how to properly implement the embedding situation found at https://github.com/deepmind/deepmind-research/blob/master/learning_to_simulate/learned_simulator.py on lines 78 and 152.
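A hedged sketch, not taken from the repo: if the confusing part is an embedding lookup from integer particle types to feature vectors, the usual PyTorch counterpart of a TensorFlow/Sonnet embed layer is torch.nn.Embedding (sizes below are hypothetical, not read from learned_simulator.py).

```python
# Assumed illustration only; vocabulary size and width are placeholders.
import torch
import torch.nn as nn

NUM_PARTICLE_TYPES = 9   # hypothetical vocabulary size
EMBED_DIM = 16           # hypothetical embedding width

embed = nn.Embedding(NUM_PARTICLE_TYPES, EMBED_DIM)
particle_types = torch.tensor([0, 3, 3, 1])    # one integer id per particle
type_features = embed(particle_types)          # shape (4, EMBED_DIM)
print(type_features.shape)
```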
-
[D] Is it possible to use machine learning to create 3D images for the purpose of 3D printing?
Yes. There's a fair bit of research into using ML to generate 3D models. Earlier work, such as Neural Radiance Fields (NeRF), produces a volumetric representation that can be converted into a voxel model and used for 3D printing, but the result is low resolution, like blowing up a tiny image vs an SVG vector file. However, more recent research can generate polygonal models from a video taken of a real object. Polygonal models are much better for 3D printing.
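A minimal sketch of the volume-to-printable-mesh step this alludes to (assumes a density grid has already been produced, e.g. by sampling a NeRF-style model; the file name is hypothetical):

```python
# Assumed example: marching cubes turns an (N, N, N) density grid into a
# triangle mesh that a slicer can accept for 3D printing.
import numpy as np
from skimage import measure

density = np.load("density_grid.npy")    # hypothetical volume sampled from a model
verts, faces, normals, values = measure.marching_cubes(density, level=0.5)
print(verts.shape, faces.shape)          # mesh vertices and triangle indices
```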
- DeepMind Research – code to accompany DeepMind publications
- Skilful precipitation nowcasting using deep generative models of radar - Dr. Piotr Mirowski - Zoom
-
[R] Skilful precipitation nowcasting using deep generative models of radar - Link to a free online lecture by the author in comments (deepmind research published in nature)
Skilful precipitation nowcasting using deep generative models of radar https://www.nature.com/articles/s41586-021-03854-z https://deepmind.com/blog/article/nowcasting https://github.com/deepmind/deepmind-research/tree/master/nowcasting
-
Deepmind Open-Sources DM21: A Deep Learning Model For Quantum Chemistry
Github: https://github.com/deepmind/deepmind-research/tree/master/density_functional_approximation_dm21
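A rough usage sketch written from memory of the repo's README, so treat the module and class names (density_functional_approximation_dm21, NeuralNumInt, Functional.DM21) as assumptions to verify against the repo: the learned functional plugs into a PySCF DFT calculation by replacing the numerical integrator.

```python
# Hedged sketch of using the DM21 functional with PySCF; check the repo README
# for the exact names before relying on this.
from pyscf import gto, dft
import density_functional_approximation_dm21 as dm21

mol = gto.M(atom="Ne 0.0 0.0 0.0", basis="cc-pVDZ")   # toy single-atom system
mf = dft.RKS(mol)
mf._numint = dm21.NeuralNumInt(dm21.Functional.DM21)  # swap in the learned functional
mf.kernel()                                           # run the SCF calculation as usual
print(mf.e_tot)
```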
-
[P] Choosing a self-supervised learning framework that's easy to use
BYOL - again, it seems that it's not optimized for running on multiple GPUs.
What are some alternatives?
dm-haiku - JAX-based neural network library
jaxline
PyTorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
trax - Trax — Deep Learning with Clear Code and Speed
RETRO-pytorch - Implementation of RETRO, Deepmind's Retrieval based Attention net, in Pytorch
equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
alphafold_pytorch - An implementation of the DeepMind's AlphaFold based on PyTorch for research
Flux.jl - Relax! Flux is the ML library that doesn't make you tensor
swav - PyTorch implementation of SwAV https://arxiv.org/abs/2006.09882
objax
TorchPQ - Approximate nearest neighbor search with product quantization on GPU in pytorch and cuda