| | RETRO-pytorch | deepmind-research |
|---|---|---|
| Mentions | 2 | 29 |
| Stars | 849 | 13,298 |
| Growth | - | 1.0% |
| Activity | 2.8 | 0.0 |
| Latest commit | about 1 year ago | 19 days ago |
| Language | Python | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
RETRO-pytorch
- [D] Any pre-trained retrieval-based language models available?
  There's a GitHub project that an individual put together based on the RETRO paper. If you check out the issues list, there is some info on work toward a pretrained model.
- [D] Is there an open-source implementation of the Retrieval-Enhanced Transformer (RETRO)?
  I'll give it a shot https://github.com/lucidrains/RETRO-pytorch 👍
deepmind-research
- This A.I. Subculture's Motto: Go, Go, Go. The eccentric pro-tech movement known as "Effective Accelerationism" wants to unshackle powerful A.I., and party along the way.
- How worried are you about AI taking over music?
- Are there Notebooks of AlphaFold 1?
  Found some here and here.
- Trying to port this non-standard TensorFlow model to PyTorch and not sure if I'm missing anything
  I am trying to build a physics-simulation model based on DeepMind's research, with source code at https://github.com/deepmind/deepmind-research/tree/master/learning_to_simulate. What mainly confuses me is how to properly implement the embedding setup found at https://github.com/deepmind/deepmind-research/blob/master/learning_to_simulate/learned_simulator.py on lines 78 and 152.
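For the embedding question, a common TF-to-PyTorch mapping is to replace a trainable embedding table plus lookup with `nn.Embedding`. The sketch below is a hypothetical, minimal version of that mapping; the class name, vocabulary size, and embedding size are illustrative assumptions, not taken from the DeepMind source.

```python
# Hypothetical port of a TF-style particle-type embedding to PyTorch.
# Assumes the TF side performs an embedding lookup over integer particle
# types; all names and sizes here are made up for illustration.
import torch
import torch.nn as nn

NUM_PARTICLE_TYPES = 9            # assumed small vocabulary of types
PARTICLE_TYPE_EMBEDDING_SIZE = 16  # assumed embedding width


class ParticleTypeEmbedder(nn.Module):
    def __init__(self):
        super().__init__()
        # nn.Embedding plays the role of a trainable
        # [num_types, embedding_size] variable plus tf.nn.embedding_lookup.
        self.embedding = nn.Embedding(NUM_PARTICLE_TYPES,
                                      PARTICLE_TYPE_EMBEDDING_SIZE)

    def forward(self, particle_types: torch.Tensor) -> torch.Tensor:
        # particle_types: int64 tensor of shape [num_particles]
        return self.embedding(particle_types)


embedder = ParticleTypeEmbedder()
types = torch.tensor([0, 3, 3, 5])  # four particles
features = embedder(types)          # shape [4, 16]
```

The embedded features would then typically be concatenated with the other per-particle inputs before the encoder MLP, but the exact wiring should be checked against the original TF code.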
- [D] Is it possible to use machine learning to create 3D images for the purpose of 3D printing?
  Yes. There's a fair bit of research into using ML to generate 3D models. Early work, like Neural Radiance Fields (NeRF), generated a voxel model, which could be used for 3D printing, but it would be low resolution, like blowing up a tiny image versus an SVG vector file. However, more recent research can generate polygonal models from a video of a real object. Polygonal models are much better suited to 3D printing.
- DeepMind Research – code to accompany DeepMind publications
- Skilful precipitation nowcasting using deep generative models of radar - Dr. Piotr Mirowski - Zoom
- [R] Skilful precipitation nowcasting using deep generative models of radar - Link to a free online lecture by the author in comments (DeepMind research published in Nature)
  Paper: https://www.nature.com/articles/s41586-021-03854-z
  Blog: https://deepmind.com/blog/article/nowcasting
  Code: https://github.com/deepmind/deepmind-research/tree/master/nowcasting
- DeepMind Open-Sources DM21: A Deep Learning Model For Quantum Chemistry
  GitHub: https://github.com/deepmind/deepmind-research/tree/master/density_functional_approximation_dm21
- [P] Choosing a self-supervised learning framework that's easy to use
  BYOL - again, it seems that it's not optimized for running on multiple GPUs.
What are some alternatives?
CoCa-pytorch - Implementation of CoCa, Contrastive Captioners are Image-Text Foundation Models, in Pytorch
jaxline
TorchPQ - Approximate nearest neighbor search with product quantization on GPU in pytorch and cuda
dm-haiku - JAX-based neural network library
faiss - A library for efficient similarity search and clustering of dense vectors.
flax - Flax is a neural network library for JAX that is designed for flexibility.
retomaton - PyTorch code for the RetoMaton paper: "Neuro-Symbolic Language Modeling with Automaton-augmented Retrieval" (ICML 2022)
SHREC2023-ANIMAR - Source codes of team TikTorch (1st place solution) for track 2 and 3 of the SHREC2023 Challenge
alphafold_pytorch - An implementation of DeepMind's AlphaFold based on PyTorch for research
RetGen
swav - PyTorch implementation of SwAV https://arxiv.org/abs/2006.09882
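Several of the alternatives above (faiss, TorchPQ) exist to accelerate nearest-neighbor search, which is also the core retrieval operation in RETRO-style models. As a dependency-free illustration of what those libraries optimize, here is a brute-force sketch; the vectors and query are made up for the example.

```python
# Brute-force nearest-neighbor search: the baseline that libraries like
# faiss and TorchPQ accelerate with indexing and product quantization.
import math


def l2(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def search(database, query, k=2):
    """Return the indices of the k vectors in `database` closest to `query`."""
    # Rank every stored vector by its distance to the query, keep the top k.
    ranked = sorted(range(len(database)), key=lambda i: l2(database[i], query))
    return ranked[:k]


db = [[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]]
print(search(db, [0.9, 1.1]))  # → [1, 0]
```

This scales linearly with database size, which is why retrieval-augmented models with billions of stored chunks rely on approximate indexes instead of exact scans.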