| | Perceiver | EQTransformer |
|---|---|---|
| Mentions | 7 | 1 |
| Stars | 86 | 323 |
| Growth | - | - |
| Activity | 2.6 | 3.2 |
| Latest Commit | over 3 years ago | 7 months ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
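As a rough illustration of "recent commits have higher weight," a recency-weighted score could decay each commit's contribution with its age. This is a hypothetical sketch, not the actual formula behind the activity number above:

```python
# Hypothetical recency-weighted commit score: each commit's contribution
# halves every half_life_weeks. Illustrative only; not the metric used above.
def activity_score(commit_ages_weeks, half_life_weeks=26):
    return sum(0.5 ** (age / half_life_weeks) for age in commit_ages_weeks)

print(activity_score([1, 2, 3, 4]))          # four recent commits -> higher score
print(activity_score([100, 110, 120, 130]))  # same count, much older -> lower score
```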
Perceiver
- I implemented Deepmind's new Perceiver Model
- I Implemented Deepmind's Perceiver Model
- [P] I implemented DeepMind's "Perceiver" in PyTorch
  - Great one, I implemented the Perceiver model too in TensorFlow: https://github.com/Rishit-dagli/Perceiver
- Deepmind's New Perceiver Model
- [P] Implementing Perceiver: General perception with Iterative Attention in TensorFlow
  - The project: https://github.com/Rishit-dagli/Perceiver
- Perceiver, General Perception with Iterative Attention
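The posts above all reimplement the same architecture, so a minimal sketch of the idea may help: a small learned latent array repeatedly cross-attends to the (possibly very large) input, then refines itself with self-attention. This is a simplified, hypothetical illustration (single-head attention, no positional encodings or residual connections), not any of the linked implementations:

```python
# Minimal sketch of the Perceiver's iterative-attention loop (simplified).
import torch
import torch.nn as nn

class TinyPerceiver(nn.Module):
    def __init__(self, input_dim=32, latent_dim=64, num_latents=16, depth=4):
        super().__init__()
        # Fixed-size latent array: compute stays bounded even for huge inputs.
        self.latents = nn.Parameter(torch.randn(num_latents, latent_dim))
        # Cross-attention: latents are the queries, the raw input is keys/values.
        self.cross_attn = nn.MultiheadAttention(
            latent_dim, num_heads=1, kdim=input_dim, vdim=input_dim,
            batch_first=True)
        # Self-attention over the latents processes the distilled information.
        self.self_attn = nn.MultiheadAttention(
            latent_dim, num_heads=1, batch_first=True)
        self.depth = depth

    def forward(self, x):  # x: (batch, seq_len, input_dim)
        z = self.latents.expand(x.size(0), -1, -1)
        for _ in range(self.depth):           # the "iterative attention" loop
            z, _ = self.cross_attn(z, x, x)   # distill the input into latents
            z, _ = self.self_attn(z, z, z)    # refine in latent space
        return z.mean(dim=1)                  # pooled latent for a task head

# Cross-attention costs O(seq_len * num_latents) rather than O(seq_len ** 2),
# which is what lets the Perceiver consume long, raw input arrays.
print(TinyPerceiver()(torch.randn(2, 1000, 32)).shape)  # torch.Size([2, 64])
```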
EQTransformer
- An Unethical Question
  - Here is a specific idea that would be interesting to test. SCITS released an ML P&S picker that is supposedly pretty good. They put the code on GitHub (https://github.com/smousavi05/EQTransformer). A really easy project would be to put it to the test against USGS picks in different areas: pull IRIS data for a region that doesn't normally get much attention, run the picker, and see how its detections compare to the published catalog. Software like this always makes bold claims, so it would be nice to have independent, verified tests of it. You would just need to download the package (it's already on PyPI, so you can pip install it), learn how to run it, and test it against some real data. It would make for a decent paper, actually.
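The workflow this comment proposes maps onto a few calls. A minimal sketch, assuming EQTransformer's documented mseed_predictor entry point (parameter names follow the project's README; all paths and thresholds here are placeholder assumptions, and the waveform data is assumed to have been fetched from IRIS beforehand, e.g. with obspy or EQTransformer's downloader utilities):

```python
# Sketch of the proposed test, not a verified pipeline. Assumes continuous
# miniSEED for the study region already sits in downloads_mseeds/.
from EQTransformer.core.mseed_predictor import mseed_predictor

mseed_predictor(
    input_dir="downloads_mseeds",       # placeholder path to the waveform data
    input_model="EqT_model.h5",         # pretrained model from the repository
    stations_json="station_list.json",  # station metadata for the region
    output_dir="detections",
    detection_threshold=0.3,            # assumed values; tune for the region
    P_threshold=0.1,
    S_threshold=0.1,
)
```

The pick times written to the output directory could then be matched, within some tolerance window, against USGS/ANSS catalog picks for the same period to quantify agreement.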
What are some alternatives?
Swin-Transformer-Object-Detection - This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows" on Object Detection and Instance Segmentation.
PCGrad - Code for "Gradient Surgery for Multi-Task Learning"
performer-pytorch - An implementation of Performer, a linear attention-based transformer, in PyTorch
paraphraser - Paraphrase generation at the sentence level
Fast-Transformer - An implementation of Fastformer: Additive Attention Can Be All You Need, a Transformer Variant in TensorFlow
enformer-pytorch - Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch
deepmind-perceiver - My implementation of DeepMind's Perceiver
x-transformers - A concise but complete full-attention transformer with a set of promising experimental features from various papers
TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
RWKV-LM - RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable). So it's combining the best of RNN and transformer - great performance, fast inference, saves VRAM, fast training, "infinite" ctx_len, and free sentence embedding.
gato - Unofficial Gato: A Generalist Agent