EQTransformer vs PCGrad

| | EQTransformer | PCGrad |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 327 | 303 |
| Growth | - | - |
| Activity | 3.2 | 0.0 |
| Last commit | 8 months ago | over 4 years ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits are weighted more heavily than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
EQTransformer
An Unethical Question
Here is a specific idea that would be interesting to test. SCITS released an ML P- and S-wave picker that is supposedly pretty good, and the code is on GitHub (https://github.com/smousavi05/EQTransformer). A really easy project would be to put it to the test against USGS picks in different areas: pull IRIS data for a region that doesn't normally get much attention, run the picker there, and see how the results compare to the published catalog. Software like this always makes bold claims, so independent, verified tests of it would be valuable. You would just need to download the package (it's already on PyPI, so you can pip install it), learn how to run it, and test it against some real data. It would actually make for a decent paper.
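The comparison step the poster describes could look roughly like the sketch below: after `pip install EQTransformer` and running the picker on continuous data for a region, fetch the published catalog for that region and time window from the USGS FDSN event service with ObsPy and check how many cataloged origins line up with the picker's detections. This is only a sketch under assumptions: how the detection times are loaded (here, a plain list of `UTCDateTime` objects), the example region, and the 10-second matching tolerance are all placeholder choices, not part of EQTransformer itself.

```python
# Rough sketch of checking detections against the published USGS catalog.
# Assumes you already ran the picker and parsed its output into a list of
# obspy UTCDateTime detection times; region, window, and tolerance are
# placeholders you would replace with your own study area.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client


def fetch_usgs_origins(t0, t1, min_lat, max_lat, min_lon, max_lon):
    """Download the published catalog for a lat/lon box and time window."""
    client = Client("USGS")
    catalog = client.get_events(starttime=t0, endtime=t1,
                                minlatitude=min_lat, maxlatitude=max_lat,
                                minlongitude=min_lon, maxlongitude=max_lon)
    return [ev.preferred_origin() or ev.origins[0] for ev in catalog]


def recall_against_catalog(detection_times, origins, tol_s=10.0):
    """Fraction of cataloged origins with at least one detection within tol_s seconds."""
    matched = sum(
        any(abs(det - org.time) <= tol_s for det in detection_times)
        for org in origins
    )
    return matched / len(origins) if origins else float("nan")


# Example usage with placeholder values:
# origins = fetch_usgs_origins(UTCDateTime("2020-01-01"), UTCDateTime("2020-01-08"),
#                              35.0, 36.0, -118.0, -117.0)
# detections = [...]  # UTCDateTime objects parsed from the picker's output files
# print("catalog recall:", recall_against_catalog(detections, origins))
```

Catalog recall is only one side of the test; counting detections that have no catalog counterpart (potential new events or false positives) would be the natural complement.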
PCGrad
Help with studying AI in go
Let me see if I can find a public example: the other day I was trying some experiments with PCGrad, so I looked at the code and ran into this line:
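The specific line the post refers to is not reproduced here. For background only, PCGrad ("Gradient Surgery for Multi-Task Learning") combines per-task gradients by projecting each one off the direction of any gradient it conflicts with (negative dot product) before summing. The following is a minimal sketch of that projection step over flattened gradient tensors, not code from the PCGrad or Pytorch-PCGrad repositories.

```python
# Minimal sketch of PCGrad's core projection step on flattened per-task
# gradient vectors. Illustrative only, not the repository's implementation:
# if two task gradients conflict (negative dot product), remove the
# conflicting component of one by projecting it off the other.
import random
import torch


def pcgrad_combine(task_grads):
    """task_grads: list of 1-D tensors, one flattened gradient per task.
    Returns a single combined gradient after the projections."""
    projected = []
    for i, g_i in enumerate(task_grads):
        g = g_i.clone()
        others = task_grads[:i] + task_grads[i + 1:]
        random.shuffle(others)  # the paper visits the other tasks in random order
        for g_j in others:
            dot = torch.dot(g, g_j)
            if dot < 0:  # conflicting directions
                g = g - (dot / (g_j.norm() ** 2 + 1e-12)) * g_j
        projected.append(g)
    return torch.stack(projected).sum(dim=0)


# Example usage with two toy "task gradients":
# g1 = torch.tensor([1.0, 0.0])
# g2 = torch.tensor([-1.0, 1.0])
# print(pcgrad_combine([g1, g2]))
```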
What are some alternatives?
paraphraser - Sentence paraphrase generation at the sentence level
pytorch-a2c-ppo-acktr-gail - PyTorch implementation of Advantage Actor Critic (A2C), Proximal Policy Optimization (PPO), Scalable trust-region method for deep reinforcement learning using Kronecker-factored approximation (ACKTR) and Generative Adversarial Imitation Learning (GAIL).
enformer-pytorch - Implementation of Enformer, Deepmind's attention network for predicting gene expression, in Pytorch
minimalRL - Implementations of basic RL algorithms with minimal lines of codes! (pytorch based)
Perceiver - Implementation of Perceiver, General Perception with Iterative Attention in TensorFlow
Pytorch-PCGrad - Pytorch reimplementation for "Gradient Surgery for Multi-Task Learning"
x-transformers - A concise but complete full-attention transformer with a set of promising experimental features from various papers
trax - Trax — Deep Learning with Clear Code and Speed
RWKV-LM - RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be directly trained like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". So it's combining the best of RNN and transformer - great performance, linear time, constant space (no kv-cache), fast training, infinite ctx_len, and free sentence embedding.
Gorgonia - Gorgonia is a library that helps facilitate machine learning in Go.
TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
muzero-general - MuZero