| | CrabNet | ViTGAN |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 96 | 167 |
| Growth | - | - |
| Activity | 3.7 | 0.0 |
| Latest Commit | almost 2 years ago | about 3 years ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits are weighted more heavily than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects being tracked.
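The exact formula behind the Activity score is not given here; as a rough illustration only, the sketch below shows one way a recency-weighted commit score could be computed, assuming a simple exponential decay over commit age. The function name, half-life, and scaling are hypothetical and not the site's actual parameters.

```python
from datetime import datetime, timezone

def activity_score(commit_dates, half_life_days=30.0):
    """Hypothetical recency-weighted activity score.

    Each commit contributes a weight that decays exponentially with its
    age, so recent commits count more than older ones. This illustrates
    the idea described above, not the site's actual formula.
    """
    now = datetime.now(timezone.utc)
    score = 0.0
    for when in commit_dates:
        age_days = (now - when).total_seconds() / 86400.0
        # Weight halves every `half_life_days`; a commit made today counts ~1.0.
        score += 0.5 ** (age_days / half_life_days)
    return score

# Example: three commits, with the most recent one dominating the score.
commits = [
    datetime(2024, 1, 1, tzinfo=timezone.utc),
    datetime(2024, 6, 1, tzinfo=timezone.utc),
    datetime(2024, 6, 20, tzinfo=timezone.utc),
]
print(round(activity_score(commits), 2))
```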
CrabNet
- Artificial intelligence can revolutionise science
I don't know. As for "literature-based discovery," this project/paper sounded like a pretty big deal when it came out a few years ago: https://github.com/materialsintelligence/mat2vec . And I see this thing came out more recently: https://github.com/anthony-wang/CrabNet .
Of course not all fields lend themselves as well to this as does materials science.
ViTGAN
- ViTGAN: Training GANs with Vision Transformers by Kwonjoon Lee et al., explained in 5 minutes
[Full Explanation Post / Blog Post] [Arxiv] [Code]
What are some alternatives?
ProteinStructurePrediction - Protein structure prediction is the task of predicting the 3-dimensional structure (shape) of a protein from its amino acid sequence and any available supporting information. In this section, we install and inspect sidechainnet, a dataset with tools for predicting and inspecting protein structures, complete two simplified implementations of attention-based networks for predicting protein angles from amino acid sequences, and visualize our predictions along the way.
SAITS - The official PyTorch implementation of the paper "SAITS: Self-Attention-based Imputation for Time Series". A fast and state-of-the-art (SOTA) deep-learning neural network model for efficient time-series imputation (impute multivariate incomplete time series containing NaN missing data/values with machine learning). https://arxiv.org/abs/2202.08516
GAT - Graph Attention Networks (https://arxiv.org/abs/1710.10903)
query-selector - Long-Term Series Forecasting with Query Selector – Efficient Model of Sparse Attention
mat2vec - Supplementary Materials for Tshitoyan et al. "Unsupervised word embeddings capture latent knowledge from materials science literature", Nature (2019).
Informer2020 - The GitHub repository for the paper "Informer" accepted by AAAI 2021.
Parallel-Tacotron2 - PyTorch Implementation of Google's Parallel Tacotron 2: A Non-Autoregressive Neural TTS Model with Differentiable Duration Modeling
Invariant-Attention - An implementation of Invariant Point Attention from AlphaFold 2
how-do-vits-work - (ICLR 2022 Spotlight) Official PyTorch implementation of "How Do Vision Transformers Work?"
Perceiver - Implementation of Perceiver, General Perception with Iterative Attention
ubisoft-laforge-daft-exprt - PyTorch Implementation of Daft-Exprt: Robust Prosody Transfer Across Speakers for Expressive Speech Synthesis