CrabNet vs Invariant-Attention

| | CrabNet | Invariant-Attention |
| --- | --- | --- |
| Mentions | 1 | 1 |
| Stars | 96 | 6 |
| Growth | - | - |
| Activity | 3.7 | 10.0 |
| Last commit | almost 2 years ago | about 2 years ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects tracked.
Mentions of CrabNet
Artificial intelligence can revolutionise science
I don't know. As for "literature-based discovery," this project/paper sounded like a pretty big deal when it came out a few years ago: https://github.com/materialsintelligence/mat2vec . And I see this thing came out more recently: https://github.com/anthony-wang/CrabNet .
Of course not all fields lend themselves as well to this as does materials science.
Mentions of Invariant-Attention
What are some alternatives?
ProteinStructurePrediction - Protein structure prediction is the task of predicting the 3-dimensional structure (shape) of a protein given its amino acid sequence and any available supporting information. In this section, we will install and inspect sidechainnet, a dataset with tools for predicting and inspecting protein structures, complete two simplified implementations of attention-based networks for predicting protein angles from amino acid sequences, and visualize our predictions along the way.
af2complex - Predicting direct protein-protein interactions with AlphaFold deep learning neural network models.
GAT - Graph Attention Networks (https://arxiv.org/abs/1710.10903)
best-of-ml-python - 🏆 A ranked list of awesome machine learning Python libraries. Updated weekly.
mat2vec - Supplementary Materials for Tshitoyan et al. "Unsupervised word embeddings capture latent knowledge from materials science literature", Nature (2019).
nanodl - A Jax-based library for designing and training transformer models from scratch.
query-selector - Long-term series forecasting with Query Selector, an efficient model of sparse attention.
flashattention2-custom-mask - Triton implementation of FlashAttention2 that adds custom masks.
ViTGAN - A PyTorch implementation of ViTGAN based on the paper "ViTGAN: Training GANs with Vision Transformers".
transformers - 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Perceiver - Implementation of Perceiver: General Perception with Iterative Attention.
D2L_Attention_Mechanisms_in_TF - This repository contains TensorFlow 2 code for the Attention Mechanisms chapter of the Dive into Deep Learning (D2L) book.