TokenCut vs nlp-tutorial

| | TokenCut | nlp-tutorial |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 285 | 13,735 |
| Growth | - | - |
| Activity | 1.2 | 0.0 |
| Last commit | about 1 year ago | 3 months ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub.
Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
TokenCut

Self-supervised Transformers for Unsupervised Object Discovery using Normalized Cut, with a Hugging Face Spaces Gradio demo.
github: https://github.com/YangtaoWANG95/TokenCut
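To make the one-line description more concrete, here is a minimal sketch of the core idea behind TokenCut: treat the patch tokens of a self-supervised vision transformer as graph nodes, build an affinity graph from their pairwise similarities, and bipartition it with the second eigenvector of the normalized graph Laplacian (the classical Normalized Cut relaxation). This is an illustrative NumPy sketch, not the repository's actual implementation; the function name, `tau` threshold value, and the mean-based foreground rule are assumptions for demonstration.

```python
import numpy as np

def tokencut_bipartition(features, tau=0.2, eps=1e-5):
    """Illustrative sketch of a Normalized Cut bipartition over patch tokens.

    features: (N, D) array of patch-token features (e.g. from a
    self-supervised ViT). tau is an assumed similarity threshold;
    eps keeps the affinity graph connected.
    """
    # Cosine similarity between all pairs of patch tokens
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T
    # Binarized affinity graph: strong edges where similarity exceeds tau
    W = np.where(sim >= tau, 1.0, eps)
    # Symmetric normalized Laplacian: L = D^{-1/2} (D - W) D^{-1/2}
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = D_inv_sqrt @ (np.diag(d) - W) @ D_inv_sqrt
    # Second-smallest eigenvector (Fiedler vector) relaxes the Normalized Cut
    _, vecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    fiedler = vecs[:, 1]
    # Split tokens by which side of the cut they fall on (assumed rule:
    # threshold at the mean of the Fiedler vector)
    return fiedler > fiedler.mean()
```

On features with two well-separated clusters, the returned boolean mask assigns each cluster to one side of the cut; in the paper's setting one side corresponds to the foreground object.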
nlp-tutorial
What are some alternatives?
pytorch-GAT - My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
nlp_course - YSDA course in Natural Language Processing
poolformer - PoolFormer: MetaFormer Is Actually What You Need for Vision (CVPR 2022 Oral)
pytorch-sentiment-analysis - Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
nn - 🧑‍🏫 60 implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans (cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
awesome-nlp - :book: A curated list of resources dedicated to Natural Language Processing (NLP)
D2L_Attention_Mechanisms_in_TF - This repository contains Tensorflow 2 code for Attention Mechanisms chapter of Dive into Deep Learning (D2L) book.
nlp-recipes - Natural Language Processing Best Practices & Examples
pytorch-seq2seq - Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
transformers - 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
tf-transformers - State-of-the-art faster Transformer with TensorFlow 2.0 (NLP, Computer Vision, Audio).
nlp-in-python-tutorial - comparing stand up comedians using natural language processing