tf-transformers
minGPT-TF
| | tf-transformers | minGPT-TF |
|---|---|---|
| Mentions | 5 | 1 |
| Stars | 84 | 53 |
| Growth | - | - |
| Activity | 1.7 | 0.0 |
| Latest commit | about 1 year ago | over 2 years ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
tf-transformers
Tensorflow-Transformers 2.0 ( for NLP, CV, Audio )
Code: GitHub - legacyai/tf-transformers: State-of-the-art faster Natural Language Processing in TensorFlow 2.0. Website: https://legacyai.github.io/tf-transformers
- [P] Production Ready NLP Deep learning tutorials on tensorflow 2.0. tf-transformers
- Do we really need to distill language models? Joint loss is all we need - Albert-Joint.
tf-transformers: state-of-the-art, faster NLP in TensorFlow 2.0, up to 80% faster than existing TF-based libraries.
Faster auto-regressive decoding using TensorFlow 2: faster than PyTorch in most experiments (V100 GPU), and 80% faster than existing TF-based libraries (relative difference). Refer to the benchmark code.
- [D] Why is tensorflow so hated on and pytorch is the cool kids framework?
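The speed claims above concern auto-regressive decoding, where a model generates one token at a time and feeds each prediction back as input. A framework-free sketch of the greedy variant of this loop (illustrative only; `greedy_decode` and `toy_model` are hypothetical stand-ins, not the tf-transformers API):

```python
def greedy_decode(next_token_logits, prompt, max_new_tokens, eos_id=None):
    """Greedy auto-regressive decoding: repeatedly append the arg-max token.

    `next_token_logits` is any callable mapping the token sequence so far to
    a list of scores for the next token (a stand-in for a real model).
    """
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        logits = next_token_logits(tokens)
        next_id = max(range(len(logits)), key=logits.__getitem__)
        if next_id == eos_id:
            break  # stop once the model emits the end-of-sequence token
        tokens.append(next_id)
    return tokens

# Toy "model": always favours token (last_token + 1) mod vocab_size.
def toy_model(tokens, vocab_size=5):
    scores = [0.0] * vocab_size
    scores[(tokens[-1] + 1) % vocab_size] = 1.0
    return scores

print(greedy_decode(toy_model, prompt=[0], max_new_tokens=3))  # → [0, 1, 2, 3]
```

Because each step depends on the previous one, the loop cannot be parallelised across positions; this is exactly where per-step overhead dominates and where tf-transformers claims its advantage over other TF-based libraries.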
minGPT-TF
minGPT: a small and educational implementation of GPT by Andrej Karpathy
TF version: https://github.com/kamalkraj/minGPT-TF
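The central building block that minGPT (and its TF port) stacks into a model is causal, i.e. masked, self-attention: each position may attend only to itself and earlier positions. A minimal single-head NumPy sketch of that operation (illustrative only, not code from either repository):

```python
import numpy as np

def causal_self_attention(x, w_q, w_k, w_v):
    """Single-head causal self-attention over a (T, d) sequence.

    x: (T, d) token embeddings; w_q, w_k, w_v: (d, d) projection matrices.
    Positions above the diagonal are masked so no token sees the future.
    """
    T, d = x.shape
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(d)                  # (T, T) attention logits
    future = np.triu(np.ones((T, T), bool), k=1)   # True strictly above diagonal
    scores[future] = -np.inf                       # block attention to the future
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True) # row-wise softmax
    return weights @ v

# With identity projections, position 0 can only attend to itself,
# so its output equals its own (value) vector:
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = causal_self_attention(x, np.eye(8), np.eye(8), np.eye(8))
```

A full GPT block wraps this in multi-head projections, residual connections, layer norm, and an MLP; minGPT's appeal is that the whole stack stays small enough to read in one sitting.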
What are some alternatives?
medspacy - Library for clinical NLP with spaCy.
gpt-3-simple-tutorial - Generate SQL from Natural Language Sentences using OpenAI's GPT-3 Model
flax - Flax is a neural network library for JAX that is designed for flexibility.
tensorflow-nanoGPT - Example how to train GPT-2 (XLA + AMP), export to SavedModel and serve with Tensorflow Serving
MIRNet-TFJS - TensorFlow JS models for MIRNet for low-light💡 image enhancement
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
bertviz - BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
gpt-neo - An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
BERT-for-Mobile - Compares the DistilBERT and MobileBERT architectures for mobile deployments.
gpt-mini - Yet another minimalistic Tensorflow (re-)re-implementation of Karpathy's Pytorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer).