bert-sklearn vs tf-transformers

| | bert-sklearn | tf-transformers |
|---|---|---|
| Mentions | 1 | 5 |
| Stars | 293 | 84 |
| Growth | - | - |
| Activity | 0.0 | 1.7 |
| Latest commit | over 1 year ago | about 1 year ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
bert-sklearn
- Quick BERT Pre-Trained Model for Sentiment Analysis with Scikit Wrapper
A scikit-learn wrapper for BERT, provided by Charles Nainan. See the GitHub repository of the scikit-learn BERT wrapper.
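The point of a scikit-learn wrapper is that BERT fine-tuning is exposed through the familiar `fit`/`predict` estimator interface. As a minimal sketch of that interface (the `MajorityClassifier` below is a hypothetical stand-in for the real BERT model, used only to illustrate the contract a wrapper like bert-sklearn follows):

```python
# Minimal sketch of the scikit-learn estimator interface that a BERT
# wrapper exposes. MajorityClassifier is a hypothetical stand-in model:
# it predicts the most frequent label seen during fit().
from collections import Counter

class MajorityClassifier:
    def fit(self, X, y):
        # A real wrapper would fine-tune BERT on (X, y) here.
        self.majority_ = Counter(y).most_common(1)[0][0]
        return self  # returning self is the scikit-learn convention

    def predict(self, X):
        return [self.majority_ for _ in X]

    def score(self, X, y):
        preds = self.predict(X)
        return sum(p == t for p, t in zip(preds, y)) / len(y)

texts = ["great film", "terrible plot", "loved it", "awful"]
labels = ["pos", "neg", "pos", "pos"]
clf = MajorityClassifier().fit(texts, labels)
print(clf.predict(["fine movie"]))  # → ['pos']
print(clf.score(texts, labels))     # → 0.75
```

Because the wrapper looks like any other scikit-learn estimator, it can be dropped into pipelines, cross-validation, and grid search without BERT-specific glue code.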
tf-transformers
- Tensorflow-Transformers 2.0 (for NLP, CV, Audio)
Code: GitHub - legacyai/tf-transformers: State-of-the-art faster Natural Language Processing in TensorFlow 2.0. Website: https://legacyai.github.io/tf-transformers
- [P] Production Ready NLP Deep learning tutorials on tensorflow 2.0. tf-transformers
- Do we really need to Distill Language Models? Joint loss is all we need - Albert-Joint.
tf-transformers: State-of-the-art faster NLP in TensorFlow 2.0, about 80% faster than existing TF-based libraries.
Faster autoregressive decoding using TensorFlow 2: faster than PyTorch in most experiments (V100 GPU), and 80% faster than existing TF-based libraries (relative difference). Refer to the benchmark code.
- [D] Why is tensorflow so hated on and pytorch is the cool kids framework?
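Autoregressive decoding, which the benchmarks above measure, generates one token at a time by feeding each prediction back in as input. A toy sketch of the greedy decoding loop (the bigram table and token names are invented for illustration; tf-transformers' speedup comes from optimizing this loop, e.g. by caching past attention states between steps rather than recomputing them):

```python
# Toy greedy autoregressive decoding loop. The bigram "model" is a
# hypothetical stand-in for a real Transformer; libraries like
# tf-transformers accelerate this loop by caching key/value states.
BIGRAMS = {"<s>": "the", "the": "cat", "cat": "sat", "sat": "<eos>"}

def greedy_decode(start="<s>", max_len=10):
    tokens = [start]
    for _ in range(max_len):
        nxt = BIGRAMS.get(tokens[-1], "<eos>")  # most likely next token
        if nxt == "<eos>":
            break
        tokens.append(nxt)  # prediction is fed back as the next input
    return tokens[1:]       # drop the start symbol

print(greedy_decode())  # → ['the', 'cat', 'sat']
```

Each step depends on the previous one, so the loop cannot be parallelized across positions; that is why per-step efficiency (caching, graph compilation) dominates decoding benchmarks.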
What are some alternatives?
bert - TensorFlow code and pre-trained models for BERT
medspacy - Library for clinical NLP with spaCy.
OpenAI-CLIP - Simple implementation of OpenAI CLIP model in PyTorch.
flax - Flax is a neural network library for JAX that is designed for flexibility.
kruk - Ukrainian instruction-tuned language models and datasets
MIRNet-TFJS - TensorFlow JS models for MIRNet for low-light image enhancement
NLU-engine-prototype-benchmarks - Demo and benchmarks for building an NLU engine similar to those in voice assistants. Several intent classifiers are implemented and benchmarked. Conditional Random Fields (CRFs) are used for entity extraction.
bertviz - BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
fake-news - Building a fake news detector from initial ideation to model deployment
BERT-for-Mobile - Compares the DistilBERT and MobileBERT architectures for mobile deployments.
ABSA_Project_4 - This project takes advantage of the parsing and part-of-speech tagging capabilities of spaCy's pipeline to extract aspect/opinion/sentiment triplets, then clusters aspects using unsupervised learning to process sentiment for large Amazon review datasets.
gpt-3-simple-tutorial - Generate SQL from Natural Language Sentences using OpenAI's GPT-3 Model