BERT-Transformer-Pytorch vs extreme-bert
| | BERT-Transformer-Pytorch | extreme-bert |
|---|---|---|
| Mentions | 2 | 2 |
| Stars | 39 | 283 |
| Growth | - | 0.0% |
| Activity | 4.3 | 0.0 |
| Last Commit | 4 months ago | about 1 year ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
BERT-Transformer-Pytorch
- [P] Small problems to test out transformers?
The task is described in the paper I linked (3.1, Task #1: Masked LM). Any implementation of BERT should use it, like this one.
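The masking scheme from that section (15% of tokens are selected; of those, 80% become `[MASK]`, 10% become a random token, 10% are left unchanged) can be sketched in a few lines. This is a minimal illustration, not code from either repository; the token ids (`MASK_ID`, `VOCAB_SIZE`) and the `-100` ignore label are assumptions borrowed from common conventions.

```python
import random

# Hypothetical special-token id and vocabulary size, chosen for illustration.
MASK_ID = 103
VOCAB_SIZE = 30522

def mask_tokens(tokens, mask_prob=0.15, rng=random):
    """Corrupt a token-id sequence for the Masked LM objective.

    Returns (corrupted, labels): labels hold the original id at masked
    positions and -100 elsewhere (a common "ignore" value for the loss).
    """
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)                      # predict the original token
            r = rng.random()
            if r < 0.8:                             # 80%: replace with [MASK]
                corrupted.append(MASK_ID)
            elif r < 0.9:                           # 10%: replace with a random token
                corrupted.append(rng.randrange(VOCAB_SIZE))
            else:                                   # 10%: keep the token unchanged
                corrupted.append(tok)
        else:
            corrupted.append(tok)
            labels.append(-100)                     # position ignored by the loss
    return corrupted, labels
```

The model is then trained to predict the label ids from the corrupted sequence; keeping 10% of selected tokens unchanged forces it to maintain a useful representation of every input position, not just `[MASK]` slots.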
- [D] [NLP] Did anything significant happen between RNN and transformer approaches?
There are many guides online; I wrote one here (it also refers to other tutorials). It comes with a Python file of about 300 lines of code re-implementing Transformers and BERT (it's hard to make it much shorter).
extreme-bert
- [P] Releasing customized language model pre-training acceleration toolkit: ExtremeBERT
Found relevant code at https://github.com/extreme-bert/extreme-bert, plus all code implementations here.
What are some alternatives?
minGPT - A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
torchscale - Foundation Architecture for (M)LLMs
primeqa - The prime repository for state-of-the-art Multilingual Question Answering research and development.
pixel - Research code for pixel-based encoders of language (PIXEL)
bertviz - BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
RATransformers - RATransformers 🐭- Make your transformer (like BERT, RoBERTa, GPT-2 and T5) Relation Aware!
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
transformer-mlm - Implementation of Transformer Encoders / Masked Language Modeling Objective
code-representations-ml-brain - [NeurIPS 2022] "Convergent Representations of Computer Programs in Human and Artificial Neural Networks" by Shashank Srikant*, Benjamin Lipkin*, Anna A. Ivanova, Evelina Fedorenko, Una-May O'Reilly.
AliceMind - ALIbaba's Collection of Encoder-decoders from MinD (Machine IntelligeNce of Damo) Lab
haystack - LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
happy-transformer - Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.