| | TrAVis | bart-base-jax |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 53 | 27 |
| Growth | - | - |
| Activity | 10.0 | 1.4 |
| Latest Commit | over 1 year ago | about 1 year ago |
| Language | Python | Python |
| License | - | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
How we created an in-browser BERT attention visualiser without a server - TrAVis: Transformer Attention Visualiser
First, we implemented the BART model from scratch in JAX. We chose JAX because it is a deep learning framework that lets us write clear source code, and because JAX code can be converted to plain NumPy with little effort, which makes it possible to execute the model in the browser. We chose BART because it is a complete encoder-decoder model, so it can be adapted to other models, such as BERT, by simply taking a subset of the source code.
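A minimal sketch of the "JAX code that also runs as NumPy" idea the post describes. The function name and the namespace-passing style are illustrative assumptions, not the actual TrAVis source: the point is that code written against the NumPy API works unchanged whether it receives `numpy` or `jax.numpy`.

```python
import numpy as np

def attention_weights(q, k, xp=np):
    """Scaled dot-product attention weights (illustrative sketch).

    `xp` is the array namespace: pass `numpy` for in-browser execution
    or `jax.numpy` for accelerated training. Both expose the same API,
    so the function body stays identical.
    """
    # Scale the raw dot-product scores by sqrt(d_k).
    scores = q @ k.T / xp.sqrt(xp.asarray(float(q.shape[-1])))
    # Numerically stable softmax over the last axis.
    scores = scores - xp.max(scores, axis=-1, keepdims=True)
    exp_scores = xp.exp(scores)
    return exp_scores / xp.sum(exp_scores, axis=-1, keepdims=True)
```

Because `jax.numpy` mirrors the NumPy API, swapping `xp=jnp` is the only change needed to run the same code under JAX, which is what makes a server-free, in-browser visualiser feasible.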
What are some alternatives?
word-piece-tokenizer - A Lightweight Word Piece Tokenizer
NLP-Model-for-Corpus-Similarity - An NLP algorithm for determining the similarity or relation between two documents/Wikipedia articles. Inspired by the cosine similarity algorithm and built from WordNet.
siamese-nn-semantic-text-similarity - A repository containing comprehensive PyTorch implementations of neural networks for the semantic text similarity task, including architectures such as Siamese LSTM, Siamese BiLSTM with Attention, Siamese Transformer, and Siamese BERT.
jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
bertviz - BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
d3 - Bring data to life with SVG, Canvas and HTML.
bert - TensorFlow code and pre-trained models for BERT