| | inseq | transformers-interpret |
|---|---|---|
| Mentions | 4 | 3 |
| Stars | 312 | 1,213 |
| Growth | 9.6% | - |
| Activity | 8.5 | 2.9 |
| Latest commit | 7 days ago | 9 months ago |
| Language | Python | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative measure of how actively a project is being developed; recent commits are weighted more heavily than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
inseq
- [R] [P] Inseq: An Interpretability Toolkit for Sequence Generation Models
  Found relevant code at https://github.com/inseq-team/inseq
- Interpreting Language Models with Inseq
- Show HN: Inseq – An Interpretability Toolkit for Generative Language Models
- Inseq: A Toolkit for Interpreting Sequence Generation Models
transformers-interpret
- [P] XAI Recipes for the HuggingFace 🤗 Image Classification Models
  Very cool, I like seeing this. I also noticed the transformers interpret package has released support for an image classification explainer: https://github.com/cdpierse/transformers-interpret
- Using LIME to explain the predictions from a BERT model, it looks like "the", "and", "or" are "very important" features, and thus I don't think the model is learning anything interesting. Any tips?
  You could look at the Transformers Interpret python library: https://github.com/cdpierse/transformers-interpret
- Show HN: Transformers Interpret – Explain and visualize Transformer models
What are some alternatives?
pytorch-grad-cam - Advanced AI Explainability for computer vision. Support for CNNs, Vision Transformers, Classification, Object detection, Segmentation, Image similarity and more.
neuro-symbolic-sudoku-solver - ⚙️ Solving sudoku using Deep Reinforcement learning in combination with powerful symbolic representations.
nebuly - The user analytics platform for LLMs
small-text - Active Learning for Text Classification in Python
happy-transformer - Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
gensim - Topic Modelling for Humans
spaCy - 💫 Industrial-strength Natural Language Processing (NLP) in Python
shap - A game theoretic approach to explain the output of any machine learning model. [Moved to: https://github.com/shap/shap]
Vision-DiffMask - Official PyTorch implementation of Vision DiffMask, a post-hoc interpretation method for vision models.
PyTorch-NLP - Basic Utilities for PyTorch Natural Language Processing (NLP)
smaller-transformers - Load What You Need: Smaller Multilingual Transformers for Pytorch and TensorFlow 2.0.
nlp - Repository for all things Natural Language Processing