| | transformers-interpret | Vision-DiffMask |
|---|---|---|
| Mentions | 3 | 2 |
| Stars | 1,212 | 27 |
| Growth | - | - |
| Activity | 2.9 | 4.3 |
| Last commit | 8 months ago | about 2 months ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | Apache License 2.0 | MIT License |
- Stars - the number of stars that a project has on GitHub.
- Growth - month-over-month growth in stars.
- Activity - a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
transformers-interpret

- [P] XAI Recipes for the HuggingFace 🤗 Image Classification Models
  - "Very cool, I like seeing this. I also noticed the transformers-interpret package has released support for an image classification explainer: https://github.com/cdpierse/transformers-interpret"
- Using LIME to explain the predictions from a BERT model, it looks like "the", "and", "or" are "very important" features, and thus I don't think the model is learning anything interesting. Any tips?
  - "You could look at the Transformers Interpret Python library: https://github.com/cdpierse/transformers-interpret"
- Show HN: Transformers Interpret – Explain and visualize Transformer models
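The recommendation above can be sketched with the library's documented `SequenceClassificationExplainer` entry point, which returns per-word attribution scores for a classification prediction. This is a minimal sketch, not a complete recipe; the model name below is illustrative, and imports are deferred so the heavy dependencies are only needed when the function is called.

```python
def explain(text, model_name="distilbert-base-uncased-finetuned-sst-2-english"):
    """Return a list of (word, attribution) pairs for a prediction.

    A sketch of the transformers-interpret usage pattern; swap in
    your own fine-tuned checkpoint for model_name.
    """
    # Deferred imports: transformers and transformers-interpret are
    # only required when the explainer actually runs.
    from transformers import AutoModelForSequenceClassification, AutoTokenizer
    from transformers_interpret import SequenceClassificationExplainer

    model = AutoModelForSequenceClassification.from_pretrained(model_name)
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    explainer = SequenceClassificationExplainer(model, tokenizer)
    return explainer(text)  # list of (word, attribution) pairs

# explain("I love this movie!") downloads the model on first call.
```

Because the attributions come from gradients through the fine-tuned model rather than a local surrogate like LIME, stopwords that dominate a LIME explanation will not necessarily dominate here.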
Vision-DiffMask

- [R] VISION DIFFMASK: Faithful Interpretation of Vision Transformers with Differentiable Patch Masking
  - "Found relevant code at https://github.com/AngelosNal/Vision-DiffMask + all code implementations here"
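The "differentiable patch masking" idea in the paper title can be illustrated with a small NumPy sketch (this is a conceptual toy, not the Vision-DiffMask implementation): each image patch gets a continuous gate in [0, 1] produced by a sigmoid over a learnable logit, so gradient descent can search for the smallest set of patches that preserves a Vision Transformer's prediction. All names below are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mask_patches(image, mask_logits, patch_size=16):
    """Apply one soft gate per patch to an (H, W, C) image.

    A hard 0/1 mask would not be differentiable; the sigmoid
    relaxation keeps the gates continuous so the logits can be
    optimized with gradients.
    """
    h, w, _ = image.shape
    gates = sigmoid(mask_logits).reshape(h // patch_size, w // patch_size)
    # Expand each gate to cover its full patch, then scale the pixels.
    per_pixel = np.kron(gates, np.ones((patch_size, patch_size)))
    return image * per_pixel[..., None]

# Toy example: a 32x32 image has four 16x16 patches; large positive
# logits keep a patch (gate near 1), large negative ones drop it.
image = np.random.rand(32, 32, 3)
logits = np.array([10.0, -10.0, -10.0, 10.0])  # keep patches 0 and 3
masked = mask_patches(image, logits, patch_size=16)
```

In the actual method the logits are trained against the ViT's output so that the surviving patches faithfully explain the prediction; the sketch only shows the masking mechanism itself.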
What are some alternatives?
neuro-symbolic-sudoku-solver - ⚙️ Solving sudoku using Deep Reinforcement learning in combination with powerful symbolic representations.
small-text - Active Learning for Text Classification in Python
happy-transformer - Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
gensim - Topic Modelling for Humans
spaCy - 💫 Industrial-strength Natural Language Processing (NLP) in Python
shap - A game theoretic approach to explain the output of any machine learning model. [Moved to: https://github.com/shap/shap]
PyTorch-NLP - Basic Utilities for PyTorch Natural Language Processing (NLP)
smaller-transformers - Load What You Need: Smaller Multilingual Transformers for Pytorch and TensorFlow 2.0.
nlp - Repository for all things Natural Language Processing
facet - Human-explainable AI.
PLOD-AbbreviationDetection - This repository contains the PLOD Dataset for Abbreviation Detection released with our LREC 2022 publication