PLOD-AbbreviationDetection vs transformers-interpret

| | PLOD-AbbreviationDetection | transformers-interpret |
|---|---|---|
| Mentions | 1 | 3 |
| Stars | 9 | 1,213 |
| Growth | - | - |
| Activity | 0.0 | 2.9 |
| Last Commit | over 1 year ago | 8 months ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | Creative Commons Attribution Share Alike 4.0 | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
PLOD-AbbreviationDetection
- Clustering to find abbreviations
Finally, the main problem with unsupervised learning is that you won't be able to reliably measure system performance or improvement. In my view, any time you can spend annotating and collecting data for a (semi-)supervised solution will be well-spent. Existing datasets can also get you started with model development, such as https://github.com/surrey-nlp/PLOD-AbbreviationDetection. Once you have a good model on a conventional dataset, you should be able to start generalizing it to your specific task/dataset.
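The PLOD data is framed as a token-classification (sequence labelling) task, so a conventional starting point is fine-tuning a HuggingFace token-classification model on it. The sketch below is a minimal, non-authoritative outline of that setup; the dataset ID "surrey-nlp/PLOD-filtered" and the "tokens"/"ner_tags" column names are assumptions, so check the repository's README for the exact Hub ID and label scheme.

```python
# Minimal sketch: abbreviation detection as token classification on the PLOD data.
# ASSUMPTIONS: dataset ID "surrey-nlp/PLOD-filtered" and columns "tokens"/"ner_tags"
# are illustrative; verify both against the PLOD-AbbreviationDetection README.
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForTokenClassification

dataset = load_dataset("surrey-nlp/PLOD-filtered")  # assumed Hub ID
label_list = dataset["train"].features["ner_tags"].feature.names  # assumed column

tokenizer = AutoTokenizer.from_pretrained("roberta-base", add_prefix_space=True)
model = AutoModelForTokenClassification.from_pretrained(
    "roberta-base", num_labels=len(label_list)
)

def tokenize_and_align(example):
    # Align word-level labels to subword tokens; ignore special tokens with -100.
    tokenized = tokenizer(example["tokens"], truncation=True, is_split_into_words=True)
    tokenized["labels"] = [
        -100 if wid is None else example["ner_tags"][wid]
        for wid in tokenized.word_ids()
    ]
    return tokenized

encoded = dataset.map(tokenize_and_align)
```

From here the encoded splits can be passed to a standard `Trainer` loop, which also gives you the held-out metrics the comment above argues for.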
transformers-interpret
- [P] XAI Recipes for the HuggingFace 🤗 Image Classification Models
Very cool, I like seeing this. I also noticed the transformers interpret package has released support for an image classification explainer: https://github.com/cdpierse/transformers-interpret
- Using LIME to explain the predictions from a BERT model, it looks like "the", "and", "or" are "very important" features, and thus I don't think the model is learning anything interesting. Any tips?
You could look at the Transformers Interpret Python library: https://github.com/cdpierse/transformers-interpret
- Show HN: Transformers Interpret – Explain and visualize Transformer models
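For the text-classification case raised in the mentions above, transformers-interpret exposes a `SequenceClassificationExplainer` that attributes a prediction back to input tokens. The sketch below follows the usage shown in the project README; the SST-2 DistilBERT checkpoint is only an example model, not a recommendation.

```python
# Minimal sketch of transformers-interpret's SequenceClassificationExplainer.
# The model checkpoint is an arbitrary example; swap in your own fine-tuned model.
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from transformers_interpret import SequenceClassificationExplainer

model_name = "distilbert-base-uncased-finetuned-sst-2-english"
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Build the explainer and attribute the predicted class to each input token.
cls_explainer = SequenceClassificationExplainer(model, tokenizer)
word_attributions = cls_explainer("The film was surprisingly good.")

print(cls_explainer.predicted_class_name)
print(word_attributions)  # list of (token, attribution score) pairs
```

Because the scores are per-token attributions rather than LIME-style perturbation weights, they can help distinguish whether stopwords like "the" and "and" genuinely drive the prediction or are artifacts of the explanation method.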
What are some alternatives?
converse - Conversational text Analysis using various NLP techniques
neuro-symbolic-sudoku-solver - ⚙️ Solving sudoku using Deep Reinforcement learning in combination with powerful symbolic representations.
hate-speech-and-offensive-language - Repository for the paper "Automated Hate Speech Detection and the Problem of Offensive Language", ICWSM 2017
small-text - Active Learning for Text Classification in Python
ThoughtSource - A central, open resource for data and tools related to chain-of-thought reasoning in large language models. Developed @ Samwald research group: https://samwald.info/
happy-transformer - Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
nlp - Repository for all things Natural Language Processing
gensim - Topic Modelling for Humans
adaptnlp - An easy to use Natural Language Processing library and framework for predicting, training, fine-tuning, and serving up state-of-the-art NLP models.
spaCy - 💫 Industrial-strength Natural Language Processing (NLP) in Python
shap - A game theoretic approach to explain the output of any machine learning model. [Moved to: https://github.com/shap/shap]
Vision-DiffMask - Official PyTorch implementation of Vision DiffMask, a post-hoc interpretation method for vision models.