| | Transformer-MM-Explainability | pytea |
|---|---|---|
| Mentions | 3 | 3 |
| Stars | 704 | 310 |
| Growth | - | 0.3% |
| Activity | 0.0 | 1.8 |
| Last commit | 8 months ago | about 2 years ago |
| Language | Jupyter Notebook | TypeScript |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
What are some alternatives?
- pytorch-grad-cam - Advanced AI explainability for computer vision. Supports CNNs, Vision Transformers, classification, object detection, segmentation, image similarity, and more.
- examples - A set of examples around PyTorch in vision, text, reinforcement learning, etc.
- TorchDrift - Drift detection for your PyTorch models
- cleverhans - An adversarial example library for constructing attacks, building defenses, and benchmarking both
- explainerdashboard - Quickly build explainable AI dashboards that show the inner workings of so-called "blackbox" machine learning models.
- uncertainty-toolbox - Uncertainty Toolbox: a Python toolbox for predictive uncertainty quantification, calibration, metrics, and visualization
- shap - A game-theoretic approach to explain the output of any machine learning model.
- WeightWatcher - The WeightWatcher tool for predicting the accuracy of deep neural networks
- clip-italian - CLIP (Contrastive Language-Image Pre-training) for Italian
- vivit - [TMLR 2022] Curvature access through the generalized Gauss-Newton's low-rank structure: eigenvalues, eigenvectors, directional derivatives & Newton steps
- delve - PyTorch model training and layer saturation monitor