explainerdashboard VS captum

Compare explainerdashboard vs captum and see what their differences are.

                  explainerdashboard    captum
Mentions          2                     11
Stars             2,172                 4,491
Growth            -                     2.2%
Activity          8.0                   8.4
Latest commit     10 days ago           7 days ago
Language          Python                Python
License           MIT License           BSD 3-clause "New" or "Revised" License
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

explainerdashboard

Posts that mention or review explainerdashboard. We have used some of these posts to build our list of alternatives and similar projects. The most recent mention was on 2022-10-28.
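
explainerdashboard wraps a fitted scikit-learn-compatible model in an interactive Dash web app with SHAP values, feature importances, and what-if analysis. A minimal sketch, assuming a scikit-learn classifier on a toy dataset (the model, data, and settings below are illustrative, not taken from this comparison):

    # Launch an interactive explainability dashboard for a fitted classifier.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from explainerdashboard import ClassifierExplainer, ExplainerDashboard

    # Toy data and model; any fitted scikit-learn-compatible estimator works.
    data = load_breast_cancer(as_frame=True)
    X_train, X_test, y_train, y_test = train_test_split(
        data.data, data.target, test_size=0.2, random_state=0
    )
    model = RandomForestClassifier(n_estimators=50, random_state=0)
    model.fit(X_train, y_train)

    # Wrap the model and held-out data in an explainer, then serve the dashboard.
    explainer = ClassifierExplainer(model, X_test, y_test)
    ExplainerDashboard(explainer).run()  # serves a local Dash web app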

captum

Posts that mention or review captum. We have used some of these posts to build our list of alternatives and similar projects. The most recent mention was on 2023-04-17.
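
captum provides attribution algorithms (Integrated Gradients, DeepLift, saliency, and others) for PyTorch models. A minimal sketch of Integrated Gradients on a small, hypothetical network (the model and inputs below are placeholders, not from this comparison):

    # Compute Integrated Gradients attributions for a small PyTorch model.
    import torch
    import torch.nn as nn
    from captum.attr import IntegratedGradients

    # Hypothetical two-layer classifier over 10 input features.
    model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 2))
    model.eval()

    inputs = torch.randn(4, 10, requires_grad=True)  # batch of 4 examples
    baselines = torch.zeros_like(inputs)             # reference point for IG

    ig = IntegratedGradients(model)
    attributions, delta = ig.attribute(
        inputs, baselines=baselines, target=1, return_convergence_delta=True
    )
    print(attributions.shape)  # per-feature attributions, shape (4, 10)
    print(delta)               # convergence error of the path-integral approximation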

What are some alternatives?

When comparing explainerdashboard and captum, you can also consider the following projects:

shap - A game theoretic approach to explain the output of any machine learning model.

DALEX - moDel Agnostic Language for Exploration and eXplanation

lucid - A collection of infrastructure and tools for research in neural network interpretability.

flax - Flax is a neural network library for JAX that is designed for flexibility.

PyTorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration

deepchecks - Deepchecks: Tests for Continuous Validation of ML Models & Data. Deepchecks is a holistic open-source solution for all of your AI & ML validation needs, enabling you to thoroughly test your data and models from research to production.

Transformer-MM-Explainability - [ICCV 2021 - Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers", a novel method to visualize any Transformer-based network. Includes examples for DETR and VQA.

WeightWatcher - The WeightWatcher tool for predicting the accuracy of Deep Neural Networks

alibi - Algorithms for explaining machine learning models

interpret - Fit interpretable models. Explain blackbox machine learning.

jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more

WarpedGANSpace - [ICCV 2021] Authors' official PyTorch implementation of "WarpedGANSpace: Finding non-linear RBF paths in GAN latent space".