Top 17 Python Interpretability Projects
-
pytorch-grad-cam
Advanced AI Explainability for computer vision. Support for CNNs, Vision Transformers, Classification, Object detection, Segmentation, Image similarity and more.
-
decision-forests
A collection of state-of-the-art algorithms for the training, serving and interpretation of Decision Forest models in Keras.
-
WarpedGANSpace
[ICCV 2021] Authors official PyTorch implementation of the "WarpedGANSpace: Finding non-linear RBF paths in GAN latent space".
-
TalkToModel
TalkToModel gives anyone the power of XAI through natural language conversations 💬!
-
autoprognosis
A system for automating the design of predictive modeling pipelines tailored for clinical prognosis.
For the two examples we will be looking at, we will use pytorch_grad_cam, an excellent open-source package that makes working with Grad-CAM easy. The repo also hosts several other tutorials worth checking out.
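The idea behind Grad-CAM is simple: weight each feature map of a convolutional layer by the average gradient of the class score with respect to it, sum, and keep only positive contributions. As a rough illustration of what pytorch_grad_cam computes under the hood, here is a minimal NumPy sketch (the function name and array shapes are ours for illustration, not the package's API):

```python
import numpy as np

def grad_cam(activations, gradients):
    """Minimal Grad-CAM sketch.

    activations: (K, H, W) feature maps from the target conv layer
    gradients:   (K, H, W) gradients of the class score w.r.t. those maps
    """
    # alpha_k: global-average-pool the gradients per channel
    weights = gradients.mean(axis=(1, 2))
    # weighted sum of feature maps over the channel axis -> (H, W)
    cam = np.tensordot(weights, activations, axes=1)
    # ReLU: keep only features with a positive influence on the class
    cam = np.maximum(cam, 0)
    # normalize to [0, 1] for visualization
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam
```

In practice pytorch_grad_cam handles the forward/backward hooks needed to capture the activations and gradients for you; the sketch above only shows the final combination step.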
Project mention: Alibi: Open-source Python lib for ML model inspection and interpretation | news.ycombinator.com | 2023-06-02
Project mention: Xplique Is a Neural Networks Explainability Toolbox | news.ycombinator.com | 2024-02-02
https://arxiv.org/abs/2404.03592 https://github.com/stanfordnlp/pyreft
Project mention: EPU-CNN: Introducing Generalized Additive Models in CNNs for Interpretable Image Classification | /r/computervision | 2023-12-05
The code is also made available on GitHub, along with all the datasets used in the original research. Your feedback and contributions are highly welcome!
Python Interpretability related posts
- penzai: JAX research toolkit for building, editing, and visualizing neural nets
- Exploring GradCam and More with FiftyOne
- Alibi: Open-source Python lib for ML model inspection and interpretation
- [D] [R] Research Problem about Weakly Supervised Learning for CT Image Semantic Segmentation
- [R] Introducing ferret, a new Python package to streamline interpretability on Transformers
- [D] Off-the-shelf image saliency scoring models?
- [R] [P] Inseq: An Interpretability Toolkit for Sequence Generation Models
Index
What are some of the best open-source Interpretability projects in Python? This list will help you find them:
# | Project | Stars
---|---|---
1 | pytorch-grad-cam | 9,410 |
2 | captum | 4,568 |
3 | alibi | 2,288 |
4 | DALEX | 1,318 |
5 | tf-explain | 1,007 |
6 | decision-forests | 650 |
7 | xplique | 569 |
8 | pyreft | 562 |
9 | FastTreeSHAP | 492 |
10 | pyvene | 455 |
11 | inseq | 295 |
12 | ferret | 203 |
13 | WarpedGANSpace | 107 |
14 | TalkToModel | 102 |
15 | autoprognosis | 87 |
16 | penzai | 1,334 |
17 | EPU-CNN | 5 |