|  | captum | Pytorch |
|---|---|---|
| Mentions | 11 | 337 |
| Stars | 4,568 | 77,783 |
| Growth | 2.5% | 2.4% |
| Activity | 8.6 | 10.0 |
| Latest commit | 2 days ago | 5 days ago |
| Language | Python | Python |
| License | BSD 3-Clause "New" or "Revised" License | BSD 3-Clause License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
captum
-
[D] [R] Research Problem about Weakly Supervised Learning for CT Image Semantic Segmentation
Most likely, NNs in general love shortcut learning (see Geirhos et al., 2020), and local explanations such as Grad-CAM are quite noisy and sometimes even inconsistent (see Seo et al., 2018). Now, in my experience, integrated gradients (see Sundararajan et al., 2017) does a better job than Grad-CAM (also, add a noise tunnel), but this is only based on my limited experience. I would totally recommend using the implementations from the Captum library for local explanations.
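For reference, a minimal Captum sketch of integrated gradients wrapped in a noise tunnel; the trained classifier `model`, input batch `inputs`, and `target_class` are assumptions, not anything from the thread:

```python
import torch
from captum.attr import IntegratedGradients, NoiseTunnel

# model: a trained torch.nn.Module classifier; inputs: a batch of images (assumed)
ig = IntegratedGradients(model)
nt = NoiseTunnel(ig)  # averages attributions over noisy copies of the input
attributions = nt.attribute(
    inputs,
    nt_type="smoothgrad",  # add Gaussian noise, then average the IG attributions
    nt_samples=10,
    target=target_class,   # index of the class to explain
)
```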
-
[D] Off-the-shelf image saliency scoring models?
Take a look at captum
-
Can you interrogate a machine learning model to find out why it gave certain predictions?
Sometimes. If explainable predictions are part of your business requirements, it's probably better not to rely entirely on black-box models and instead design a system that gives you the information you need as part of it. If you do end up using black-box models, there are still methods that attempt to attach explanations to your predictions after the fact. Here's an example of a toolkit for attributing explanations post hoc to black-box model predictions: https://github.com/pytorch/captum
-
[D] DL Practitioners, Do You Use Layer Visualization Tools such as GradCAM in Your Process?
-
What kind of explainability techniques exist for Reinforcement learning?
A straightforward way to interpret an RL agent's decisions is to use the Captum library.
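For instance, a minimal sketch attributing a chosen action's logit to observation features with Captum's Saliency; the policy network `policy_net`, `observation`, and `chosen_action` are hypothetical names:

```python
import torch
from captum.attr import Saliency

# policy_net: a torch.nn.Module mapping observations to action logits (assumed)
saliency = Saliency(policy_net)
obs = observation.unsqueeze(0)  # a single observation, batched
# Gradient of the chosen action's logit w.r.t. each observation feature
attr = saliency.attribute(obs, target=chosen_action)
```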
-
[D] How do you choose which Black-Box Explainability method to use?
My use-case is research-oriented: I work on explainable AI. Generally, the best package I've come across to compute attributions in PyTorch is Captum. If your object detector is in PyTorch, you can probably plug it in.
-
PyTorch vs. TensorFlow in 2022
Do any JAX experts know if there is an equivalent to https://captum.ai/ - a model interpretability library for pytorch?
In particular I want to be able to measure feature importance on both inputs and internal layers on a sample-by-sample basis. This is the only thing currently holding me back from using JAX right now.
Alternatively, a simple to read/understand/port implementation of DeepLIFT would work too.
thanks
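Not a full answer, but integrated gradients at least is easy to hand-roll in JAX. A minimal sketch, assuming a model function `f(params, x)` that returns a scalar score and a `baseline` input (this is not a DeepLIFT port):

```python
import jax
import jax.numpy as jnp

def integrated_gradients(f, params, x, baseline, steps=50):
    """Integrated gradients (Sundararajan et al., 2017) for f(params, x) -> scalar."""
    alphas = jnp.linspace(0.0, 1.0, steps)
    grad_fn = jax.grad(f, argnums=1)  # gradient w.r.t. the input x
    # Gradients along the straight-line path from baseline to x
    path_grads = jax.vmap(lambda a: grad_fn(params, baseline + a * (x - baseline)))(alphas)
    return (x - baseline) * path_grads.mean(axis=0)
```

For internal layers, the same trick works if you split the network into a tail and a head and differentiate the head's score with respect to the tail's activations.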
-
DeepLIFT or other explainable api implementations for JAX (like captum for pytorch)?
I'm interested in using JAX but am having a hard time finding anything similar to Captum from the PyTorch world.
- How to extract features from a convolutional network (CNN) on raw data with explainable AI (XAI) techniques?
-
Looking for help regarding explainable AI
Do you actually want to implement something? There are decent explainability libraries now, e.g., [AIX360](https://aix360.mybluemix.net/), [InterpretML](https://interpret.ml/), or [captum](https://captum.ai/). PyTorch + maybe PyTorch Lightning + Captum might be the quickest way to actually implement something like an explainable neural net yourself. Do the standard tutorials for each of them and watch a few YT videos (or follow a Coursera course or something like that) about how these things work in theory and practice, and you'll get up to speed relatively quickly. You will not be able to do useful work in ML without actually learning the ropes.
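To give a flavor of that route, a minimal PyTorch + Captum sketch; the tiny network and data shapes are made up for illustration:

```python
import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

# A tiny classifier; in practice this would be your trained model
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
model.eval()

x = torch.randn(4, 8)  # a batch of 4 examples with 8 features
ig = IntegratedGradients(model)
attributions = ig.attribute(x, target=1)  # per-feature attribution of the class-1 logit
print(attributions.shape)  # torch.Size([4, 8])
```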
Pytorch
-
My Favorite DevTools to Build AI/ML Applications!
TensorFlow, developed by Google, and PyTorch, developed by Facebook, are two of the most popular frameworks for building and training complex machine learning models. TensorFlow is known for its flexibility and robust scalability, making it suitable for both research prototypes and production deployments. PyTorch is praised for its ease of use, simplicity, and dynamic computational graph that allows for more intuitive coding of complex AI models. Both frameworks support a wide range of AI models, from simple linear regression to complex deep neural networks.
-
penzai: JAX research toolkit for building, editing, and visualizing neural nets
> does PyTorch have a similar concept
of course https://github.com/pytorch/pytorch/blob/main/torch/utils/_py...
-
Tinygrad: Hacked 4090 driver to enable P2P
fyi should work on most 40xx[1]
[1] https://github.com/pytorch/pytorch/issues/119638#issuecommen...
-
The Elements of Differentiable Programming
Sure, right here: https://github.com/pytorch/pytorch/blob/main/torch/autograd/...
Here's the documentation: https://pytorch.org/tutorials/intermediate/forward_ad_usage....
> When an input, which we call “primal”, is associated with a “direction” tensor, which we call “tangent”, the resultant new tensor object is called a “dual tensor” for its connection to dual numbers[0].
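For the curious, a minimal sketch of that forward-mode API (`torch.autograd.forward_ad`); the function being differentiated here is arbitrary:

```python
import torch
import torch.autograd.forward_ad as fwAD

primal = torch.tensor([1.0, 2.0, 3.0])
tangent = torch.ones(3)  # the "direction" tensor

with fwAD.dual_level():
    dual = fwAD.make_dual(primal, tangent)  # the "dual tensor": primal + tangent
    out = torch.sin(dual)
    value, jvp = fwAD.unpack_dual(out)      # jvp == cos(primal) * tangent
```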
-
Functions and operators for Dot and Matrix multiplication and Element-wise calculation in PyTorch
My post explains Dot, Matrix and Element-wise multiplication in PyTorch.
-
Dot vs Matrix vs Element-wise multiplication in PyTorch
In PyTorch with `@`, `dot()` or `matmul()`:
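The post's own snippet isn't reproduced here, but a minimal sketch of the three looks like:

```python
import torch

v1 = torch.tensor([1.0, 2.0])
v2 = torch.tensor([3.0, 4.0])
print(torch.dot(v1, v2))    # 1D . 1D -> scalar: tensor(11.)

A = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
B = torch.tensor([[5.0, 6.0], [7.0, 8.0]])
print(A @ B)                # matrix product, same as torch.matmul(A, B)
print(torch.matmul(A, v1))  # matrix-vector product: tensor([ 5., 11.])
```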
-
Building a GPT Model from the Ground Up!
    import torch  # we use PyTorch: https://pytorch.org
    data = torch.tensor(encode(text), dtype=torch.long)
    print(data.shape, data.dtype)
    print(data[:1000])  # the 1000 characters we looked at earlier will to the GPT look like this
-
Open Source Ascendant: The Transformation of Software Development in 2024
AI's Open Embrace
Artificial intelligence (AI) and machine learning (ML) work increasingly relies on open-source frameworks like TensorFlow [https://www.tensorflow.org/] and PyTorch [https://pytorch.org/]. This democratization of AI tools is driving innovation and lowering entry barriers across industries.
-
Best AI Tools for Students Learning Development and Engineering
Which label applies to a tool sometimes depends on what you do with it. For example, PyTorch or TensorFlow can be called a library, a toolkit, or a machine-learning framework.
-
Element-wise vs Matrix vs Dot multiplication
In PyTorch, element-wise multiplication uses `*` or `mul()`; either can multiply 0D or higher-dimensional tensors element-wise:
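Again not the original post's snippet, but a minimal sketch:

```python
import torch

a = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
b = torch.tensor([[5.0, 6.0], [7.0, 8.0]])
print(a * b)       # tensor([[ 5., 12.], [21., 32.]])
print(a.mul(2.0))  # a 0D scalar broadcasts: tensor([[2., 4.], [6., 8.]])
```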
What are some alternatives?
shap - A game theoretic approach to explain the output of any machine learning model.
Flux.jl - Relax! Flux is the ML library that doesn't make you tensor
DALEX - moDel Agnostic Language for Exploration and eXplanation
mediapipe - Cross-platform, customizable ML solutions for live and streaming media.
lucid - A collection of infrastructure and tools for research in neural network interpretability.
Apache Spark - A unified analytics engine for large-scale data processing
flax - Flax is a neural network library for JAX that is designed for flexibility.
WeightWatcher - The WeightWatcher tool for predicting the accuracy of Deep Neural Networks
tinygrad - You like pytorch? You like micrograd? You love tinygrad! ❤️ [Moved to: https://github.com/tinygrad/tinygrad]
alibi - Algorithms for explaining machine learning models
Pandas - Flexible and powerful data analysis / manipulation library for Python, providing labeled data structures similar to R data.frame objects, statistical functions, and much more