Sometimes. If explainable predictions are part of your business requirements, it's probably better not to rely entirely on black-box models, and instead design a system that surfaces the information you need as part of it. If you do end up using black-box models, there are still methods that attempt to attribute explanations to their predictions after the fact. Here's an example of a toolkit for post-hoc attribution of black-box model predictions: https://github.com/pytorch/captum