| | DALEX | responsible-ai-toolbox |
|---|---|---|
| Mentions | 2 | 2 |
| Stars | 1,323 | 1,218 |
| Growth | 0.6% | 2.7% |
| Activity | 5.5 | 9.4 |
| Last commit | 2 months ago | about 17 hours ago |
| Language | Python | TypeScript |
| License | GNU General Public License v3.0 only | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
DALEX
Twitter set to accept ‘best and final offer’ of Elon Musk
Which he will not do, because: a) He can't, it's a black box algorithm. It actually is open source already, but that doesn't mean much as it's useless without Twitter's data https://github.com/ModelOriented/DALEX b) He won't release data that shows the algorithm is racist and amplifies conservative and extremist content. He won't remove such functions because it will cost him billions.
[D] What are your favorite Random Forest implementations that support categoricals
There are a couple of ways to use Shapley values for explanations in R. One way is to use DALEX, which also contains a lot of other methods besides SHAP. Another one is iml. I am sure there are several other implementations of SHAP as well.
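The comment above recommends DALEX (or iml) for SHAP-style explanations in R. As a language-agnostic sketch of what Shapley values actually compute, here is an exact Shapley calculation in pure Python for a toy cooperative game — illustrative only, not DALEX's API; the game, payoff function, and names are assumptions:

```python
from itertools import combinations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values: each player's weighted average marginal
    contribution over all coalitions of the other players."""
    n = len(players)
    result = {}
    for p in players:
        others = [q for q in players if q != p]
        total = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                s = frozenset(coalition)
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (value(s | {p}) - value(s))
        result[p] = total
    return result

# Toy additive "model": the payoff of a coalition of features is the sum
# of fixed per-feature contributions (a hypothetical example).
contrib = {"a": 1.0, "b": 2.0, "c": 3.0}
def v(coalition):
    return sum(contrib[p] for p in coalition)

vals = shapley_values(list(contrib), v)
# For an additive game, each Shapley value equals the feature's own
# contribution, and they sum to the grand-coalition payoff.
```

Real SHAP implementations (including DALEX's `predict_parts`) approximate this sum by sampling, since the exact computation is exponential in the number of features.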
responsible-ai-toolbox
[N] Debugging Machine Learning
http://erroranalysis.ai/ is a new open-source tool for in-depth understanding and diagnosis of machine learning errors. The tool is available as a highly interactive Jupyter widget and brings in several visualization primitives all centered around debugging activities in ML (main repo: https://github.com/microsoft/responsible-ai-widgets).
[N] Visualize Your Model Errors! Microsoft Toolkit Identifies and Diagnoses ML Failures
The toolkit is on the project GitHub. Additional information is available on the Error Analysis website.
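The core idea behind the error-analysis tooling above is to split evaluation data into cohorts and compare per-cohort error rates to find where a model fails most. A minimal pure-Python sketch of that idea (illustrative only — not the raiwidgets API; the cohort names and data are assumptions):

```python
from collections import defaultdict

def cohort_error_rates(records):
    """records: iterable of (cohort_key, y_true, y_pred) tuples.
    Returns the fraction of mispredictions within each cohort."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for cohort, y_true, y_pred in records:
        totals[cohort] += 1
        if y_true != y_pred:
            errors[cohort] += 1
    return {c: errors[c] / totals[c] for c in totals}

# Hypothetical evaluation results, grouped by an age-bucket feature.
data = [
    ("young", 1, 1), ("young", 0, 0), ("young", 1, 1), ("young", 0, 1),
    ("old",   1, 0), ("old",   0, 1), ("old",   1, 1), ("old",   0, 0),
]
rates = cohort_error_rates(data)
# rates["young"] == 0.25 and rates["old"] == 0.5: the "old" cohort
# errs twice as often, flagging it for closer inspection.
```

The actual toolkit automates this over decision-tree-derived cohorts and renders the results interactively, rather than requiring a manual cohort key.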
What are some alternatives?
shapley - The official implementation of "The Shapley Value of Classifiers in Ensemble Games" (CIKM 2021).
EthicML - Package for evaluating the performance of methods which aim to increase fairness, accountability and/or transparency
captum - Model interpretability and understanding for PyTorch
CARLA - CARLA: A Python Library to Benchmark Algorithmic Recourse and Counterfactual Explanation Algorithms
Lime-For-Time - Application of the LIME algorithm by Marco Tulio Ribeiro, Sameer Singh, Carlos Guestrin to the domain of time series classification
awesome-shapley-value - Reading list for "The Shapley Value in Machine Learning" (IJCAI 2022)
LIME - Tutorial notebooks on explainable Machine Learning with LIME (Original work: https://arxiv.org/abs/1602.04938)
jupyter-annotate - Interactive Text Annotation for Jupyter Notebook/Lab
catboost - A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for ranking, classification, regression and other machine learning tasks for Python, R, Java, C++. Supports computation on CPU and GPU.
simple-data-analysis - Easy-to-use JavaScript library for most common data analysis tasks. [Moved to: https://github.com/nshiab/simple-data-analysis.js]
interpret - Fit interpretable models. Explain blackbox machine learning.
cumulocity-app-builder - An open-source, no-code tool for building web applications inside Cumulocity, with no coding required. Created by Global Presales.