| | deepchecks | AIX360 |
|---|---|---|
| Mentions | 15 | 2 |
| Stars | 3,373 | 1,533 |
| Growth | 2.3% | 2.0% |
| Latest commit | 15 days ago | 2 months ago |
| Activity | 8.2 | 8.2 |
| Language | Python | Python |
| License | GNU General Public License v3.0 or later | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
deepchecks
- Detect, Defend, Prevail: Payments Fraud Detection using ML & Deepchecks
Also, if you have any questions about it, you can go directly to their discussions section on GitHub:
- Deepchecks: Open-source ML testing and validation library
- Deepchecks' New Open Source is on Product Hunt, and Needs Your Help
GitHub for Deepchecks: https://github.com/deepchecks/deepchecks
- [D] DL Practitioners, Do You Use Layer Visualization Tools s.a GradCam in Your Process?
- Data Validation tools
I use DeepChecks for my continuous training pipelines. You can check out the Data Integrity Checks.
- Deepchecks
- deepchecks: Test Suites for Validating ML Models & Data. Deepchecks is a Python package for comprehensively validating your machine learning models and data with minimal effort.
- QA help comes in many forms: Sometimes, from your heavily funded competitor
- Deepchecks: An open-source tool for testing machine learning models and data
- Test suites for machine learning models in Python (New OSS package)
And if you liked the project, we'll be delighted to count you as one of our stargazers at https://github.com/deepchecks/deepchecks/stargazers!
AIX360
- [D] DL Practitioners, Do You Use Layer Visualization Tools s.a GradCam in Your Process?
- [R] Explaining the Explainable AI: A 2-Stage Approach - Link to a free online lecture by the author in comments
One Explanation Does Not Fit All: A Toolkit and Taxonomy of AI Explainability Techniques https://arxiv.org/abs/1909.03012 https://github.com/Trusted-AI/AIX360
What are some alternatives?
great_expectations - Always know what to expect from your data.
AIF360 - A comprehensive set of fairness metrics for datasets and machine learning models, explanations for these metrics, and algorithms to mitigate bias in datasets and models.
evidently - Evaluate and monitor ML models from validation to production. Join our Discord: https://discord.com/invite/xZjKRaNp8b
explainable-cnn - 📦 PyTorch based visualization package for generating layer-wise explanations for CNNs.
model-validation-toolkit - Model Validation Toolkit is a collection of tools to assist with validating machine learning models prior to deploying them to production and monitoring them after deployment to production.
cleverhans - An adversarial example library for constructing attacks, building defenses, and benchmarking both
feast - Feature Store for Machine Learning
DiCE - Generate Diverse Counterfactual Explanations for any machine learning model.
postgresml - The GPU-powered AI application database. Get your app to market faster using the simplicity of SQL and the latest NLP, ML + LLM models.
awesome-shapley-value - Reading list for "The Shapley Value in Machine Learning" (IJCAI 2022)
giskard - 🐢 Open-Source Evaluation & Testing framework for LLMs and ML models
backpack - BackPACK - a backpropagation package built on top of PyTorch which efficiently computes quantities other than the gradient.