Is there another way to determine the effect of features other than built-in feature importance and SHAP values? [Research] [Discussion]

This page summarizes the projects mentioned and recommended in the original post on /r/MachineLearning

  • interpretable-ml-book

    Book about interpretable machine learning

  • Yes, there are many techniques beyond the two you listed. I suggest doing a survey of techniques (hint: explainable AI, or XAI), starting with the following book: Interpretable Machine Learning. A short code sketch of two such techniques follows this list.

  • stat_rethinking_2022

    Statistical Rethinking course winter 2022

  • I would recommend the lectures and book of Statistical Rethinking: https://github.com/rmcelreath/stat_rethinking_2022

  • stat_rethinking_2023

    Statistical Rethinking Course for Jan-Mar 2023

  • The 2023 version is currently in progress: https://github.com/rmcelreath/stat_rethinking_2023
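To make the first answer concrete, here is a minimal sketch of two model-agnostic techniques covered in Interpretable Machine Learning: permutation importance and partial dependence, computed with scikit-learn. The random-forest model and synthetic dataset are illustrative assumptions, not something taken from the original thread.

```python
# Minimal sketch (illustrative, not from the original thread): two
# model-agnostic alternatives to built-in feature importance and SHAP values.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import partial_dependence, permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic regression data; only a few of the 8 features are informative.
X, y = make_regression(n_samples=500, n_features=8, n_informative=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Permutation importance: how much the held-out score drops when one
# feature's values are shuffled, averaged over several shuffles.
perm = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for i in perm.importances_mean.argsort()[::-1]:
    print(f"feature {i}: {perm.importances_mean[i]:.3f} +/- {perm.importances_std[i]:.3f}")

# Partial dependence: the model's average prediction as feature 0 is varied
# over a grid, showing the direction and shape of its effect.
pdp = partial_dependence(model, X_test, features=[0])
print(pdp["average"].shape)  # (1, n_grid_points)
```

Permutation importance reports how much predictive performance depends on each feature, while the partial dependence curve shows the direction and shape of a feature's effect on the prediction rather than just its magnitude.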

