Is there another way to determine the effect of features other than built-in feature importance and SHAP values? [Research] [Discussion]

This page summarizes the projects mentioned and recommended in the original post on /r/MachineLearning

  • interpretable-ml-book

    Book about interpretable machine learning

    Yes, there are many techniques beyond the two you listed. I suggest doing a survey of techniques (hint: explainable AI or XAI), starting with the following book: Interpretable Machine Learning.

  • stat_rethinking_2022

    Statistical Rethinking course winter 2022

    I would recommend the Statistical Rethinking lectures and book: https://github.com/rmcelreath/stat_rethinking_2022

  • stat_rethinking_2023

    Statistical Rethinking Course for Jan-Mar 2023

    The 2023 version is currently in progress: https://github.com/rmcelreath/stat_rethinking_2023
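
One concrete XAI technique from the recommended Interpretable Machine Learning book is permutation importance: shuffle one feature column at a time and measure how much the model's held-out score drops. A minimal sketch using scikit-learn (the synthetic dataset and random-forest model here are illustrative assumptions, not from the original thread):

```python
# Hedged sketch: permutation importance as one alternative to built-in
# feature importances and SHAP. Dataset and model choice are assumptions.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic regression problem: 5 features, only 2 actually informative.
X, y = make_regression(n_samples=500, n_features=5, n_informative=2,
                       random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# Shuffle each feature column n_repeats times on the test set and record
# the drop in score; a large drop means the model relies on that feature.
result = permutation_importance(model, X_test, y_test,
                                n_repeats=10, random_state=0)

for i in result.importances_mean.argsort()[::-1]:
    print(f"feature {i}: {result.importances_mean[i]:.3f} "
          f"+/- {result.importances_std[i]:.3f}")
```

Unlike a tree ensemble's built-in impurity-based importances, this is model-agnostic and computed on held-out data, so it reflects what the model actually uses for prediction rather than how the training splits were chosen.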

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.
