
Add more methods to the feature importance page #26

Open
aperloff opened this issue Jul 25, 2022 · 0 comments

Comments

@aperloff
Contributor

Based on a CMSTalk conversation, there are several feature-importance methods we may wish to add.

Methods:

  • Taylor expansion (usually up to second order) of the NN output function w.r.t. the input features. This can be done during training (to learn when a feature becomes important) or after training (to learn which features are most important). Already discussed during a journal club. This presentation contains a Jupyter example. The method is described in this paper.
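To make the Taylor-expansion idea concrete, here is a minimal first-order sketch using NumPy with a hypothetical two-layer network (the weights and the finite-difference gradient are illustrative assumptions, not the method from the paper or presentation): the magnitude of the first-order Taylor coefficient of the NN output with respect to each input feature serves as a local importance score.

```python
import numpy as np

# Hypothetical two-layer network, for illustration only.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # 4 input features -> 8 hidden units
W2 = rng.normal(size=(8, 1))   # 8 hidden units -> 1 output

def nn(x):
    """NN output f(x) with a tanh hidden layer."""
    return np.tanh(x @ W1) @ W2

def first_order_sensitivity(x, eps=1e-5):
    """Estimate |df/dx_i| at x by central finite differences.

    The first-order Taylor coefficient measures how strongly the
    output responds to feature i near this particular input point.
    """
    grads = np.zeros(x.shape[-1])
    for i in range(x.shape[-1]):
        xp, xm = x.copy(), x.copy()
        xp[i] += eps
        xm[i] -= eps
        grads[i] = (nn(xp) - nn(xm)) / (2 * eps)
    return np.abs(grads)

x = rng.normal(size=4)
sens = first_order_sensitivity(x)
ranking = np.argsort(sens)[::-1]   # most sensitive feature first
print(ranking)
```

In practice one would use the framework's autodiff (e.g. `tf.GradientTape` or `torch.autograd`) rather than finite differences, and average the per-event scores over a dataset; the extension to second order replaces the gradient with the Hessian.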

Tools section:

  • Several methods are encapsulated in the innvestigate package. Andrzej Novak recommends the "integrated gradients" method.
  • The shap module can be used to obtain a ranking of the input features. The method is based on Shapley values, which originate in game theory and are computed by integrating out subsets of the input features. This is somewhat similar to the "recursive feature elimination" method that is already mentioned in the documentation.
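To illustrate what the shap module computes, here is a minimal from-scratch sketch of exact Shapley values for a tiny hypothetical model (the model, baseline, and brute-force subset enumeration are assumptions for illustration; shap itself uses efficient approximations such as KernelSHAP or TreeSHAP). Features outside a coalition are "integrated out" here by the common simplification of fixing them to a baseline value.

```python
import numpy as np
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley values by enumerating all feature coalitions.

    phi_i = sum over subsets S not containing i of
            |S|! (n - |S| - 1)! / n!  *  [f(S + {i}) - f(S)],
    where absent features are held at the baseline.
    """
    n = len(x)
    phi = np.zeros(n)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in combinations(others, k):
                xs = baseline.copy()
                xs[list(S)] = x[list(S)]       # coalition S present
                x_si = xs.copy()
                x_si[i] = x[i]                 # coalition S + {i}
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi[i] += weight * (f(x_si) - f(xs))
    return phi

# Hypothetical linear model: for f(x) = w.x with a zero baseline,
# the Shapley value of feature i is exactly w_i * x_i.
w = np.array([2.0, -1.0, 0.5])
f = lambda x: float(w @ x)
x = np.array([1.0, 3.0, -2.0])
phi = shapley_values(f, x, baseline=np.zeros(3))
print(phi)   # ranking features by |phi_i| gives their importance
```

The brute-force enumeration costs O(2^n) model evaluations, which is why shap's approximations matter for realistic feature counts; the connection to recursive feature elimination is that both probe the model's response to removing features, but Shapley values average over all removal orders rather than following a single greedy one.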