Add Primary Attribution support for BERT classification #64
Hey Jay, do I understand correctly that it is currently not possible to use Ecco with custom-trained BERT models? Or is it possible through a custom model config, or by passing the model as a PyTorch model?
It's possible to use custom BERT models for neuron activation capture/factorization and CCA, but not yet for Primary Attribution.
Good to hear that it is possible! I assume I will need to use the LM class instead of from_pretrained().
It would be easy to implement this for MLMs, since classification requires only a single forward step. We would just need a new visualization on the JS side :)
Primary attribution is currently supported for causal (GPT) models and enc-dec (T5/T0) models. It would be great to add support for MLM models. A good first implementation would target either BERT or RoBERTa. My sense is that sequence classification takes precedence, then token classification:
- BertForSequenceClassification
- RobertaForSequenceClassification
Captum's Interpreting BERT Models guide is a good place to look.
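The core technique in that guide, integrated gradients, can be illustrated without any Ecco, Captum, or BERT dependencies. The toy classifier below is purely illustrative (a linear score standing in for a BertForSequenceClassification logit); the point is the path-integral approximation and its completeness property, which is what Captum's LayerIntegratedGradients computes over real embedding layers:

```python
# Toy sketch of integrated gradients, the method behind Captum's
# LayerIntegratedGradients. A linear score stands in for a real
# classifier logit; all names here are illustrative.

def logit(x, w, b):
    # Classifier score before any softmax/sigmoid.
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def grad_logit(point, w):
    # For a linear score, the gradient w.r.t. the input is just w.
    return list(w)

def integrated_gradients(x, baseline, w, b, steps=100):
    # Riemann-sum approximation of the path integral from baseline to x.
    attrs = [0.0] * len(x)
    for k in range(1, steps + 1):
        alpha = k / steps
        point = [bi + alpha * (xi - bi) for xi, bi in zip(x, baseline)]
        g = grad_logit(point, w)
        for i in range(len(x)):
            attrs[i] += g[i] / steps
    # Scale accumulated gradients by the input difference.
    return [(xi - bi) * ai for xi, bi, ai in zip(x, baseline, attrs)]

x = [1.0, -2.0, 0.5]
baseline = [0.0, 0.0, 0.0]
w, b = [0.3, -0.1, 0.8], 0.2
attrs = integrated_gradients(x, baseline, w, b)
# Completeness axiom: attributions sum to logit(x) - logit(baseline).
print(sum(attrs))  # → 0.9 (exact here, since the gradient is constant)
```

For a real BERT model the same completeness check is a useful sanity test that attributions were computed against the right layer and baseline.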
As far as how to fit it into Ecco, these are initial ideas:
- A head config parameter
- A .classify() method in lm.py ("predict" to match the familiar sklearn API, or "classify" to be more specific, as generation is also prediction)
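One possible shape for such a method, purely as a discussion aid: everything below (the classify name, the stub model/tokenizer, the return dict) is a hypothetical sketch, not Ecco's actual API, with stand-ins so it runs without transformers installed:

```python
# Hypothetical sketch of a classify() method for ecco's lm.py.
# The names (classify, StubClassifier, StubTokenizer) are assumptions
# drawn from this thread, not Ecco's real API. The stubs stand in for
# a BertForSequenceClassification checkpoint and its tokenizer.

class StubTokenizer:
    def __call__(self, text):
        # Fake token ids: one id per whitespace-separated token.
        return {"input_ids": [len(tok) % 100 for tok in text.split()]}

class StubClassifier:
    labels = ["NEGATIVE", "POSITIVE"]

    def forward(self, input_ids):
        # Fake logits: parity of the id sum decides the class.
        score = float(sum(input_ids) % 2)
        return [1.0 - score, score]

class LM:
    def __init__(self, model, tokenizer):
        self.model, self.tokenizer = model, tokenizer

    def classify(self, text):
        # Single forward step: tokenize, score, argmax.
        ids = self.tokenizer(text)["input_ids"]
        logits = self.model.forward(ids)
        label_id = max(range(len(logits)), key=lambda i: logits[i])
        return {"label": self.model.labels[label_id], "logits": logits}

lm = LM(StubClassifier(), StubTokenizer())
result = lm.classify("ecco is great")
print(result["label"])
```

A real implementation would run the model once, keep the logits for the saliency visualization, and hand the embeddings to the attribution backend; the sketch only fixes the call shape being discussed.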