SEATv2

This is our implementation of Towards Stable and Explainable Attention Mechanisms, a new version of SEAT.

A prior conference version was published at AAAI'23 as SEAT: Stable and Explainable Attention. We provide detailed environment setup instructions and scripts for quickly running our experiments.

Environment Setup

  1. We use Python 3.6 in our experiments. Please use the following command to install the dependencies:
pip install -r ./attention/requirements.txt
  2. Preprocess the dataset.
  3. Export the current directory to PYTHONPATH (see the sketch after this list).
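
Concretely, the setup might look like the following sketch (the preprocessing entry point is not named in this README, so that step is left as a comment):

# 1. Install dependencies (Python 3.6)
pip install -r ./attention/requirements.txt
# 2. Preprocess the dataset with the preprocessing script shipped in this repository
# 3. Put the repository root on PYTHONPATH so the modules resolve
export PYTHONPATH="$PYTHONPATH:$(pwd)"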

Run Our Main Experiments

1. Vanilla Attention

First, train the baseline models; the model checkpoint and attention scores will then be saved in the output directory:

python ./attention/train.py --dataset sst --data_dir . --output_dir ./outputs/ --attention tanh --encoder simple-rnn --exp_name baseline --train_mode std_train --bsize 32 --n_epoch 20  --seed 2 --lr 0.01
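
To train the baseline across several random seeds, a simple sweep could look like this (a sketch that only reuses the flags shown above; the seed values are illustrative):

for seed in 1 2 3; do
  python ./attention/train.py --dataset sst --data_dir . --output_dir ./outputs/ \
    --attention tanh --encoder simple-rnn --exp_name baseline \
    --train_mode std_train --bsize 32 --n_epoch 20 --seed "$seed" --lr 0.01
done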

2. Other Baseline Methods and Ours

SEAT (Ours)

python ./attention/train.py --dataset sst --data_dir . --output_dir ./outputs/ --attention tanh \
--encoder simple-rnn \
--exp_name baseline --lambda_1 1 --lambda_2 1000 --pgd_radius 0.001 --x_pgd_radius 0.01 \
--K 7 --seed 2 --train_mode adv_train --bsize 32 --n_epoch 20 --lr 0.01  --method ours \
--eval_baseline
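
To probe SEAT's sensitivity to the perturbation budget, one could sweep the PGD radius (a sketch; the radius values below are illustrative, not tuned settings from the paper):

for r in 0.0005 0.001 0.002; do
  python ./attention/train.py --dataset sst --data_dir . --output_dir ./outputs/ \
    --attention tanh --encoder simple-rnn --exp_name baseline \
    --lambda_1 1 --lambda_2 1000 --pgd_radius "$r" --x_pgd_radius 0.01 \
    --K 7 --seed 2 --train_mode adv_train --bsize 32 --n_epoch 20 --lr 0.01 \
    --method ours --eval_baseline
done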

Other Baseline Methods

  • Please replace the --method slot with a choice from ['word-at', 'word-iat', 'attention-iat', 'attention-at', 'attention-rp'] to evaluate the baseline methods, as in the loop sketched below.
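
For instance, all five baselines can be evaluated in one pass with a loop like this (a sketch; it only substitutes the --method value in the command above):

for m in word-at word-iat attention-iat attention-at attention-rp; do
  python ./attention/train.py --dataset sst --data_dir . --output_dir ./outputs/ \
    --attention tanh --encoder simple-rnn --exp_name baseline \
    --lambda_1 1 --lambda_2 1000 --pgd_radius 0.001 --x_pgd_radius 0.01 \
    --K 7 --seed 2 --train_mode adv_train --bsize 32 --n_epoch 20 --lr 0.01 \
    --method "$m" --eval_baseline
done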

Code Implementation References

Citation

@inproceedings{hu2023seat,
  title={{SEAT}: Stable and Explainable Attention},
  author={Hu, Lijie and Liu, Yixin and Liu, Ninghao and Huai, Mengdi and Sun, Lichao and Wang, Di},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={37},
  number={11},
  pages={12907--12915},
  year={2023}
}
