# Seq2Seq with Attention Mechanism

This repository implements a Seq2Seq model with Bahdanau attention and Luong attention.
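The two variants differ mainly in how the decoder state is scored against the encoder outputs. Below is a minimal PyTorch-style sketch of the Bahdanau (additive) score, for reference only; the class name, layer names, and shapes are assumptions and are not taken from `modules/module1.py`.

```python
import torch
import torch.nn as nn

class BahdanauAttention(nn.Module):
    """Additive attention: score(s, h) = v^T tanh(W_s s + W_h h). Illustrative sketch."""

    def __init__(self, hidden_size):
        super().__init__()
        self.W_s = nn.Linear(hidden_size, hidden_size, bias=False)  # projects the decoder state
        self.W_h = nn.Linear(hidden_size, hidden_size, bias=False)  # projects the encoder outputs
        self.v = nn.Linear(hidden_size, 1, bias=False)               # reduces to a scalar score

    def forward(self, decoder_state, encoder_outputs):
        # decoder_state: (batch, hidden), encoder_outputs: (batch, src_len, hidden)
        scores = self.v(torch.tanh(
            self.W_s(decoder_state).unsqueeze(1) + self.W_h(encoder_outputs)
        )).squeeze(-1)                                # (batch, src_len)
        weights = torch.softmax(scores, dim=-1)       # attention distribution over source positions
        context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)  # (batch, hidden)
        return context, weights
```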

## Datasets

## Models

## Data Process

```bash
PYTHONPATH=. python dataprocess/process.py
```

## Unit Tests

- For the loader:

  ```bash
  PYTHONPATH=. python loaders/loader1.py
  ```

- For the module (the three Luong align methods are sketched below):

  ```bash
  # Seq2Seq with Bahdanau attention
  PYTHONPATH=. python modules/module1.py --attention_type bahdanau

  # Seq2Seq with Luong attention and dot alignment
  PYTHONPATH=. python modules/module1.py --attention_type luong --align_method dot

  # Seq2Seq with Luong attention and general alignment
  PYTHONPATH=. python modules/module1.py --attention_type luong --align_method general

  # Seq2Seq with Luong attention and concat alignment
  PYTHONPATH=. python modules/module1.py --attention_type luong --align_method concat
  ```
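For reference, the `--align_method` flag selects one of Luong's three alignment score functions (dot, general, concat). The sketch below is illustrative only, assuming standard PyTorch and a shared hidden size for encoder and decoder; it is not the code in `modules/module1.py`.

```python
import torch
import torch.nn as nn

class LuongScore(nn.Module):
    """Luong alignment scores for the three --align_method choices. Illustrative sketch."""

    def __init__(self, hidden_size, align_method="dot"):
        super().__init__()
        self.align_method = align_method
        if align_method == "general":
            self.W_a = nn.Linear(hidden_size, hidden_size, bias=False)
        elif align_method == "concat":
            self.W_a = nn.Linear(2 * hidden_size, hidden_size, bias=False)
            self.v = nn.Linear(hidden_size, 1, bias=False)

    def forward(self, decoder_state, encoder_outputs):
        # decoder_state: (batch, hidden), encoder_outputs: (batch, src_len, hidden)
        if self.align_method == "dot":        # score = h_enc . s_dec
            return torch.bmm(encoder_outputs, decoder_state.unsqueeze(-1)).squeeze(-1)
        if self.align_method == "general":    # score = h_enc . (W_a s_dec)
            return torch.bmm(encoder_outputs, self.W_a(decoder_state).unsqueeze(-1)).squeeze(-1)
        # concat: score = v^T tanh(W_a [s_dec; h_enc])
        src_len = encoder_outputs.size(1)
        expanded = decoder_state.unsqueeze(1).expand(-1, src_len, -1)
        return self.v(torch.tanh(
            self.W_a(torch.cat([expanded, encoder_outputs], dim=-1))
        )).squeeze(-1)                        # (batch, src_len)
```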

## Main Process

```bash
python main.py
```

You can change the config either on the command line or in the file `utils/parser.py`. For example:

```bash
python main.py --attention_type bahdanau

python main.py --attention_type luong --align_method dot

python main.py --attention_type luong --align_method general

python main.py --attention_type luong --align_method concat
```
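One common way to expose such options is an `argparse` parser. The following is a hypothetical sketch of the kind of flags `utils/parser.py` defines; the flag names (`--attention_type`, `--align_method`) and their values come from the commands above, while the defaults and the helper function name are assumptions.

```python
# Hypothetical sketch of a config parser exposing the flags used above;
# the real utils/parser.py may define different arguments and defaults.
import argparse

def get_parser():
    parser = argparse.ArgumentParser(description="Seq2Seq with Attention")
    parser.add_argument("--attention_type", choices=["bahdanau", "luong"], default="bahdanau")
    parser.add_argument("--align_method", choices=["dot", "general", "concat"], default="dot",
                        help="only used when --attention_type is luong")
    return parser

if __name__ == "__main__":
    args = get_parser().parse_args()
    print(args)
```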