
Bug Report #1

Open
Dielianss opened this issue Oct 26, 2019 · 3 comments

@Dielianss

Dear sir or madam:

Thanks for your time. I have recently been studying your attention-mechanisms code and have found it very helpful. However, when I run document_classification.py it throws the error "'Dimension' object does not support indexing", which seems to come from the Attention(Layer) class. I set the config parameter to 1 and to 2, but the same error appears in both cases, and the traceback points to the line "self.input_sequence_length, self.hidden_dim = input_shape[0][1], input_shape[0][2]" (line 103, layers.py). Could you please help me fix this bug? I would appreciate any help.

Best wishes,
Jie Yang
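
For readers hitting the same TypeError: it usually means the layer received a single input, so `input_shape` is one shape rather than a list of shapes, and `input_shape[0]` is then the batch dimension (a TF 1.x `Dimension` object) that cannot be indexed. Below is a minimal sketch of the kind of guard that avoids this; it is not the repository's code. The `build` method, the import path, and the isinstance check are assumptions, and only the unpacking of `input_sequence_length` / `hidden_dim` comes from the quoted line 103.

```python
from tensorflow.keras.layers import Layer  # import path is an assumption; the repo may use standalone keras


class Attention(Layer):
    def build(self, input_shape):
        # Keras passes a *list* of shapes only when the layer has several
        # inputs. With a single input, input_shape is one shape, so
        # input_shape[0] is the batch dimension and indexing it raises
        # "'Dimension' object does not support indexing".
        shape = input_shape[0] if isinstance(input_shape, list) else input_shape
        # int() turns TF 1.x Dimension entries into plain Python ints;
        # on very old TF versions, shape[1].value is the equivalent.
        self.input_sequence_length = int(shape[1])
        self.hidden_dim = int(shape[2])
        super(Attention, self).build(input_shape)
```

If the repository pins specific keras / tensorflow versions in its requirements, installing those exact versions is another way to avoid the shape-format mismatch.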

@jain-harshil

Facing the same issue

@myvitriol

How do you deal with this issue?

@karimarwah

Same issue here. How can it be solved?
