Dear sir or madam:
Thanks for your time. I have recently been studying your attention-mechanisms code and have found it very helpful. However, when I run document_classification.py it throws the error: 'Dimension' object does not support indexing. The problem appears to be in the Attention(Layer) class. I set the config parameter to 1 and 2 respectively, but the same error occurs both times, and the traceback points to the line "self.input_sequence_length, self.hidden_dim = input_shape[0][1], input_shape[0][2]" (line 103, layers.py). Could you please help me fix this bug? I would really appreciate it.
Best wishes,
Jie Yang
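
For reference, this error usually means that build() received a single TensorShape rather than a list of shapes, so input_shape[0] is a Dimension (the batch axis) and indexing it again fails. Below is a minimal sketch of one possible workaround; it only reuses the class and attribute names quoted above and is not taken from the actual layers.py, so treat it as a guess rather than the project's fix.

```python
import tensorflow as tf
from tensorflow.keras.layers import Layer


class Attention(Layer):
    def build(self, input_shape):
        # build() may receive either a list of shapes (multiple inputs) or a
        # single shape. In TF 1.x, indexing a TensorShape once yields a
        # Dimension, and indexing that Dimension again raises
        # "'Dimension' object does not support indexing".
        if isinstance(input_shape, (list, tuple)) and isinstance(
                input_shape[0], (list, tuple, tf.TensorShape)):
            shape = input_shape[0]
        else:
            shape = input_shape
        # as_list() converts any Dimension entries to plain Python ints.
        shape = tf.TensorShape(shape).as_list()
        self.input_sequence_length, self.hidden_dim = shape[1], shape[2]
        super(Attention, self).build(input_shape)
```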