
got 0 ops for nn.MultiheadAttention #197

Open

wangtiance opened this issue Jan 19, 2023 · 7 comments

@wangtiance
My thop version: 0.1.1

Minimal code to replicate:

import torch
import torch.nn as nn
import thop

f = nn.MultiheadAttention(100, 1, batch_first=True)
x = torch.ones((1, 1000, 100))

result = thop.profile(f, (x, x, x))
print(result)

I got (0.0, 0). Is this module not supported yet?

@wangtiance
Author

Does it only support the modules listed here?

register_hooks = {
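(A possible workaround until this is built in: thop.profile accepts a custom_ops mapping from module type to a counting hook, so you can supply your own counter for nn.MultiheadAttention. Below is a minimal sketch; the count_mha helper and its MAC formula (q/k/v input projections, the two attention matmuls, and the output projection) are my own assumptions, not thop's built-in counting, so check them against your use case.)

import torch
import torch.nn as nn
import thop

def count_mha(m, x, y):
    # x is the tuple of forward args: (query, key, value)
    q, k = x[0], x[1]
    if m.batch_first:          # shapes are (B, L, E)
        B, Lq, E = q.shape
        Lk = k.shape[1]
    else:                      # shapes are (L, B, E)
        Lq, B, E = q.shape
        Lk = k.shape[0]
    macs = B * (Lq + 2 * Lk) * E * E   # q/k/v input projections
    macs += B * 2 * Lq * Lk * E        # QK^T and attn @ V
    macs += B * Lq * E * E             # output projection
    m.total_ops += torch.DoubleTensor([int(macs)])

f = nn.MultiheadAttention(100, 1, batch_first=True)
x = torch.ones((1, 1000, 100))
macs, params = thop.profile(f, (x, x, x), custom_ops={nn.MultiheadAttention: count_mha})
print(macs, params)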

@CaA23187

CaA23187 commented Mar 7, 2023

You are right. I guess thop doesn't support the MHSA layer yet.

@quancs

quancs commented Mar 16, 2023

Facing the same problem.

@HaoKang-Timmy
Contributor

You may use https://github.com/HaoKang-Timmy/torchanalyse or torchprofile for NLP models. I have tried both, and they can profile transformers.
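(For anyone trying the suggestion above, a minimal torchprofile call would look like the sketch below. This assumes torchprofile's profile_macs entry point, which traces the model with TorchScript rather than relying on per-module hooks; verify the API against the version you install.)

import torch
import torch.nn as nn
from torchprofile import profile_macs

f = nn.MultiheadAttention(100, 1, batch_first=True)
x = torch.ones((1, 1000, 100))

# profile_macs takes the model and a tuple of forward args
macs = profile_macs(f, (x, x, x))
print(macs)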

@CaA23187

> You may use https://github.com/HaoKang-Timmy/torchanalyse or torchprofile for NLP models

Thank you for your reply. I'll try it!

@cooma04

cooma04 commented Apr 8, 2024

It seems thop does not account for parameters created directly with nn.Parameter; for those it simply yields 0.
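(If that is the behavior, a tiny leaf module holding only a bare nn.Parameter should reproduce it. The Scale module below is hypothetical, just for illustration; the expected (0.0, 0) output is based on the observation above, since thop has no hook registered for an unknown module type.)

import torch
import torch.nn as nn
import thop

class Scale(nn.Module):
    # a leaf module whose only state is a bare nn.Parameter
    def __init__(self, dim):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(dim))

    def forward(self, x):
        return x * self.weight

result = thop.profile(Scale(100), (torch.ones(1, 100),))
print(result)  # reportedly (0.0, 0): no registered hook, so nothing is counted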
