Why is vae decoder so slow? Can you help me? #27
Comments
Seems like most of the delay is actually in synchronization, which implies that the real slowdown is something earlier in the code. PyTorch CUDA ops run asynchronously: each op is queued on the GPU while the Python side moves on, so whichever call eventually synchronizes gets blamed for the time spent by everything queued before it. Since synchronization is what is taking the longest, try synchronizing before the decode and then timing the decode step on its own, to confirm it isn't something else.
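A minimal timing sketch of that suggestion; `vae`, `latents`, and the `vae.decode(...)` call are placeholders for whatever this pipeline actually uses:

```python
import time
import torch

# Placeholder handles: `vae` is the autoencoder module, `latents` the latent
# tensor produced by the earlier pipeline steps.
torch.cuda.synchronize()              # flush all work queued before the decode
t0 = time.perf_counter()

with torch.no_grad():
    image = vae.decode(latents)       # queued asynchronously on the GPU

torch.cuda.synchronize()              # wait for the decode itself to finish
print(f"decode alone: {time.perf_counter() - t0:.3f}s")
```

If the number printed here is small, the time was being spent in earlier ops and only surfaced at the next synchronization point.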
Actually, it could be because you may have set autoencoder offloading to true; in that case the slowdown could be moving the VAE to the GPU, decoding, and then moving the VAE back to the CPU.
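Roughly what the offload path costs per call, as a sketch; the function and flag names here are illustrative, not this repo's actual API:

```python
import torch

def decode_with_offload(vae, latents, offload=True):
    # Illustrative only: shows the two extra transfers that offloading adds.
    if offload:
        vae.to("cuda")                # host-to-device copy of every VAE weight
    with torch.no_grad():
        image = vae.decode(latents)
    if offload:
        vae.to("cpu")                 # device-to-host copy after each decode
        torch.cuda.empty_cache()
    return image

# Keeping the VAE resident on the GPU skips both transfers:
# vae.to("cuda"); image = decode_with_offload(vae, latents, offload=False)
```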
I tried not offloading the autoencoder, but the decoder is still just as slow. Most of the decoding time is spent in the upsampling; it takes 4 to 5 seconds. My test machine is an L4.
I'm not entirely sure what the slowdown would be, though an L4 has pretty low power limits, so it might be related to throttling because of those limits. I would check the clock speeds while it's decoding and see whether they drop significantly.
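One way to watch for that while the decode runs, sketched with NVML (run it in a second terminal or thread, or just use `nvidia-smi -l 1`); the L4's board power limit is around 72 W, which is what makes throttling plausible here:

```python
import time
import pynvml

# Sample the SM clock and power draw once a second while the decoder is running.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

for _ in range(10):
    sm_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # reported in mW
    print(f"SM clock: {sm_clock} MHz, power: {power_w:.1f} W")
    time.sleep(1)

pynvml.nvmlShutdown()
```

If the SM clock sags while power sits at the limit, the decode is power-throttled rather than bottlenecked by the code itself.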