RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn #23
Manually setting loss.requires_grad=True (fp16) reveals this:
However, in fp32 (mixed_precision="no") the training runs, but loss.requires_grad is False on each iteration.
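For anyone reproducing this, a quick diagnostic right after the loss is computed makes the failure mode visible (a sketch; the commented-out loss computation and the model_pred/target names are assumptions about what the training script does):

```python
import torch

def check_loss_graph(loss: torch.Tensor) -> None:
    # Failing run:  requires_grad=False, grad_fn=None, is_leaf=True
    #   -> the loss is detached, and loss.backward() raises
    #      "element 0 of tensors does not require grad and does not have a grad_fn"
    # Healthy run:  requires_grad=True, grad_fn set, is_leaf=False
    print(f"requires_grad={loss.requires_grad} "
          f"grad_fn={loss.grad_fn} is_leaf={loss.is_leaf}")

# Usage inside the training loop:
# loss = torch.nn.functional.mse_loss(model_pred.float(), target.float())
# check_loss_graph(loss)
```

Note that forcing loss.requires_grad = True does not repair the graph; it only lets backward() run on a detached leaf, so no gradients ever reach the trainable parameters.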
So it took a little time, but I traced this to setting gradient_checkpointing: True, and only for the text_encoder. The text_encoder's checkpointing method has a different name than the UNet's, since it comes from the Transformers library rather than Diffusers. Either way, it does something that makes loss.is_leaf True, i.e. the loss ends up detached from the autograd graph. For now, I disable gradient_checkpointing for the text_encoder.
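For reference, the two APIs really are named differently; a minimal sketch (the checkpoint name is illustrative, assuming a standard Stable Diffusion layout):

```python
from diffusers import UNet2DConditionModel
from transformers import CLIPTextModel

model_id = "runwayml/stable-diffusion-v1-5"  # illustrative checkpoint
unet = UNet2DConditionModel.from_pretrained(model_id, subfolder="unet")
text_encoder = CLIPTextModel.from_pretrained(model_id, subfolder="text_encoder")

# Diffusers models expose enable_gradient_checkpointing():
unet.enable_gradient_checkpointing()

# Transformers models expose gradient_checkpointing_enable():
text_encoder.gradient_checkpointing_enable()

# The workaround described above: turn checkpointing back off
# for the text encoder only.
text_encoder.gradient_checkpointing_disable()
```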
Running with LoRA restricted to the text_encoder, with no UNet training, produces the error in the title.
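A plausible explanation, not confirmed in this thread: with gradient checkpointing on and only LoRA adapter weights trainable, the hidden states entering the checkpointed blocks have requires_grad=False, so the recomputed outputs carry no grad_fn and the loss comes out detached. Transformers ships a helper for exactly this situation; a sketch of applying it (checkpoint name again illustrative):

```python
from transformers import CLIPTextModel

text_encoder = CLIPTextModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="text_encoder"
)
text_encoder.gradient_checkpointing_enable()

# Hooks the input embeddings so their outputs require grad, keeping the
# checkpointed blocks attached to the autograd graph even when only the
# LoRA parameters are trainable.
text_encoder.enable_input_require_grads()
```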