
Fix FSDP for GPTQ-LoRA and Fused Ops #1

Closed

Conversation

fabianlim
Owner

As foundation-model-stack#15 mentions, there is a casting issue when using GPTQ-LoRA and fused ops.

This issue occurs in fused ops because we bypass the base layer's forward, which is where we introduced a reinterpret cast in the above-mentioned issue:

  • To resolve this, we instead patch the fused ops functions directly.
  • We change `patch_forward_to_view_attributes_before_call` to allow patching multiple submodules; this is needed for fused ops because these forwards trigger on the attention module, not on the linear modules directly (see the sketch after this list).
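
For reference, the sketch below illustrates one way such a multi-submodule patch could look. It is an assumption of the general approach, not the repository's actual implementation; the parameter names (`attribute_names`, `submodule_names`, `torch_dtype`) and the exact signature are illustrative.

```python
# Minimal sketch (assumed, not the repo's actual code): wrap a module's
# forward so that selected tensor attributes on selected submodules are
# reinterpreted to the expected dtype before the original forward runs.
import functools
import torch


def patch_forward_to_view_attributes_before_call(
    module: torch.nn.Module,
    attribute_names: list,
    torch_dtype: torch.dtype,
    submodule_names: list = None,
):
    if submodule_names is None:
        # No submodules given: patch attributes on the module itself.
        submodule_names = [""]

    old_forward = module.forward

    @functools.wraps(old_forward)
    def _forward(*args, **kwargs):
        for name in submodule_names:
            sub = module.get_submodule(name) if name else module
            for attr in attribute_names:
                tensor = getattr(sub, attr, None)
                # Reinterpret the raw storage without copying, so buffers
                # (e.g. quantized weights) that FSDP flattened to a different
                # dtype are usable again. Works for plain tensors/buffers.
                if isinstance(tensor, torch.Tensor) and tensor.dtype != torch_dtype:
                    setattr(sub, attr, tensor.view(torch_dtype))
        return old_forward(*args, **kwargs)

    module.forward = _forward
    return module
```

In the fused-ops case described above, `module` would be the attention module and `submodule_names` would name the quantized linear projections it owns, so the views are installed before the fused kernels read those weights.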

@fabianlim fabianlim self-assigned this May 29, 2024
@fabianlim fabianlim closed this May 29, 2024
@fabianlim fabianlim deleted the fix-foak branch May 29, 2024 09:55
@fabianlim
Owner Author

fabianlim commented May 29, 2024

Recreate this PR in #2
