Update get_merged_lora_ckpt for dist checkpoints #2834
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/torchtune/2834
Note: Links to docs will display an error until the docs builds have been completed.
⏳ No Failures, 3 Pending as of commit 9a0a6eb with merge base 9d91fe3.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
torchtune/modules/peft/_utils.py (outdated)

lora_moe_modules = _get_lora_moe_modules(state_dict)

# Create a simple module for matrix multiplication
class MatMulModule(torch.nn.Module):
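The diff shows only the class declaration. A plausible shape of the wrapper under discussion (a reconstructed sketch, not necessarily the PR's actual code) would be:

```python
import torch

# Hypothetical reconstruction: wraps torch.matmul in an nn.Module so the
# product of the LoRA A/B factors is computed through a module call path.
# The motivation discussed in the thread below was DTensor behavior, which
# was later found to be unnecessary.
class MatMulModule(torch.nn.Module):
    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        return torch.matmul(a, b)
```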
Why do we need this instead of just calling matmul directly?
These operations don't work properly on DTensors; that's what was causing the hangs.
"doesn't work properly" - can you expand on that?
OK, actually I think it's not needed. Good catch. I thought it was causing problems, but I just re-tested without it and it still works. I'll get rid of these wrapper modules and just add barriers to the existing method.
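A minimal sketch of what "add barriers to the existing method" could look like (hypothetical; the helper name and exact placement of the barriers are assumptions, not the PR's actual change):

```python
import torch
import torch.distributed as dist

def _merge_lora_weights(base_weight, lora_a, lora_b, scale):
    # Hypothetical sketch: synchronize all ranks before and after the merge
    # so collective operations on DTensor shards don't interleave across
    # ranks, which is the kind of mismatch that can cause hangs.
    if dist.is_available() and dist.is_initialized():
        dist.barrier()
    merged = base_weight + scale * (lora_b @ lora_a)
    if dist.is_available() and dist.is_initialized():
        dist.barrier()
    return merged
```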
torchtune/modules/peft/_utils.py (outdated)

lora_b_weight = state_dict[f"{module}.lora_{param}_b"]

# Create a simple module for transpose operation
class TransposeModule(torch.nn.Module):
Same here: why does this need to be a transpose module?
Branch force-pushed from e5ed645 to 9a0a6eb.
Context
What is the purpose of this PR? Is it to add a new feature, fix a bug, or update tests and/or documentation?
Please link to any issues this PR addresses.
Changelog
What are the changes made in this PR?
Test plan
Please make sure to do each of the following if applicable to your PR. If you're unsure about any one of these, just ask and we will happily help. We also have a contributing page for some guidance on contributing.
- run pre-commit hooks and linters (install via pre-commit install)
- run unit tests via pytest tests
- run recipe tests via pytest tests -m integration_test
UX
If your function changed a public API, please add a dummy example of what the user experience will look like when calling it.
Here is a docstring example and a tutorial example.
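As a hedged illustration of the UX, assuming the current public signature get_merged_lora_ckpt(state_dict, rank, alpha) in torchtune.modules.peft (the key names and shapes below are illustrative, not taken from this PR):

```python
import torch
from torchtune.modules.peft import get_merged_lora_ckpt

# Illustrative state dict with one LoRA-adapted linear layer (rank 4).
# Key names follow the lora_a/lora_b convention used in torchtune; with this
# PR, the values may also be DTensor shards loaded from a distributed
# checkpoint, and the merge should complete without hangs.
sd = {
    "w1.weight": torch.randn(16, 16),
    "w1.lora_a.weight": torch.randn(4, 16),
    "w1.lora_b.weight": torch.randn(16, 4),
}

# Folds (alpha / rank) * lora_b @ lora_a into the base weight and removes
# the adapter keys from the returned checkpoint.
merged = get_merged_lora_ckpt(sd, rank=4, alpha=8.0)
assert "w1.lora_a.weight" not in merged
assert merged["w1.weight"].shape == (16, 16)
```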