
Refactoring FSDP2 (_composable/fsdp) test cases to be device agnostic #149848


Closed

Conversation

AnantGulati
Contributor

@AnantGulati AnantGulati commented Mar 24, 2025

The motivation for this PR is to refactor the existing test cases in the folder test/distributed/_composable/fsdp/, i.e. FSDP2 (as it is referred to in torchtitan), to be device agnostic, so that any accelerator type (e.g. CUDA, HPU, XPU) is supported.

The changes are in line with the previously merged changes for the FSDP test cases (in the folder test/distributed/fsdp/): #139184

cc @H-Huang @awgu @wanchaol @fegin @fduwjj @wz337 @wconstab @d4l3k @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @chenyang78 @kadeng @chauhang @amjames @kwen2501 @c-p-i-o
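As a sketch of the device-agnostic pattern such refactors typically apply, the snippet below replaces a hard-coded device="cuda" with a detected accelerator type. The helper get_device_type is illustrative only (it is an assumption, not necessarily the helper this PR introduces):

```python
import torch


def get_device_type() -> str:
    """Pick whichever accelerator backend is available, falling back to CPU.

    Illustrative helper: instead of hard-coding "cuda" in tests, probe the
    available backends so the same test runs on CUDA, XPU, HPU, or CPU.
    """
    if torch.cuda.is_available():
        return "cuda"
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return "xpu"
    if hasattr(torch, "hpu") and torch.hpu.is_available():
        return "hpu"
    return "cpu"


device_type = get_device_type()

# A refactored test then creates tensors on the detected device rather than
# on a literal "cuda" device:
x = torch.randn(4, 4, device=device_type)
print(x.device.type)
```

The test body stays unchanged; only the device string is parameterized, which is what lets a single test file cover every accelerator backend.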


pytorch-bot bot commented Mar 24, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/149848

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 Cancelled Job

As of commit 36f1694 with merge base 7e16cb9:

CANCELLED JOB - The following job was cancelled. Please retry:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot pytorch-bot bot added the oncall: distributed (Add this issue/PR to distributed oncall triage queue) and topic: not user facing (topic category) labels Mar 24, 2025
@colesbury colesbury requested a review from wconstab March 24, 2025 18:32
@colesbury colesbury added the triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) label Mar 24, 2025
@AnantGulati
Contributor Author

@wconstab Could you please help with this PR?
Thanks

@AnantGulati
Contributor Author

@wconstab @kwen2501 Could you please help with this PR?
Thanks

@AnantGulati AnantGulati changed the title Refactoring FSDP2 (_composable/fsdp) test cases to be device agnostic (1/n) Refactoring FSDP2 (_composable/fsdp) test cases to be device agnostic Apr 14, 2025
@AnantGulati
Contributor Author

@wconstab Could you please help with this review?
Thanks

@AnantGulati
Contributor Author

AnantGulati commented Apr 28, 2025

@wconstab @H-Huang @wanchaol @fegin @kwen2501 @d4l3k
Could you please help with this review?
Thanks

@cyyever
Collaborator

cyyever commented Apr 28, 2025

@pytorchbot rebase

@pytorchmergebot
Collaborator

@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here

@pytorchmergebot
Collaborator

Rebase failed due to Command git -C /home/runner/work/pytorch/pytorch rebase refs/remotes/origin/viable/strict pull/149848/head returned non-zero exit code 1

Rebasing (1/9)
Auto-merging test/distributed/_composable/fsdp/test_fully_shard_grad_scaler.py
CONFLICT (content): Merge conflict in test/distributed/_composable/fsdp/test_fully_shard_grad_scaler.py
Auto-merging test/distributed/_composable/fsdp/test_fully_shard_init.py
error: could not apply 920e06d304d... adding fsdp files
hint: Resolve all conflicts manually, mark them as resolved with
hint: "git add/rm <conflicted_files>", then run "git rebase --continue".
hint: You can instead skip this commit: run "git rebase --skip".
hint: To abort and get back to the state before "git rebase", run "git rebase --abort".
hint: Disable this message with "git config set advice.mergeConflict false"
Could not apply 920e06d304d... adding fsdp files

Raised by https://github.com/pytorch/pytorch/actions/runs/14700946048

@cyyever
Collaborator

cyyever commented Apr 28, 2025

@AnantGulati Please rebase and solve any conflict.
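The hint lines in the rebase log above describe the standard manual resolution flow. As a sketch, here is that flow replayed end to end in a throwaway repository (file and branch names are illustrative stand-ins, not this PR's actual branches or test files):

```shell
#!/bin/sh
set -e

# Build a tiny repo where a feature branch and main both edit the same file,
# so rebasing the feature branch reproduces a merge conflict like the one
# in the log above.
tmp=$(mktemp -d)
cd "$tmp"
git init -q -b main repo
cd repo
git config user.email demo@example.com
git config user.name demo

echo "base" > test_file.py
git add test_file.py
git commit -qm "base"

git checkout -qb feature
echo "feature change" > test_file.py
git commit -qam "feature change"

git checkout -q main
echo "main change" > test_file.py
git commit -qam "main change"

# The rebase stops on the conflicting commit with a nonzero exit code:
git checkout -q feature
git rebase main || true

# Resolve the conflict manually, mark it resolved, and continue, exactly as
# the "git add ... && git rebase --continue" hint instructs:
echo "resolved" > test_file.py
git add test_file.py
GIT_EDITOR=true git rebase --continue

cat test_file.py
```

After the rebase completes, the feature commit sits on top of main with the hand-resolved content; pushing it back to the PR branch would then need a force push (e.g. git push --force-with-lease).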

@AnantGulati
Contributor Author

@pytorchbot rebase

@pytorchmergebot
Collaborator

@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here

@pytorchmergebot
Collaborator

Rebase failed due to Command git -C /home/runner/work/pytorch/pytorch rebase refs/remotes/origin/viable/strict pull/149848/head returned non-zero exit code 1

Rebasing (1/13)
Auto-merging test/distributed/_composable/fsdp/test_fully_shard_grad_scaler.py
CONFLICT (content): Merge conflict in test/distributed/_composable/fsdp/test_fully_shard_grad_scaler.py
Auto-merging test/distributed/_composable/fsdp/test_fully_shard_init.py
error: could not apply 920e06d304d... adding fsdp files
hint: Resolve all conflicts manually, mark them as resolved with
hint: "git add/rm <conflicted_files>", then run "git rebase --continue".
hint: You can instead skip this commit: run "git rebase --skip".
hint: To abort and get back to the state before "git rebase", run "git rebase --abort".
hint: Disable this message with "git config set advice.mergeConflict false"
Could not apply 920e06d304d... adding fsdp files

Raised by https://github.com/pytorch/pytorch/actions/runs/14701438311

@AnantGulati
Contributor Author

@pytorchbot rebase

@pytorchmergebot
Collaborator

@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here

@pytorchmergebot
Collaborator

Rebase failed due to Command git -C /home/runner/work/pytorch/pytorch rebase refs/remotes/origin/viable/strict pull/149848/head returned non-zero exit code 1

Rebasing (1/16)
Auto-merging test/distributed/_composable/fsdp/test_fully_shard_grad_scaler.py
CONFLICT (content): Merge conflict in test/distributed/_composable/fsdp/test_fully_shard_grad_scaler.py
Auto-merging test/distributed/_composable/fsdp/test_fully_shard_init.py
error: could not apply 920e06d304d... adding fsdp files
hint: Resolve all conflicts manually, mark them as resolved with
hint: "git add/rm <conflicted_files>", then run "git rebase --continue".
hint: You can instead skip this commit: run "git rebase --skip".
hint: To abort and get back to the state before "git rebase", run "git rebase --abort".
hint: Disable this message with "git config set advice.mergeConflict false"
Could not apply 920e06d304d... adding fsdp files

Raised by https://github.com/pytorch/pytorch/actions/runs/14701672157

@cyyever
Collaborator

cyyever commented Apr 29, 2025

@AnantGulati You have to rebase manually.

@kwen2501 kwen2501 requested review from weifengpy and mori360 April 29, 2025 06:36
@kwen2501
Contributor

cc @weifengpy @mori360. It is a big change but would be nice to have. Thank you!

Contributor

@kwen2501 kwen2501 left a comment


LGTM! Thanks a lot for the contribution!

@AnantGulati
Contributor Author

@cyyever @kwen2501 All changes are included in the new PR.
Could you please approve the CI flow?
Thanks

@cyyever
Collaborator

cyyever commented Apr 30, 2025

@pytorchbot merge -i

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Apr 30, 2025
@AnantGulati AnantGulati force-pushed the AnantGulati_fsdp2_test_cases branch from baffbdd to 36f1694 Compare May 16, 2025 06:14
@AnantGulati
Contributor Author

@pytorchbot rebase

@pytorchmergebot
Collaborator

@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here

@pytorchmergebot
Collaborator

Tried to rebase and push PR #149848, but it was already up to date. Try rebasing against main by issuing:
@pytorchbot rebase -b main

@AnantGulati
Contributor Author

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label May 16, 2025
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here

@pytorchmergebot
Collaborator

Merge failed

Reason: 1 jobs have failed, first few of them are: Apply lint suggestions

Details for Dev Infra team Raised by workflow job

@AnantGulati
Contributor Author

@pytorchbot merge

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here

@pytorchmergebot
Collaborator

Merge failed

Reason: 1 jobs have failed, first few of them are: Apply lint suggestions

Details for Dev Infra team Raised by workflow job

@cyyever
Collaborator

cyyever commented May 19, 2025

@pytorchbot merge -f "unrelated lint failures"

@pytorchmergebot
Collaborator

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f as last resort and instead consider -i/--ignore-current to continue the merge ignoring current failures. This will allow currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here

Labels
- ciflow/trunk (Trigger trunk jobs on your pull request)
- Merged
- module: dynamo
- oncall: distributed (Add this issue/PR to distributed oncall triage queue)
- open source
- topic: not user facing (topic category)
- triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
7 participants