[Train] Update Lightning RayDDPStrategy docstring #40376
Merged
matthewdeng merged 1 commit into ray-project:master on Oct 18, 2023
Conversation
matthewdeng approved these changes on Oct 18, 2023
Docstring excerpt under review:

    For a full list of initialization arguments, please refer to:
    https://lightning.ai/docs/pytorch/stable/api/lightning.pytorch.strategies.DDPStrategy.html

    Note that `process_group_backend`, `timeout`, and `start_method` are disabled here,
Contributor: We can also parse the args and print a warning if these values are set.
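A minimal sketch of that suggestion, assuming the strategy subclasses Lightning's `DDPStrategy` as `RayDDPStrategy` does; the wrapper name `WarningRayDDPStrategy` is hypothetical, not Ray's actual implementation:

```python
import warnings

from pytorch_lightning.strategies import DDPStrategy

# Arguments that Ray Train configures itself (via TorchConfig), taken
# from the docstring excerpt above.
_DISABLED_ARGS = ("process_group_backend", "timeout", "start_method")


class WarningRayDDPStrategy(DDPStrategy):
    """Hypothetical variant that warns when a disabled argument is set."""

    def __init__(self, *args, **kwargs):
        for name in _DISABLED_ARGS:
            if name in kwargs:
                warnings.warn(
                    f"`{name}` is ignored because Ray Train initializes the "
                    "process group itself. Configure it via "
                    "`ray.train.torch.TorchConfig` instead."
                )
                # Drop the argument so Lightning never sees it.
                kwargs.pop(name)
        super().__init__(*args, **kwargs)
```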
Why are these changes needed?
Ray Train starts the distributed process group with the arguments specified in `TorchConfig` before running the training function. If the process group has already been initialized, Lightning ignores some of the strategy's arguments (e.g., `backend`, `timeout`). We therefore need to point users to specify these arguments in `TorchConfig` instead of `RayDDPStrategy`; see the usage sketch after the issue reference below.
Related issue number
Closes #36315
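For context, a minimal sketch of the intended usage with the public Ray Train APIs (`TorchTrainer`, `TorchConfig`, `ScalingConfig`, `RayDDPStrategy`); the training body is elided. Process-group settings go into `TorchConfig`, not onto the Lightning strategy:

```python
from ray.train import ScalingConfig
from ray.train.torch import TorchConfig, TorchTrainer
from ray.train.lightning import RayDDPStrategy


def train_func():
    # Inside the training function, use RayDDPStrategy without any
    # process-group arguments; Ray Train has already set up the group.
    strategy = RayDDPStrategy()
    ...  # build pl.Trainer(strategy=strategy, ...) and fit the model


trainer = TorchTrainer(
    train_func,
    # Process-group settings (backend, timeout) belong here instead.
    torch_config=TorchConfig(backend="gloo", timeout_s=1800),
    scaling_config=ScalingConfig(num_workers=2),
)
result = trainer.fit()
```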
Checks
- I've signed off every commit (by using the -s flag, i.e., `git commit -s`) in this PR.
- I've run `scripts/format.sh` to lint the changes in this PR.
- If I have added a method in Tune, I've added it in `doc/source/tune/api/` under the corresponding `.rst` file.