Remove old code for PyTorch, Accelerator and tokenizers #37234
Conversation
Hi 👋, thank you for opening this pull request! The pull request is converted to draft by default. The CI will be paused while the PR is in draft mode. When it is ready for review, please click the "Ready for review" button.
(force-pushed from 142363d to 93b4c9a)
Rocketknight1 left a comment:
LGTM, with one comment: are we raising an error anywhere when users run old versions of Torch? I think it's important that they get an error telling them what the problem is; otherwise we'll get lots of issues about failing compilations and missing functions from people who don't understand what went wrong.
(force-pushed from 93b4c9a to 7f51766)
Added a warning message about unsupported PT versions.
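For illustration, a minimal sketch of what such a guard could look like (hypothetical helper name and message; the exact warning merged in this PR may differ):

```python
# Hypothetical sketch, not the literal code merged in this PR.
# Assumes the PyTorch >= 2.1 floor described in this PR.
import logging

from packaging import version

logger = logging.getLogger(__name__)

MIN_TORCH_VERSION = "2.1"

def warn_if_torch_too_old() -> None:
    try:
        import torch
    except ImportError:
        return  # no torch installed; torch-only paths are gated elsewhere
    if version.parse(torch.__version__) < version.parse(MIN_TORCH_VERSION):
        logger.warning(
            "PyTorch %s is older than the minimum supported version %s; "
            "expect missing functions and failing compilations. Please upgrade.",
            torch.__version__,
            MIN_TORCH_VERSION,
        )
```

To make this a hard failure, as the review comment suggests, the logger.warning call would simply become a raise ImportError(...) with the same message.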
@Rocketknight1 I will rebase after it's merged to see whether there is more to delete.
(force-pushed from ff855e6 to cdf9717)
(force-pushed from 66eeb32 to 200a951)
@Rocketknight1 Lots of new commits since the last review.
@ydshieh Follow-up to your clean-up.
(force-pushed from 200a951 to bf8b7bf)
Thanks! Will check. Could you refine the PR title and description 🙏? Seems this PR focuses on …
@ydshieh Ignore the branch name...
ydshieh left a comment:
I have a few nit questions in the first round 🙏
```python
class TestFSDPTrainer(TestCasePlus):
    @require_torch_multi_accelerator
    @require_accelerate
    @require_fsdp
    # ... (decorated test method follows in the file)
```
ydshieh: Could you explain this part?
cyy: require_fsdp = require_torch after this PR.
ydshieh: I might be missing some details, but I can't find the reason why this is the case.
ydshieh: (Forgot to unresolve, so you didn't see it.) This is my last question and then we are good!
cyy: Because FSDP_MIN_VERSION = "1.12.0", and FSDP has been supported since long before PyTorch 2.0, is_fsdp_available should always return True once PyTorch is available. You can check its definition in src/transformers/utils/import_utils.py. I would like to remove is_fsdp_available entirely, but I can't since it is an exported symbol.
ydshieh: Understood, thank you. I was expecting a change to is_fsdp_available like the ones you made elsewhere, which is why I was a bit confused here. What do you think about changing it to:
```python
def is_fsdp_available(min_version: str = FSDP_MIN_VERSION):
    return is_torch_available()
```
Either way, I will merge, just wondering.
cyy: @ydshieh It's better to respect min_version in case someone passes "2.6" ...
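For context, a self-contained sketch of the trade-off being discussed (the real helper lives in src/transformers/utils/import_utils.py; the body here is illustrative):

```python
from packaging import version

FSDP_MIN_VERSION = "1.12.0"

def is_torch_available() -> bool:
    try:
        import torch  # noqa: F401
        return True
    except ImportError:
        return False

def is_fsdp_available(min_version: str = FSDP_MIN_VERSION) -> bool:
    if not is_torch_available():
        return False
    import torch

    # With transformers now requiring PyTorch >= 2.1, the default
    # comparison against 1.12.0 can never fail, so this effectively
    # reduces to is_torch_available() -- but keeping the comparison
    # still honors callers that pass a higher floor such as "2.6".
    return version.parse(torch.__version__) >= version.parse(min_version)
```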
Almost ready 🚀 to go 👍
@ydshieh done
Merge! Thank you for the contribution and iteration 🤗
Remove old code for PyTorch, Accelerator and tokenizers (#37234)

* Remove unneeded library version checks
* Remove PyTorch condition
* Remove PyTorch condition
* Fix ROCm get_device_capability
* Revert "Fix ROCm get_device_capability" (reverts commit 0e75643)
* Remove unnecessary check
* Revert changes

Signed-off-by: cyy <[email protected]>
What does this PR do?
Remove outdated conditions and comments for PyTorch and Accelerator.
Specifically, some code for PyTorch < 2.1 has been found and removed. As a result, the functions is_torch_bf16_cpu_available, is_torch_fx_available, is_torchdynamo_available and is_torch_compile_available are now equivalent to is_torch_available, and some tests are simplified using this fact. There is also one change to old Accelerator code, and an old tokenizers version check is removed.
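As a quick illustration of the stated equivalence (a hypothetical sketch; unittest.skipUnless stands in for transformers' require_* decorators):

```python
import unittest

def is_torch_available() -> bool:
    try:
        import torch  # noqa: F401
        return True
    except ImportError:
        return False

# After this PR the version-gated helpers named above all reduce to the
# plain import check, so for this sketch they can simply be aliased:
is_torch_compile_available = is_torch_available
is_torchdynamo_available = is_torch_available

class TorchGateTest(unittest.TestCase):
    @unittest.skipUnless(is_torch_compile_available(), "requires torch")
    def test_gates_agree(self):
        # A test previously gated on any of the four helpers now only
        # needs the plain torch requirement.
        self.assertTrue(is_torch_available())

if __name__ == "__main__":
    unittest.main()
```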