Conversation

@cyyever
Contributor

@cyyever cyyever commented Apr 3, 2025

What does this PR do?

Remove outdated conditions and comments for PyTorch and Accelerator.

Specifically, code paths for PyTorch < 2.1 have been found and removed. As a result, the functions is_torch_bf16_cpu_available, is_torch_fx_available, is_torchdynamo_available and is_torch_compile_available are now equivalent to is_torch_available, and some tests are simplified further using this fact.
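
For illustration, the resulting equivalence looks roughly like the sketch below; this is a paraphrase of the idea, not the literal code in src/transformers/utils/import_utils.py.

# Sketch only: with the new minimum PyTorch version, the old version gates are
# always satisfied, so these helpers collapse to a plain availability check.
from transformers.utils import is_torch_available

def is_torch_fx_available() -> bool:
    # torch.fx ships with every supported PyTorch release, so no extra gate is needed
    return is_torch_available()

def is_torch_compile_available() -> bool:
    # likewise, torch.compile exists in every supported release
    return is_torch_available()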

There is also one change regarding old Accelerator code, and an old tokenizers version check is removed.

@github-actions github-actions bot marked this pull request as draft April 3, 2025 03:57
@github-actions
Contributor

github-actions bot commented Apr 3, 2025

Hi 👋, thank you for opening this pull request! The pull request is converted to draft by default. The CI will be paused while the PR is in draft mode. When it is ready for review, please click the Ready for review button (at the bottom of the PR page). This will assign reviewers and trigger CI.

@cyyever cyyever force-pushed the is_torch_fx_available branch from 142363d to 93b4c9a Compare April 3, 2025 04:05
@cyyever cyyever marked this pull request as ready for review April 3, 2025 04:09
Member

@Rocketknight1 Rocketknight1 left a comment

LGTM, with one comment: are we raising an error anywhere when users run old versions of Torch? I think it's important that they get an error telling them what the problem is; otherwise we'll get lots of issues complaining about failing compilations and missing functions from people who don't understand what went wrong.
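
(For context, the kind of guard being suggested here could look roughly like the sketch below. The constant name and the message are illustrative only; this is not the code that ended up in the PR.)

import warnings

import torch
from packaging import version

MIN_SUPPORTED_TORCH = "2.1"  # hypothetical floor for this sketch

# Warn loudly if the installed torch predates the supported minimum, so users
# see the real cause instead of opaque missing-function errors later on.
if version.parse(torch.__version__) < version.parse(MIN_SUPPORTED_TORCH):
    warnings.warn(
        f"transformers no longer supports torch {torch.__version__}; "
        f"please upgrade to torch >= {MIN_SUPPORTED_TORCH}."
    )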

@cyyever cyyever force-pushed the is_torch_fx_available branch from 93b4c9a to 7f51766 Compare April 4, 2025 01:36
@cyyever
Contributor Author

cyyever commented Apr 4, 2025

Added a warning message about unsupported PT versions.

@Rocketknight1
Member

Hi @cyyever, sorry about this, my fault! We actually have a separate PR to bump the minimum torch version to 2.1 here: #37277

As a result, you can skip the check - sorry for suggesting it!

@cyyever
Contributor Author

cyyever commented Apr 4, 2025

@Rocketknight1 I will rebase after it's merged to see whether there is more to delete.

@cyyever cyyever force-pushed the is_torch_fx_available branch 2 times, most recently from ff855e6 to cdf9717 Compare April 6, 2025 04:05
@cyyever cyyever changed the title Remove unneeded PyTorch version checks Remove unneeded version checks Apr 6, 2025
@cyyever cyyever force-pushed the is_torch_fx_available branch 15 times, most recently from 66eeb32 to 200a951 Compare April 8, 2025 16:10
@cyyever
Contributor Author

cyyever commented Apr 8, 2025

@Rocketknight1 Lots of new commits since last review.

@cyyever
Contributor Author

cyyever commented Apr 8, 2025

@ydshieh Follow-up to your clean-up.

@cyyever cyyever force-pushed the is_torch_fx_available branch from 200a951 to bf8b7bf Compare April 8, 2025 16:16
@ydshieh
Collaborator

ydshieh commented Apr 8, 2025

Thanks! Will check. Could you refine the PR title and description 🙏? It seems this PR focuses on is_torch_fx_available? More detailed information about what the PR is doing is much appreciated.

@cyyever
Contributor Author

cyyever commented Apr 8, 2025

@ydshieh Ignore the branch name...

@cyyever cyyever changed the title Remove unneeded version checks Remove code for old PyTorch and Accelerator Apr 8, 2025
@cyyever cyyever changed the title Remove code for old PyTorch and Accelerator Remove old code for PyTorch and Accelerator Apr 9, 2025
Collaborator

@ydshieh ydshieh left a comment

I have a few nit questions in the first round 🙏

@cyyever cyyever changed the title Remove old code for PyTorch and Accelerator Remove old code for PyTorch, Accelerator and tokenizers Apr 9, 2025
class TestFSDPTrainer(TestCasePlus):
@require_torch_multi_accelerator
@require_accelerate
@require_fsdp
Collaborator

Could you explain this part?

Contributor Author

require_fsdp = require_torch after this PR.

Collaborator

I might be missing some details, but I can't find the reason why this is the case.

Collaborator

(I forgot to unresolve this, so you didn't see it.)
This is my last question and then we are good!

Contributor Author

@cyyever cyyever Apr 10, 2025

Because FSDP_MIN_VERSION = "1.12.0", and FSDP has been supported since well before PyTorch 2.0, is_fsdp_available should always return True once PyTorch is available. You can check its definition in src/transformers/utils/import_utils.py. I would like to remove is_fsdp_available, but I can't since it is an exported symbol.
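
(To spell out why require_fsdp reduces to require_torch here: with the higher PyTorch floor, the FSDP availability gate can no longer fail, so the two test decorators skip in exactly the same cases. The sketch below paraphrases the decorators from transformers.testing_utils under that assumption; it is not a verbatim copy.)

import unittest

from transformers.utils.import_utils import is_fsdp_available, is_torch_available

def require_torch(test_case):
    # skip the test unless PyTorch is installed
    return unittest.skipUnless(is_torch_available(), "test requires PyTorch")(test_case)

def require_fsdp(test_case, min_version: str = "1.12.0"):
    # once torch >= 2.1 is required, any installed torch satisfies the 1.12.0
    # floor, so this skips exactly when require_torch would skip
    return unittest.skipUnless(
        is_fsdp_available(min_version), f"test requires torch version >= {min_version}"
    )(test_case)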

Collaborator

Understood, thank you. I was expecting a change to is_fsdp_available, just as you did in other places, which is why I was a bit confused here. What do you think about changing it to

def is_fsdp_available(min_version: str = FSDP_MIN_VERSION):
    return is_torch_available()

Either way, I will merge, just wondering.

Contributor Author

@ydshieh It's better to respect min_version in case someone passes "2.6" ...
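
(For reference, a definition that keeps min_version meaningful would look roughly like the sketch below; it paraphrases the existing helper rather than quoting it.)

from packaging import version

from transformers.utils.import_utils import is_torch_available

FSDP_MIN_VERSION = "1.12.0"

def is_fsdp_available(min_version: str = FSDP_MIN_VERSION) -> bool:
    # Without torch there is no FSDP; otherwise honour whatever floor the caller
    # passes, e.g. is_fsdp_available("2.6") for features that need a newer torch.
    if not is_torch_available():
        return False
    import torch

    return version.parse(torch.__version__) >= version.parse(min_version)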

@ydshieh
Collaborator

ydshieh commented Apr 10, 2025

Almost ready 🚀 to go 👍

@cyyever
Contributor Author

cyyever commented Apr 10, 2025

@ydshieh done

@ydshieh
Collaborator

ydshieh commented Apr 10, 2025

Merge! Thank you for the contribution and iteration 🤗

@ydshieh ydshieh merged commit 371c44d into huggingface:main Apr 10, 2025
18 checks passed
@cyyever cyyever deleted the is_torch_fx_available branch April 11, 2025 00:54
cyr0930 pushed a commit to cyr0930/transformers that referenced this pull request Apr 18, 2025
…37234)

* Remove unneeded library version checks

Signed-off-by: cyy <[email protected]>

* Remove PyTorch condition

Signed-off-by: cyy <[email protected]>

* Remove PyTorch condition

Signed-off-by: cyy <[email protected]>

* Fix ROCm get_device_capability

Signed-off-by: cyy <[email protected]>

* Revert "Fix ROCm get_device_capability"

This reverts commit 0e75643.

* Remove unnecessary check

Signed-off-by: cyy <[email protected]>

* Revert changes

Signed-off-by: cyy <[email protected]>

---------

Signed-off-by: cyy <[email protected]>
zucchini-nlp pushed a commit to zucchini-nlp/transformers that referenced this pull request May 14, 2025