Conversation

@ArdalanM (Contributor) commented Feb 14, 2025

What does this PR do?

Minor fix for Qwen2VL models where the sin and cos position embeddings were wrongly cast when used with DeepSpeed. Related issue: #36187
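
For context, a minimal sketch of the pattern being fixed, using the helper names from the Qwen2VL modeling code (`rotate_half` is the standard RoPE helper; the key change is the explicit `.float()` on `cos`/`sin`, which DeepSpeed may otherwise have downcast to half precision — a sketch, not the exact merged diff):

```python
import torch

def rotate_half(x):
    # Standard RoPE helper: rotate half of the hidden dims.
    x1 = x[..., : x.shape[-1] // 2]
    x2 = x[..., x.shape[-1] // 2 :]
    return torch.cat((-x2, x1), dim=-1)

def apply_rotary_pos_emb_vision(q, k, cos, sin):
    orig_q_dtype, orig_k_dtype = q.dtype, k.dtype
    q, k = q.float(), k.float()
    # The fix: upcast cos/sin too, so the rotation runs fully in fp32
    # even when DeepSpeed has cast the buffers to bf16/fp16.
    cos = cos.unsqueeze(-2).float()
    sin = sin.unsqueeze(-2).float()
    q_embed = (q * cos) + (rotate_half(q) * sin)
    k_embed = (k * cos) + (rotate_half(k) * sin)
    return q_embed.to(orig_q_dtype), k_embed.to(orig_k_dtype)
```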

cc @ArthurZucker 🤗

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@zucchini-nlp (Member)

Must have been fixed by #36065?

@ArthurZucker (Collaborator) left a comment

LGTM, but we can probably do this in `apply_rotary_pos_emb_flashatt` and `apply_rotary_pos_emb_vision` to cover both cases (with the `.float()` cast applied in both the if and the else branches)!
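
A sketch of what the same cast might look like in the flash-attention path; this assumes the `apply_rotary_emb` kernel from `flash_attn.layers.rotary` and mirrors how the vision helper upcasts its inputs (illustrative, not the merged code):

```python
from flash_attn.layers.rotary import apply_rotary_emb

def apply_rotary_pos_emb_flashatt(q, k, cos, sin):
    # Same idea as the eager path: run the rotation in fp32,
    # then cast the result back to the original dtype.
    q_embed = apply_rotary_emb(q.float(), cos.float(), sin.float()).type_as(q)
    k_embed = apply_rotary_emb(k.float(), cos.float(), sin.float()).type_as(k)
    return q_embed, k_embed
```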

@MrToy commented Feb 14, 2025

> Must have been fixed by #36065?

The issue was reintroduced by #35837.

@ArthurZucker (Collaborator)

Yep, let's merge this (we need to fix the modular file, not the modeling file! 🤗)

@zucchini-nlp (Member)

Hey, let's merge this and have it in the next patch release (dunno when that will be). It seems to be bothering more people as we wait.

@ArdalanM can you please add the same changes in the `modular_xxx.py` files and then run `make fix-copies` to make CI happy?

@zucchini-nlp (Member) left a comment

Thanks a lot! ❤️

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@li-plus (Contributor) commented Mar 13, 2025

@ArdalanM Thanks for the fix! But now we are casting the cos/sin embeddings to fp32 at every layer, which is inefficient. How about casting them only once, at the beginning? That is, adding a float32 cast here:

```python
position_embeddings = (emb.cos(), emb.sin())
```

and removing the cast from here:

```python
cos, sin = cos.unsqueeze(-2).float(), sin.unsqueeze(-2).float()
```
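
Put together, the proposal would look something like the following sketch (hypothetical context: the vision tower's rotary setup; names and shapes are illustrative, not the exact source):

```python
import torch

dim, seq_len = 32, 16
inv_freq = 1.0 / (10000 ** (torch.arange(0, dim, 2).float() / dim))
inv_freq = inv_freq.to(torch.bfloat16)  # simulate DeepSpeed downcasting buffers

freqs = torch.outer(torch.arange(seq_len, dtype=inv_freq.dtype), inv_freq)
emb = torch.cat((freqs, freqs), dim=-1)

# Proposed: cast to fp32 once, where the embeddings are built ...
position_embeddings = (emb.cos().float(), emb.sin().float())

# ... so the per-layer helper only needs the reshape, not the cast:
cos, sin = position_embeddings
cos, sin = cos.unsqueeze(-2), sin.unsqueeze(-2)
assert cos.dtype == torch.float32  # every layer now sees fp32 cos/sin
```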
