Conversation

@molbap (Contributor) commented Mar 21, 2025

What does this PR do?

In the attention block, Gemma3 enforces dtype/device matching between the attention mask and the query states; this fails when no attention mask is provided. This PR fixes that.
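
For context, a minimal sketch of the guard this implies, assuming the `attention_mask` and `query_states` names from the Gemma3 attention code (the helper name below is hypothetical, not the actual patch):

```python
import torch

def match_mask_to_query(attention_mask, query_states):
    # Hypothetical helper: cast the mask to the query's device/dtype
    # only when a mask is actually provided, so that passing
    # attention_mask=None no longer fails on the cast.
    if attention_mask is not None:
        attention_mask = attention_mask.to(
            device=query_states.device, dtype=query_states.dtype
        )
    return attention_mask
```

With `attention_mask=None`, the guard turns the cast into a no-op instead of an error.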

github-actions bot marked this pull request as draft March 21, 2025 12:04
@github-actions (bot)

Hi 👋, thank you for opening this pull request! The pull request is converted to draft by default. When it is ready for review, please click the Ready for review button (at the bottom of the PR page).

@molbap marked this pull request as ready for review March 21, 2025 12:04
@ArthurZucker (Collaborator) left a comment

Thanks

@ArthurZucker merged commit 3f9ff19 into main Mar 21, 2025
10 checks passed
@ArthurZucker deleted the more_g3_fixes branch March 21, 2025 12:15
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

zucchini-nlp pushed a commit to zucchini-nlp/transformers that referenced this pull request May 14, 2025
fix attention mask dtype + outputs type