
Conversation

@chenin-wang (Contributor) commented Apr 20, 2025

What does this PR do?

Set a default value for the output_attentions parameter in Gemma2 and Gemma3.
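As context for reviewers, this is the usual Hugging Face pattern the change applies: an optional `output_attentions` argument that falls back to the model config when the caller passes nothing. The sketch below uses hypothetical dummy classes, not the actual Gemma modeling code.

```python
from typing import Optional, Tuple

class DummyConfig:
    """Stand-in for a model config; Gemma configs expose a similar flag."""
    output_attentions: bool = False

class DummyModel:
    def __init__(self, config: DummyConfig):
        self.config = config

    def forward(
        self,
        hidden_states: str,
        output_attentions: Optional[bool] = None,
    ) -> Tuple[str, Optional[str]]:
        # Fall back to the config default when the caller passes None,
        # instead of requiring the argument to always be supplied.
        output_attentions = (
            output_attentions
            if output_attentions is not None
            else self.config.output_attentions
        )
        attn_weights = "weights" if output_attentions else None
        return hidden_states, attn_weights

model = DummyModel(DummyConfig())
out, attn = model.forward("h")            # attn is None (config default)
out2, attn2 = model.forward("h", output_attentions=True)  # attn2 == "weights"
```

With this pattern, omitting the argument behaves the same as the model's configured default, while an explicit `True`/`False` still overrides it per call.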

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@ArthurZucker @zucchini-nlp

@github-actions github-actions bot marked this pull request as draft April 20, 2025 05:35
@github-actions (bot) commented:

Hi 👋, thank you for opening this pull request! The pull request is converted to draft by default. The CI will be paused while the PR is in draft mode. When it is ready for review, please click the Ready for review button (at the bottom of the PR page). This will assign reviewers and trigger CI.

@chenin-wang (Contributor, Author) commented:

#37609

@chenin-wang chenin-wang changed the title Set default value for output_attentions parameter in Gemma2 and Gemma… [fix gemma] Set default value for output_attentions parameter in Gemma2 and Gemma… Apr 20, 2025
@chenin-wang (Contributor, Author) commented:

Why does it work locally without errors, but always fails in CircleCI? @zucchini-nlp

[screenshot: failing CircleCI check]

@chenin-wang chenin-wang marked this pull request as ready for review April 20, 2025 07:31
@zucchini-nlp (Member) commented:

@chenin-wang hm, there are changes in unrelated files, which is causing the red CI: when such a file changes, we also need to update all the files it was copied to. Usually this is fixed by running `make fix-copies`.

I'm not sure why those files are being modified; probably the ruff version. Can you re-install transformers from main?

@chenin-wang (Contributor, Author) commented:

@zucchini-nlp Hi, I've found the problem. Can you help me review it?


@zucchini-nlp zucchini-nlp left a comment


Perfect, thanks!

@zucchini-nlp zucchini-nlp merged commit 006530d into huggingface:main Apr 22, 2025
12 checks passed
zucchini-nlp pushed a commit to zucchini-nlp/transformers that referenced this pull request May 14, 2025
…a2 and Gemma… (huggingface#37633)

* Set default value for output_attentions parameter in Gemma2 and Gemma3 models

* update

* fix

* fix

---------

Co-authored-by: chenin <[email protected]>


3 participants