
Conversation

@zucchini-nlp (Member)
What does this PR do?

As per title. Only one test was flaky for VLMs: test_prompt_lookup_decoding. Unfortunately it can't be fixed by suppressing image/video tokens, because prompt lookup decoding doesn't apply any logits processor. This could be fixed with some additional effort, but it isn't worth it: VLMs don't usually use this type of assisted decoding, and we haven't seen any issues from the community.
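For context, a minimal sketch of why a logits processor can't help here: prompt lookup decoding copies its candidate tokens directly from an n-gram match in the prompt, so no logits processing step runs on them. The function below is illustrative only, not the transformers implementation (names and parameters are assumptions).

```python
def find_candidate_tokens(input_ids, ngram_size=2, num_candidates=3):
    """Return up to num_candidates tokens that followed an earlier
    occurrence of the sequence's trailing n-gram, as prompt lookup
    decoding does. Candidates are copied verbatim from the prompt,
    bypassing any logits processor (so suppressed image/video token
    ids could still appear among them)."""
    tail = input_ids[-ngram_size:]
    # scan earlier positions for the same n-gram, latest match first
    for start in range(len(input_ids) - ngram_size - 1, -1, -1):
        if input_ids[start:start + ngram_size] == tail:
            follow = input_ids[start + ngram_size:start + ngram_size + num_candidates]
            if follow:
                return follow
    return []  # no match: fall back to normal decoding

ids = [5, 7, 1, 2, 9, 4, 1, 2]
print(find_candidate_tokens(ids))  # earlier [1, 2] is followed by [9, 4, 1]
```

Because the candidates above never pass through the logits-processor pipeline, masking image/video tokens there would not stop them from being proposed.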

@ydshieh (Collaborator) left a comment


LGTM, thanks

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@zucchini-nlp zucchini-nlp merged commit e6cc410 into huggingface:main Feb 18, 2025
25 checks passed
@gante (Contributor)

gante commented Feb 18, 2025

Thank you for fixing @zucchini-nlp 🤗

zucchini-nlp added a commit to zucchini-nlp/transformers that referenced this pull request Feb 21, 2025
* fix

* nit

* no logits processor needed

* two more tests on assisted decoding

