
Conversation

@zucchini-nlp
Member

What does this PR do?

Actual fix for #36230. The very first PR that added the flag wasn't what we wanted, since the root cause of the Pixtral config reloading bug is not in nested configs. The cause is how the Mistral config is written, with a None head_dim that is later inferred from hidden_size. The bug from #36048 can be reproduced by reloading only the Mistral config.
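For illustration, a minimal reproduction sketch of that round trip (the tmp_cfg directory and the final print are placeholders for this example, not taken from the linked issue):

```python
# Hypothetical reproduction sketch: when head_dim is resolved eagerly from
# hidden_size inside the config's __init__, a save/load round trip writes
# the inferred value into the JSON as if the user had set it explicitly.
from transformers import MistralConfig

config = MistralConfig()             # head_dim left at its default
config.save_pretrained("tmp_cfg")    # serialized JSON carries the resolved head_dim
reloaded = MistralConfig.from_pretrained("tmp_cfg")
print(config.head_dim, reloaded.head_dim)
```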

TL;DR: We can't always infer head_dim from hidden_size in the config, because the modeling code makes no assumption that hidden_size = head_dim * num_heads. So the solution is to set head_dim only when the config actually has a value; otherwise it gets inferred in the attention module at modeling time.
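A minimal sketch of that pattern, using illustrative class names and default values rather than the actual transformers source:

```python
# Illustrative sketch, not the actual transformers code. The config keeps
# head_dim as None unless the user set it; the attention layer infers a
# fallback from hidden_size only at modeling time.
class SketchConfig:
    def __init__(self, hidden_size=4096, num_attention_heads=32, head_dim=None):
        self.hidden_size = hidden_size
        self.num_attention_heads = num_attention_heads
        # Do NOT eagerly resolve head_dim here: the inferred value would be
        # serialized and then round-trip as if it were user-provided.
        self.head_dim = head_dim

class SketchAttention:
    def __init__(self, config):
        # Fall back to hidden_size // num_attention_heads only when the
        # config has no explicit value; an explicit head_dim may differ.
        self.head_dim = (
            config.head_dim
            if config.head_dim is not None
            else config.hidden_size // config.num_attention_heads
        )

cfg = SketchConfig()
attn = SketchAttention(cfg)   # head_dim inferred as 128 here
```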

This PR also renames the flag, whose current name is very misleading. The flag was first added for models like RAG, whose config can't be initialized without args. It is therefore used only once, when trying to get the default values of the config class (if any). I believe the renaming shouldn't break BC.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@zucchini-nlp
Member Author

cc @ydshieh this should supersede the quick fix we did a few days ago :)

@ydshieh
Collaborator

ydshieh commented Feb 19, 2025

cc @ydshieh this should supersede the quick fix we did a few days ago :)

(For the record, the quick fix here means #36230.)

@ArthurZucker
Collaborator

Makes sense! Not sure if I see anything breaking, but it is cleaning up after our sub-config updates!

@zucchini-nlp merged commit 6f4058a into huggingface:main on Apr 9, 2025
20 checks passed
cyr0930 pushed a commit to cyr0930/transformers that referenced this pull request Apr 18, 2025
* update composition flag usage

* remove print

* fix tests

* actually fix

* oh c'mon

* now should be fixed right?

* fix copies
zucchini-nlp added a commit to zucchini-nlp/transformers that referenced this pull request May 14, 2025