Fix FSDP2 tied embedding errors with targeted ValueError guidance#3878

Merged
SunMarc merged 2 commits into huggingface:main from amanzoni1:fix/fsdp2-tie-word-embeddings
Dec 11, 2025
Conversation

@amanzoni1 (Contributor)

What does this PR do?

This PR replaces the raw KeyError in the _prepare_fsdp2 param mapping with a defensive loop that collects all missing parameters and raises a targeted ValueError with actionable guidance for the common tied-embeddings mismatch (e.g., the config has tie_word_embeddings=True but the checkpoint stores separate weights, as in some Qwen models).

Fixes #3870
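As context for reviewers, the defensive-loop pattern described above might look like the minimal sketch below. The function name `map_fsdp2_params` and its signature are illustrative assumptions for this comment, not accelerate's actual `_prepare_fsdp2` internals:

```python
def map_fsdp2_params(source_params: dict, target_names: list[str]) -> dict:
    """Map target parameter names to source tensors, collecting all
    missing names instead of failing on the first KeyError."""
    mapped = {}
    missing = []
    for name in target_names:
        if name in source_params:
            mapped[name] = source_params[name]
        else:
            # Collect instead of raising KeyError immediately, so the
            # error message can list every missing parameter at once.
            missing.append(name)
    if missing:
        raise ValueError(
            f"Missing parameters during FSDP2 preparation: {missing}. "
            "If your config sets `tie_word_embeddings=True` but the "
            "checkpoint stores separate embedding and lm_head weights "
            "(as in some Qwen models), make sure the tying setting "
            "matches the checkpoint."
        )
    return mapped
```

With this pattern, a checkpoint missing `lm_head.weight` produces one actionable ValueError naming every absent parameter rather than a bare `KeyError: 'lm_head.weight'`.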

Who can review?

@SunMarc

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@SunMarc (Member) left a comment

Thanks! Can you add a test for that?

@amanzoni1 (Contributor, Author)

Thanks! Added the test

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@SunMarc SunMarc merged commit fa6e13d into huggingface:main Dec 11, 2025
25 checks passed


Development

Successfully merging this pull request may close these issues.

FSDP2 fails due to KeyError: 'lm_head.weight'

3 participants