Fix torch_dtype in Kolors text encoder with transformers v4.49 #10816

DN6 merged 3 commits into huggingface:main

Conversation
I think

Yeah, I haven't found the exact change but it was working recently on 4.48.3

Let's maybe also update the

Failing test is now just a temporary Hub issue. To re-summarize the changes we have

Marked as draft again, waiting for huggingface/transformers#36262

@hlky I think this can be reopened and merged. Change is safe to make. |
What does this PR do?
Tests for `Kolors` are failing. Tracked the issue to the `transformers` version update. The test model's config contains `torch_dtype` as a string, and in turn `dtype` is passed as a `str` to `torch.empty`; it seems `torch_dtype` was previously converted to a `torch.dtype` or ignored.

Generally `torch_dtype` would be passed to Kolors' pipelines' `from_pretrained`, or to `ChatGLMModel` if creating it separately, so this should be ok for end users.

Edit: some tests still failing
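
For context, a minimal sketch of the kind of normalization involved (the helper name below is illustrative, not the actual diffusers code): the config may carry `torch_dtype` as a string such as `"float16"`, while `torch.empty` expects a `torch.dtype`, so the string has to be resolved to the corresponding `torch` attribute before use.

```python
import torch


def resolve_torch_dtype(torch_dtype):
    # Hypothetical helper: a config may store torch_dtype as a string
    # (e.g. "float16"), but torch.empty requires an actual torch.dtype.
    if isinstance(torch_dtype, str):
        return getattr(torch, torch_dtype)
    return torch_dtype


print(resolve_torch_dtype("float16"))                                   # torch.float16
print(torch.empty(2, 2, dtype=resolve_torch_dtype("float16")).dtype)   # torch.float16
```

End users are typically unaffected because they already supply a real dtype themselves, e.g. `KolorsPipeline.from_pretrained(..., torch_dtype=torch.float16)`.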
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.