StableDiffusionPipeline.from_ckpt is not working on dev version #3450

Closed
takuma104 opened this issue May 16, 2023 · 4 comments
Labels
bug Something isn't working

Comments

@takuma104
Contributor

Describe the bug

It seems that from_ckpt is not working properly on the latest main, while it works fine in release v0.16.1. I tried to follow some of the recent changes, but couldn't track them all.

diff: 9b14ce3...main

Reproduction

from diffusers import StableDiffusionPipeline
pipeline = StableDiffusionPipeline.from_ckpt(
    'https://huggingface.co/gsdf/Counterfeit-V3.0/blob/main/Counterfeit-V3.0_fp16.safetensors', 
)

Logs

RuntimeError: Error(s) in loading state_dict for AutoencoderKL:
	Missing key(s) in state_dict: "encoder.mid_block.attentions.0.to_q.weight", "encoder.mid_block.attentions.0.to_q.bias", "encoder.mid_block.attentions.0.to_k.weight", "encoder.mid_block.attentions.0.to_k.bias", "encoder.mid_block.attentions.0.to_v.weight", "encoder.mid_block.attentions.0.to_v.bias", "encoder.mid_block.attentions.0.to_out.0.weight", "encoder.mid_block.attentions.0.to_out.0.bias", "decoder.mid_block.attentions.0.to_q.weight", "decoder.mid_block.attentions.0.to_q.bias", "decoder.mid_block.attentions.0.to_k.weight", "decoder.mid_block.attentions.0.to_k.bias", "decoder.mid_block.attentions.0.to_v.weight", "decoder.mid_block.attentions.0.to_v.bias", "decoder.mid_block.attentions.0.to_out.0.weight", "decoder.mid_block.attentions.0.to_out.0.bias". 
	Unexpected key(s) in state_dict: "encoder.mid_block.attentions.0.key.bias", "encoder.mid_block.attentions.0.key.weight", "encoder.mid_block.attentions.0.proj_attn.bias", "encoder.mid_block.attentions.0.proj_attn.weight", "encoder.mid_block.attentions.0.query.bias", "encoder.mid_block.attentions.0.query.weight", "encoder.mid_block.attentions.0.value.bias", "encoder.mid_block.attentions.0.value.weight", "decoder.mid_block.attentions.0.key.bias", "decoder.mid_block.attentions.0.key.weight", "decoder.mid_block.attentions.0.proj_attn.bias", "decoder.mid_block.attentions.0.proj_attn.weight", "decoder.mid_block.attentions.0.query.bias", "decoder.mid_block.attentions.0.query.weight", "decoder.mid_block.attentions.0.value.bias", "decoder.mid_block.attentions.0.value.weight". 


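The log above shows the shape of the failure: the new AutoencoderKL expects `to_q`/`to_k`/`to_v`/`to_out.0` attention keys, while the checkpoint still carries the legacy `query`/`key`/`value`/`proj_attn` names. As an illustration only (not the fix diffusers itself applies), a minimal sketch of the key renaming implied by the log could look like this; the function name and mapping are assumptions inferred from the error message:

```python
# Hypothetical sketch: rename legacy VAE attention state-dict keys
# ("query"/"key"/"value"/"proj_attn") to the names the current
# AutoencoderKL expects ("to_q"/"to_k"/"to_v"/"to_out.0").
# The mapping is inferred from the error log above.

LEGACY_TO_NEW = {
    "query": "to_q",
    "key": "to_k",
    "value": "to_v",
    "proj_attn": "to_out.0",
}

def rename_vae_attention_keys(state_dict):
    """Return a copy of state_dict with legacy attention key names replaced."""
    renamed = {}
    for key, value in state_dict.items():
        new_key = key
        for old, new in LEGACY_TO_NEW.items():
            suffix_w = f".{old}.weight"
            suffix_b = f".{old}.bias"
            if key.endswith(suffix_w):
                new_key = key[: -len(suffix_w)] + f".{new}.weight"
            elif key.endswith(suffix_b):
                new_key = key[: -len(suffix_b)] + f".{new}.bias"
        renamed[new_key] = value
    return renamed
```

Applied to a checkpoint's VAE weights before `load_state_dict`, this would map e.g. `encoder.mid_block.attentions.0.query.weight` to `encoder.mid_block.attentions.0.to_q.weight`, matching the "Missing key(s)" list in the log.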
System Info

- `diffusers` version: 0.17.0.dev0
- Platform: Linux-5.19.0-41-generic-x86_64-with-glibc2.35
- Python version: 3.10.9
- PyTorch version (GPU RTX3090): 2.0.0+cu117 (True)
- Huggingface_hub version: 0.13.2
- Transformers version: 4.25.1
- Accelerate version: 0.19.0.dev0
- xFormers version: 0.0.17+c36468d.d20230318
- Using GPU in script?: NO
- Using distributed or parallel set-up in script?: NO
@patrickvonplaten
Contributor

Thanks! I need to look into it this week :-)

@patrickvonplaten
Contributor

patrickvonplaten commented May 17, 2023

Fixed together with #3390

@PeterL1n
Contributor

@patrickvonplaten This bug seems to be present in the official 0.17.1 release as well, causing the VAE to fail to load.

@patrickvonplaten
Contributor

Hey @PeterL1n , the following works fine on dev:

from diffusers import StableDiffusionPipeline
pipeline = StableDiffusionPipeline.from_ckpt(
    'https://huggingface.co/gsdf/Counterfeit-V3.0/blob/main/Counterfeit-V3.0_fp16.safetensors', 
)

no? We'll do a new release in ~4 days.
