Closed
Labels: bug (Something isn't working)
Description
Describe the bug
I use `unet.load_attn_procs`, but it no longer works after updating the diffusers project.
- What is `load_lora_weights`?
- What is the difference between the new and old LoRA formats?
- How can I tell whether a checkpoint is in the new or old format?
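To illustrate the last question, one rough heuristic is to inspect the checkpoint's state-dict keys: the error above references an `...attn1.processor` key, which belongs to the older attention-processor layout. The sketch below uses plain dicts and key patterns that are assumptions for illustration only; it is not diffusers' actual detection logic.

```python
# Sketch: guess whether a LoRA state dict uses the old attention-processor
# key layout or a newer layout, based only on key substrings.
# The key patterns here are illustrative assumptions, not diffusers' API.

def guess_lora_format(keys):
    """Return 'old' if any key contains the '.processor.' segment
    (old attn-processor layout), otherwise 'new'."""
    if any(".processor." in k for k in keys):
        return "old"
    return "new"

# Hypothetical example keys for each layout:
old_keys = [
    "down_blocks.0.attentions.0.transformer_blocks.0"
    ".attn1.processor.to_q_lora.down.weight",
]
new_keys = [
    "unet.down_blocks.0.attentions.0.transformer_blocks.0"
    ".attn1.to_q.lora.down.weight",
]

print(guess_lora_format(old_keys))  # old
print(guess_lora_format(new_keys))  # new
```

In practice you would load the checkpoint (e.g. with `torch.load`) and pass `state_dict.keys()` to such a check.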
Reproduction
When running

```
pipe.load_lora_weights("./testlora/pytorch_lora_weights.bin")
```

or

```
pipe.unet.load_attn_procs("./testlora/pytorch_lora_weights.bin")
```

I get:

```
KeyError: 'down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor'
```

When running `pipe.load_attn_procs("./testlora/pytorch_lora_weights.bin")`, it executes without error but does not load any LoRA weights.
Logs
No response
System Info
- diffusers version: 0.16.0.dev0
- Platform: Linux-5.4.0-144-generic-x86_64-with-glibc2.27
- Python version: 3.10.11
- PyTorch version (GPU?): 2.0.0+cu117 (True)
- Huggingface_hub version: 0.13.4
- Transformers version: 4.28.1
- Accelerate version: 0.18.0
- xFormers version: 0.0.18
- Using GPU in script?:
- Using distributed or parallel set-up in script?: