
Multiple lora with different weights #4300


Closed
shubhdotai opened this issue Jul 26, 2023 · 2 comments

Comments


shubhdotai commented Jul 26, 2023

How can I assign different weights to the different LoRAs added to the pipeline? As far as I understand, `cross_attention_kwargs` only takes a single `scale` argument, which is applied to all of the LoRAs in common:
cross_attention_kwargs={"scale": 0.6}
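To make the single-scale limitation concrete, here is a toy sketch of how one `scale` enters the forward pass. This is not the actual diffusers attention-processor code; `apply_loras` and the numbers are purely illustrative:

```python
def apply_loras(base_output, lora_deltas, scale):
    """Add every LoRA delta to the base output under one shared scale.

    Toy stand-in for what the LoRA attention processors compute:
    output = base_output + scale * (delta_1 + delta_2 + ...).
    """
    return base_output + scale * sum(lora_deltas)

# Both hypothetical LoRA contributions receive the same 0.6 weight;
# there is no way here to pass 0.6 for one LoRA and 0.8 for the other.
print(apply_loras(1.0, [0.2, 0.5], scale=0.6))  # ~= 1.42
```

Because the deltas are summed before the multiplication, a single `scale` necessarily weights every LoRA equally.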

@sayakpaul

@sayakpaul
Member

Currently, you cannot load multiple LoRA weights into a single pipeline.


gadicc commented Jul 31, 2023

See also #2613 ("Support for adding multiple LoRA layers to Diffusers").
