
Weights for loading LoRAs #3682


Closed
pdoane opened this issue Jun 5, 2023 · 2 comments

Comments


pdoane (Contributor) commented Jun 5, 2023

Is your feature request related to a problem? Please describe.
When a LoRA is loaded and applied to a pipeline, the user typically supplies a weight that scales the LoRA's weights. See, e.g., Compel's LoraWeight class.

Describe the solution you'd like

Add a parameter to load_lora_weights that scales the weights in the file.
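For illustration, a rough sketch of what this could look like (the scale argument on load_lora_weights is hypothetical and does not exist today; it is only meant to show the requested behavior):

import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Hypothetical: scale the LoRA's weights by 0.5 as they are loaded,
# instead of passing a scale at every inference call
pipe.load_lora_weights("path/to/lora", scale=0.5)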

Describe alternatives you've considered
I initially assumed the network_alpha parameter added in #3437 was for this purpose, but it appears it must match the value stored in the file or an error is raised. I'm not following how it is supposed to be used.

Maybe repurpose network_alpha for this?


freespirit (Contributor) commented

It sounds like you need the scale parameter of the LoRAAttnProcessor.__call__ method. You can provide it at inference time; according to the docs:

A scale value of 0 is the same as not using your LoRA weights and you're only using the base model weights, and a scale value of 1 means you're only using the fully finetuned LoRA weights. Values between 0 and 1 interpolate between the two weights.

Example (from the same place in the documentation):

# `pipe` is a pipeline with LoRA weights already loaded, e.g. via
# pipe.load_lora_weights(...); scale=0.5 blends base and LoRA weights equally
image = pipe(
    "A pokemon with blue eyes.", num_inference_steps=25, guidance_scale=7.5, cross_attention_kwargs={"scale": 0.5}
).images[0]
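One nice consequence of this design: because the scale is passed per call through cross_attention_kwargs, you can vary it between generations without reloading the LoRA weights.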


pdoane (Contributor, Author) commented Jun 7, 2023

Sounds good. That works for a single LoRA model, which I think is all that is supported at this point.

pdoane closed this as completed Jun 7, 2023