Commit b712042

Authored by songkey, github-actions[bot], and sayakpaul
[Flux2] Fix LoRA loading for Flux2 Klein by adaptively enumerating transformer blocks (#13030)
* Resolve Flux2 Klein 4B/9B LoRA loading errors
* Apply style fixes

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
1 parent 0b76728 commit b712042

File tree: 1 file changed, +8 -2 lines

src/diffusers/loaders/lora_conversion_utils.py

Lines changed: 8 additions & 2 deletions
@@ -2321,8 +2321,14 @@ def _convert_non_diffusers_flux2_lora_to_diffusers(state_dict):
     prefix = "diffusion_model."
     original_state_dict = {k[len(prefix) :]: v for k, v in state_dict.items()}
 
-    num_double_layers = 8
-    num_single_layers = 48
+    num_double_layers = 0
+    num_single_layers = 0
+    for key in original_state_dict.keys():
+        if key.startswith("single_blocks."):
+            num_single_layers = max(num_single_layers, int(key.split(".")[1]) + 1)
+        elif key.startswith("double_blocks."):
+            num_double_layers = max(num_double_layers, int(key.split(".")[1]) + 1)
+
 
     lora_keys = ("lora_A", "lora_B")
     attn_types = ("img_attn", "txt_attn")
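For illustration, below is a minimal, self-contained sketch of the adaptive block enumeration this commit introduces, run against a hypothetical Klein-sized LoRA state dict. Only the "double_blocks.N." / "single_blocks.N." prefixes matter for the counting; the key suffixes (img_attn.qkv, linear1) and tensor shapes are placeholders, not the exact checkpoint layout.

import torch

# Hypothetical non-diffusers Flux2 Klein LoRA checkpoint (shapes are placeholders).
state_dict = {
    "diffusion_model.double_blocks.0.img_attn.qkv.lora_A.weight": torch.zeros(16, 3072),
    "diffusion_model.double_blocks.4.img_attn.qkv.lora_B.weight": torch.zeros(9216, 16),
    "diffusion_model.single_blocks.23.linear1.lora_A.weight": torch.zeros(16, 3072),
}

prefix = "diffusion_model."
original_state_dict = {k[len(prefix) :]: v for k, v in state_dict.items()}

# Instead of hardcoding 8 double / 48 single blocks, infer the counts from the
# highest block index actually present in the checkpoint, so smaller Klein
# (4B/9B) checkpoints convert without key mismatches.
num_double_layers = 0
num_single_layers = 0
for key in original_state_dict.keys():
    if key.startswith("single_blocks."):
        num_single_layers = max(num_single_layers, int(key.split(".")[1]) + 1)
    elif key.startswith("double_blocks."):
        num_double_layers = max(num_double_layers, int(key.split(".")[1]) + 1)

print(num_double_layers, num_single_layers)  # -> 5 24

With the previous hardcoded 8/48 values, a Klein checkpoint containing fewer blocks would presumably lead the conversion to reference block indices that do not exist in the LoRA, which is the loading error this commit resolves.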
