Fix Flux multiple Lora loading bug by maxs-kan · Pull Request #10388 · huggingface/diffusers

The case where the first LoRA has extra weights compared to the second works fine on main (repro sketch after the lists), e.g.

  1. Hyper-FLUX.1-dev-8steps-lora.safetensors
  2. Purz/choose-your-own-adventure

or

  1. alimama-creative/FLUX.1-Turbo-Alpha
  2. TTPlanet/Migration_Lora_flux
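A minimal repro of the working order, assuming each repo resolves to a single LoRA file (pass `weight_name=` otherwise); adapter names here are illustrative:

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")

# 1st LoRA carries the extra modules (e.g. context_embedder), 2nd one does not.
pipe.load_lora_weights("alimama-creative/FLUX.1-Turbo-Alpha", adapter_name="turbo")
pipe.load_lora_weights("TTPlanet/Migration_Lora_flux", adapter_name="migration")
pipe.set_adapters(["turbo", "migration"], adapter_weights=[1.0, 1.0])
```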

In this case `base_param_name` is set to `f"{k.replace(prefix, '')}.base_layer.weight"` for the 2nd LoRA and all keys exist in the transformer state dict.
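Roughly, the key lookup works like the sketch below (a simplification of the Flux LoRA expansion helper; `is_peft_loaded` and `transformer_state_dict` are assumed names for "a LoRA is already loaded" and `transformer.state_dict()`):

```python
# Simplified sketch of the base-weight key lookup.
for k in lora_module_names:
    base_param_name = (
        f"{k.replace(prefix, '')}.base_layer.weight"
        if is_peft_loaded  # a previous LoRA already wrapped modules in PEFT base_layer
        else f"{k.replace(prefix, '')}.weight"
    )
    # Works only if module `k` was also targeted by the first LoRA,
    # because only those modules got a `.base_layer` wrapper.
    base_weight_param = transformer_state_dict[base_param_name]
```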

If loaded in the reverse order, `f"{k.replace(prefix, '')}.base_layer.weight"` doesn't exist for the extra weights, e.g.

  1. Purz/choose-your-own-adventure
  2. Hyper-FLUX.1-dev-8steps-lora.safetensors

or

  1. TTPlanet/Migration_Lora_flux
  2. alimama-creative/FLUX.1-Turbo-Alpha

`KeyError: 'context_embedder.base_layer.weight'`
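Swapping the adapters in the repro above reproduces it:

```python
# Reverse order: the 2nd LoRA targets context_embedder, which the 1st did not,
# so that module has no `.base_layer` wrapper and the lookup fails.
pipe.load_lora_weights("TTPlanet/Migration_Lora_flux", adapter_name="migration")
pipe.load_lora_weights("alimama-creative/FLUX.1-Turbo-Alpha", adapter_name="turbo")
# raises KeyError: 'context_embedder.base_layer.weight'
```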

So for the extra weights we fall back to `f"{k.replace(prefix, '')}.weight"`. If another LoRA targeting `context_embedder` were loaded afterwards, it would then use `context_embedder.base_layer.weight` as usual.
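A sketch of the idea behind the change (not the exact diff):

```python
# Fall back to the un-wrapped parameter name for modules that the
# already-loaded LoRA(s) did not touch.
base_layer_name = f"{k.replace(prefix, '')}.base_layer.weight"
if is_peft_loaded and base_layer_name in transformer_state_dict:
    base_param_name = base_layer_name
else:
    base_param_name = f"{k.replace(prefix, '')}.weight"
base_weight_param = transformer_state_dict[base_param_name]
```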

We could also just skip the module when `f"{k.replace(prefix, '')}.base_layer.weight"` is not found, but the extra weights may still need to be expanded.