Fix Flux multiple Lora loading bug by maxs-kan · Pull Request #10388 · huggingface/diffusers
The case where the first LoRA has extra weights compared to the second works fine on `main`:

- `Hyper-FLUX.1-dev-8steps-lora.safetensors` followed by `Purz/choose-your-own-adventure`
- or `alimama-creative/FLUX.1-Turbo-Alpha` followed by `TTPlanet/Migration_Lora_flux`

In this case `base_param_name` is set to `f"{k.replace(prefix, '')}.base_layer.weight"` for the 2nd LoRA and all the keys exist.
If they are loaded in the reverse order:

- `Purz/choose-your-own-adventure` followed by `Hyper-FLUX.1-dev-8steps-lora.safetensors`
- or `TTPlanet/Migration_Lora_flux` followed by `alimama-creative/FLUX.1-Turbo-Alpha`

then `f"{k.replace(prefix, '')}.base_layer.weight"` doesn't exist for the extra weights and loading fails with `KeyError: 'context_embedder.base_layer.weight'`.
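A minimal repro sketch of the failing order, using the standard `FluxPipeline` LoRA-loading API; the base model id, the `ByteDance/Hyper-SD` repo id for the Hyper-FLUX file, and the adapter names are my assumptions here:

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)

# First load the LoRA that does not carry the extra modules ...
pipe.load_lora_weights("Purz/choose-your-own-adventure", adapter_name="adventure")

# ... then the LoRA that does carry the extra weights. On main this order raises
# a KeyError such as 'context_embedder.base_layer.weight'; reversing the two
# calls works.
pipe.load_lora_weights(
    "ByteDance/Hyper-SD",  # assumed repo id for the Hyper-FLUX LoRA file
    weight_name="Hyper-FLUX.1-dev-8steps-lora.safetensors",
    adapter_name="hyper",
)
```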
So for the extra weights we use `f"{k.replace(prefix, '')}.weight"` instead. If another LoRA with `context_embedder` weights were loaded afterwards, it would then use `context_embedder.base_layer.weight` as before.
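A minimal sketch of that selection logic, with a toy state dict standing in for the transformer's; the helper and variable names here are illustrative, not the exact code in the loader:

```python
def select_base_param_name(k, prefix, transformer_state_dict):
    """Pick the key holding the base weight for LoRA-targeted module `k`.

    Modules already wrapped by a previously loaded LoRA expose their original
    weight under ".base_layer.weight"; modules only targeted by the new LoRA
    (the "extra" weights) are still plain layers and only have ".weight".
    """
    base_layer_name = f"{k.replace(prefix, '')}.base_layer.weight"
    if base_layer_name in transformer_state_dict:
        return base_layer_name
    return f"{k.replace(prefix, '')}.weight"


# Toy example: x_embedder was wrapped by the first LoRA, context_embedder was not.
toy_state_dict = {
    "x_embedder.base_layer.weight": None,
    "context_embedder.weight": None,
}
prefix = "transformer."
print(select_base_param_name("transformer.x_embedder", prefix, toy_state_dict))
# -> x_embedder.base_layer.weight
print(select_base_param_name("transformer.context_embedder", prefix, toy_state_dict))
# -> context_embedder.weight
```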
We could instead `continue` if `f"{k.replace(prefix, '')}.base_layer.weight"` is not found, but the extra weights may still need to be expanded.
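For comparison, that `continue`-based alternative would look roughly like the following (illustrative names again): it avoids the KeyError, but the extra weights would never reach the expansion step.

```python
# Alternative sketch (not the approach taken here): skip modules whose wrapped
# base weight is missing instead of falling back to ".weight".
prefix = "transformer."
transformer_state_dict = {"x_embedder.base_layer.weight": None}  # context_embedder missing
lora_module_keys = ["transformer.x_embedder", "transformer.context_embedder"]

for k in lora_module_keys:
    base_param_name = f"{k.replace(prefix, '')}.base_layer.weight"
    if base_param_name not in transformer_state_dict:
        continue  # context_embedder is skipped and never gets its shape checked/expanded
    # ... shape expansion for transformer_state_dict[base_param_name] would go here ...
```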