Fix Flux multiple Lora loading bug by maxs-kan · Pull Request #10388 · huggingface/diffusers
The case where the first LoRA has extra weights compared to the second works on main:
Hyper-FLUX.1-dev-8steps-lora.safetensors followed by Purz/choose-your-own-adventure
or
alimama-creative/FLUX.1-Turbo-Alpha followed by TTPlanet/Migration_Lora_flux
In that order, `base_param_name` is set to `f"{k.replace(prefix, '')}.base_layer.weight"` for the 2nd LoRA, and all of those keys exist in the transformer state dict.
If the LoRAs are loaded in the reverse order, `f"{k.replace(prefix, '')}.base_layer.weight"` doesn't exist for the extra weights:
Purz/choose-your-own-adventure followed by Hyper-FLUX.1-dev-8steps-lora.safetensors
or
TTPlanet/Migration_Lora_flux followed by alimama-creative/FLUX.1-Turbo-Alpha
This raises `KeyError: 'context_embedder.base_layer.weight'`.
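For reference, a minimal reproduction sketch of the failing order (the `ByteDance/Hyper-SD` repo id for the Hyper-FLUX file and the adapter names are assumptions; a CUDA device with enough memory for FLUX.1-dev is required):

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")

# The first LoRA does not target context_embedder ...
pipe.load_lora_weights("Purz/choose-your-own-adventure", adapter_name="purz")

# ... so loading the second LoRA, which does, raises
# KeyError: 'context_embedder.base_layer.weight' on main.
pipe.load_lora_weights(
    "ByteDance/Hyper-SD",  # assumed source repo for the Hyper-FLUX file
    weight_name="Hyper-FLUX.1-dev-8steps-lora.safetensors",
    adapter_name="hyper",
)
```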
So for the extra weights we use `f"{k.replace(prefix, '')}.weight"`. If another LoRA targeting `context_embedder` were loaded afterwards, it would then use `context_embedder.base_layer.weight`.
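A minimal sketch of that fallback (the helper name and its arguments are hypothetical; `k`, `prefix`, `transformer_state_dict`, and `is_peft_loaded` stand in for the values used in the Flux LoRA loading path):

```python
def pick_base_param_name(k: str, prefix: str, transformer_state_dict: dict, is_peft_loaded: bool) -> str:
    """Hypothetical helper illustrating the fallback; not the actual diffusers code."""
    peft_key = f"{k.replace(prefix, '')}.base_layer.weight"
    plain_key = f"{k.replace(prefix, '')}.weight"
    # Prefer the PEFT-wrapped key; fall back to the plain module weight for
    # modules that no previously loaded LoRA has wrapped (e.g. context_embedder).
    if is_peft_loaded and peft_key in transformer_state_dict:
        return peft_key
    return plain_key
```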
We could instead `continue` (skip the module) when `f"{k.replace(prefix, '')}.base_layer.weight"` is not found, but the extra weights may still need to be expanded.