fix xlabs FLUX lora conversion typo by Clement-Lelievre · Pull Request #9581 · huggingface/diffusers


Conversation (9) · Commits (2) · Checks (15) · Files changed

Conversation


Clement-Lelievre

What does this PR do?

This PR fixes a typo in the XLabs FLUX LoRA conversion to diffusers.
In its current form, the conversion raises `TypeError: slice indices must be integers or None or have an __index__ method`: `str.startswith` accepts either a single string or a tuple of strings as its prefix argument, so when the parentheses that would form that tuple are missing, the second string is interpreted as the integer start index instead.
(Unless I've missed an issue on this topic, I'm a bit surprised this typo has survived for over a month without anyone complaining.)
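For illustration, here is a minimal sketch of the failure mode; the key below is hypothetical, and the real call lives in diffusers' lora_conversion_utils:

```python
# Hypothetical LoRA key, for illustration only.
old_key = "diffusion_model.single_blocks.0.proj_lora1.down.weight"

try:
    # Buggy form: without inner parentheses, the second string is parsed as
    # startswith's integer `start` argument rather than an alternative prefix.
    old_key.startswith("diffusion_model.single_blocks", "single_blocks")
except TypeError as err:
    print(err)  # slice indices must be integers or None or have an __index__ method

# Fixed form: a single tuple of candidate prefixes.
assert old_key.startswith(("diffusion_model.single_blocks", "single_blocks"))
```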

Repro steps:

  1. instantiate a FLUX pipeline
  2. download an XLabs LoRA safetensors file
  3. run pipe.load_lora_weights(xlab_lora), passing the LoRA from step 2 (see the sketch after this list)
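A sketch of those steps, assuming a stock FLUX checkpoint; the weight name below is a placeholder for any XLabs LoRA whose state dict contains single_blocks tensors:

```python
import torch
from diffusers import FluxPipeline

# Step 1: instantiate a FLUX pipeline.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)

# Steps 2-3: load an XLabs LoRA. The weight name is a placeholder;
# the bug only triggers if the file contains single_blocks tensors.
pipe.load_lora_weights(
    "XLabs-AI/flux-lora-collection",
    weight_name="<lora_with_single_blocks>.safetensors",
)
```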

Before submitting

Who can review?

@sayakpaul, @yiyixuxu

@Clement-Lelievre changed the title fix xlabs lora conversion typo → fix xlabs FLUX lora conversion typo on Oct 4, 2024

@sayakpaul

Thank you for this. The reason it was not brought up is that we don't test with an XLabs LoRA that has single_blocks components:

self.pipeline.load_lora_weights("XLabs-AI/flux-lora-collection", weight_name="disney_lora.safetensors")
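As a side note, one way to check whether a given LoRA file would hit the buggy branch is to inspect its keys; a sketch, where the path is a placeholder for a locally downloaded file:

```python
from safetensors import safe_open

# Placeholder path to a downloaded XLabs LoRA file.
with safe_open("xlabs_lora.safetensors", framework="pt") as f:
    has_single_blocks = any("single_blocks" in key for key in f.keys())

# disney_lora.safetensors has no single_blocks entries, which is why the
# test above never reached the broken code path.
print(has_single_blocks)
```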

sayakpaul

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@Clement-Lelievre

@sayakpaul from a cursory glance on my phone, the failing test seems unrelated to this PR's diff


@sayakpaul

Failing test is completely unrelated.

@Clement-Lelievre

@sayakpaul as a follow-up: using the same XLabs LoRA, the conversion now works; however, I now get the error Adapter name(s) {'<my_adapter_name>'} not in the list of present adapters: set(), raised here since this PR

Possibly because get_list_adapters fails to add a component?
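One way to inspect what was actually registered, assuming the FluxPipeline from the repro above; the path and adapter name are placeholders:

```python
# Placeholder path to the same XLabs LoRA used above.
xlab_lora = "<path_to_xlabs_lora>.safetensors"

# Placeholder adapter name, mirroring the failing call.
pipe.load_lora_weights(xlab_lora, adapter_name="my_adapter")

# If registration silently failed, the dict below will be empty or missing a
# component, matching "not in the list of present adapters: set()".
print(pipe.get_list_adapters())
print(pipe.get_active_adapters())
```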

@sayakpaul

Can you open a new issue with a reproducible snippet?

leisuzz pushed a commit to leisuzz/diffusers that referenced this pull request on Oct 11, 2024


sayakpaul pushed a commit that referenced this pull request on Dec 23, 2024

@Clement-Lelievre @sayakpaul

#9581 (comment)