Bria fibo by galbria · Pull Request #12545 · huggingface/diffusers

Conversation


@galbria

What does this PR do?

Fixes # (issue)

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.


sayakpaul


Thanks a lot for the PR. Excited for FIBO to make strides!

I have left a bunch of comments, most of which should be easily resolvable. If not, please let me know.

Additionally, I think:

output_height, output_width, _ = image.shape
assert (output_height, output_width) == (expected_height, expected_width)
@unittest.skipIf(torch_device not in ["cuda", "xpu"], reason="float16 requires CUDA or XPU")
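For readers skimming the diff: the quoted decorator skips the float16 check on devices without CUDA/XPU, and the assert verifies that the generated image matches the requested resolution. A minimal, self-contained sketch of that pattern (not the PR's actual test; `torch_device` normally comes from `diffusers.utils.testing_utils` and is approximated here so the snippet runs standalone):

```python
import unittest

import torch

# In diffusers tests this comes from diffusers.utils.testing_utils;
# approximated here so the sketch runs on its own.
torch_device = "cuda" if torch.cuda.is_available() else "cpu"


class Float16ShapeCheckSketch(unittest.TestCase):
    @unittest.skipIf(torch_device not in ["cuda", "xpu"], reason="float16 requires CUDA or XPU")
    def test_float16_output_shape(self):
        expected_height, expected_width = 64, 64
        # Stand-in for a pipeline output image of shape (H, W, C); a real test
        # would run the fp16 pipeline and check its decoded output instead.
        image = torch.zeros(expected_height, expected_width, 3, dtype=torch.float16, device=torch_device)

        output_height, output_width, _ = image.shape
        assert (output_height, output_width) == (expected_height, expected_width)


if __name__ == "__main__":
    unittest.main()
```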


We can remove this test I guess. If not, would you mind explaining why we had to override it here?


We used it to debug something; it's redundant and has been removed.


Seems like the test is still being kept here?

DN6


sayakpaul


Added a few more comments. I also think the docs should make it clear that users should absolutely use the structured prompt.
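To make that concrete in the docs, a short usage snippet could look roughly like the sketch below. This is only a sketch: it assumes `BriaFiboPipeline` follows the usual `from_pretrained` / `__call__` interface, and the checkpoint id, prompt schema, and call arguments are placeholders rather than values taken from this PR:

```python
import torch

from diffusers import BriaFiboPipeline  # added by this PR

# Placeholder checkpoint id; use the officially released FIBO weights.
pipe = BriaFiboPipeline.from_pretrained("briaai/FIBO", torch_dtype=torch.bfloat16)
pipe.to("cuda")

# FIBO is meant to be driven by a structured prompt rather than free-form text.
# The exact schema should be documented alongside the model; this is a made-up example.
structured_prompt = (
    '{"subject": "a red vintage bicycle", '
    '"setting": "cobblestone street at dusk", '
    '"style": "cinematic photograph"}'
)

image = pipe(prompt=structured_prompt, num_inference_steps=30).images[0]
image.save("fibo_sample.png")
```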

output_height, output_width, _ = image.shape
assert (output_height, output_width) == (expected_height, expected_width)
@unittest.skipIf(torch_device not in ["cuda", "xpu"], reason="float16 requires CUDA or XPU")


Seems like the test is still being kept here?


DN6

Comment on lines 740 to 741

latents = latents.unsqueeze(dim=2)
latents = list(torch.unbind(latents, dim=0))


@kfirbria Hmm that's unusual. Is the input shape to the decoder in this format (batch_size, channels, 1, height, width)?
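For reference, those two lines insert a singleton depth/frame axis and then split the batch into per-sample tensors, which matches a decoder that expects (batch_size, channels, 1, height, width) inputs processed one sample at a time. A standalone illustration of the shape change (dimensions made up):

```python
import torch

# Pretend batch of latents: (batch_size, channels, height, width)
latents = torch.randn(2, 16, 32, 32)

# Insert a singleton axis at dim=2 -> (batch_size, channels, 1, height, width)
latents = latents.unsqueeze(dim=2)
print(latents.shape)  # torch.Size([2, 16, 1, 32, 32])

# Split along the batch axis into a list of per-sample tensors,
# each of shape (channels, 1, height, width)
latents = list(torch.unbind(latents, dim=0))
print(len(latents), latents[0].shape)  # 2 torch.Size([16, 1, 32, 32])
```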


DN6

@galbria

…oTransformer2DModel and BriaFiboPipeline classes to dummy objects for enhanced compatibility with torch and transformers.
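For context, the dummy-object entries mentioned in this commit let `import diffusers` succeed even when torch or transformers is not installed: the real classes are replaced by placeholders that raise a clear error on use. A sketch of what such an entry typically looks like, following the existing `dummy_torch_and_transformers_objects.py` pattern (that file is auto-generated, so the exact body may differ):

```python
# Pattern used in src/diffusers/utils/dummy_torch_and_transformers_objects.py
from ..utils import DummyObject, requires_backends


class BriaFiboPipeline(metaclass=DummyObject):
    _backends = ["torch", "transformers"]

    def __init__(self, *args, **kwargs):
        # Raises an informative error if torch/transformers are missing.
        requires_backends(self, ["torch", "transformers"])

    @classmethod
    def from_config(cls, *args, **kwargs):
        requires_backends(cls, ["torch", "transformers"])

    @classmethod
    def from_pretrained(cls, *args, **kwargs):
        requires_backends(cls, ["torch", "transformers"])
```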

DN6

@galbria

… in pipeline module

@galbria

…ration from existing implementations


@sayakpaul @galbria

…riaFibo classes

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

sayakpaul


Thanks a lot!


DN6 approved these changes Oct 28, 2025

@vladmandic

@sayakpaul