
Conversation

@hameerabbasi (Contributor)

What does this PR do?

Adds a ChromaInpaintPipeline

Fixes #12572
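For readers coming to this thread cold: the step that distinguishes an inpainting pipeline from plain image-to-image is blending, at each denoising step, the model's denoised latents with the original image's latents (re-noised to the current timestep) under the user-supplied mask. The sketch below illustrates that blend in isolation; the tensor names and shapes are illustrative only, not the PR's actual variables.

```python
import torch

def blend_latents(denoised_latents, init_latents_at_t, mask):
    # mask: 1.0 where the region should be regenerated, 0.0 where the
    # original image must be preserved; broadcasts over the channel dim
    return mask * denoised_latents + (1.0 - mask) * init_latents_at_t

# Toy shapes: batch=1, 4 latent channels, 8x8 latent grid
denoised = torch.randn(1, 4, 8, 8)
init = torch.randn(1, 4, 8, 8)  # original-image latents, re-noised to step t
mask = torch.zeros(1, 1, 8, 8)
mask[..., :, :4] = 1.0          # regenerate only the left half

blended = blend_latents(denoised, init, mask)
```

Outside the mask the blend returns the original latents unchanged, which is what keeps the unmasked portion of the image pixel-stable across steps.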

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

cc @yiyixuxu and @asomoza

@hameerabbasi force-pushed the chroma-inpaint branch 4 times, most recently from e2c7fb5 to 1e3a4c3 on December 17, 2025 at 08:33
@hameerabbasi (Contributor, Author)

cc @DN6 A review would be appreciated.

@lodestone-rock

bump cc @yiyixuxu

@yiyixuxu (Collaborator) commented Jan 6, 2026

@bot /style

@github-actions bot (Contributor) commented Jan 6, 2026

Style fix ran successfully without modifying any files.

@yiyixuxu (Collaborator) commented Jan 6, 2026

can you share code example + output?

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@DN6 (Collaborator) left a comment


@hameerabbasi Could you share a couple of example outputs please?

Comment on lines +183 to +186
# pixi environments
.pixi/*
!.pixi/config.toml
pixi.toml
Collaborator:

Might this have been accidentally committed from a local environment?


# Extend the prompt attention mask to account for image tokens in the final sequence
attention_mask = torch.cat(
[attention_mask, torch.ones(batch_size, sequence_length, device=attention_mask.device)],
Collaborator:

The T2I and I2I pipelines set the mask to bool dtype:

[attention_mask, torch.ones(batch_size, sequence_length, device=attention_mask.device, dtype=torch.bool)],
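For context on why the suggested `dtype=torch.bool` matters: `torch.cat` type-promotes mixed-dtype inputs, so concatenating a boolean text mask with default-float `torch.ones` silently upcasts the combined mask to float32, diverging from the T2I/I2I pipelines. A small self-contained demonstration (the shapes are illustrative, not the pipeline's actual values):

```python
import torch

batch_size, text_len, image_len = 2, 8, 16
# Text attention mask, boolean as in the Chroma T2I and I2I pipelines
attention_mask = torch.ones(batch_size, text_len, dtype=torch.bool)

# Without an explicit dtype, torch.ones is float32 and torch.cat
# promotes the whole result to float32
promoted = torch.cat([attention_mask, torch.ones(batch_size, image_len)], dim=1)

# With dtype=torch.bool the concatenated mask stays boolean
kept = torch.cat(
    [attention_mask, torch.ones(batch_size, image_len, dtype=torch.bool)], dim=1
)
```

Keeping the mask boolean avoids a dtype mismatch with the masks the other Chroma pipelines feed to the transformer.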



Development

Successfully merging this pull request may close this issue: Chroma inpainting pipeline

5 participants