[CI] Refactor Bria Transformer Tests#13341

Open

DN6 wants to merge 2 commits into main from bria-test-refactor

Conversation

Collaborator

@DN6 DN6 commented Mar 26, 2026

What does this PR do?

Fixes # (issue)

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@DN6 DN6 requested a review from sayakpaul March 27, 2026 07:51
@sayakpaul
Member

@claude review this PR.

@claude

claude bot commented Mar 27, 2026

Claude finished @sayakpaul's task in 2m 53s


PR Review: [CI] Refactor Bria Transformer Tests

  • Gather context and read files
  • Read review rules
  • Analyze changed files
  • Post review feedback

Overall this is a clean refactor to the new BaseModelTesterConfig / mixin style, adding meaningful new test coverage (IPAdapterTesterMixin, LoraTesterMixin, LoraHotSwappingForModelTesterMixin, TorchCompileTesterMixin for Fibo). No correctness bugs in the model code itself — these are test-only changes. A few minor issues worth considering:


Issues

1. generator property called once per tensor — each gets an independent seed-0 generator

In both BriaTransformerTesterConfig and BriaFiboTransformerTesterConfig, get_dummy_inputs accesses self.generator once per randn_tensor call. Since generator is a @property that constructs a fresh torch.Generator("cpu").manual_seed(0) on each access, every tensor independently starts from seed 0. For identically-shaped tensors this produces identical values; for differently-shaped ones the overlap is partial. The effect is subtle but unconventional, and it could mask model bugs (e.g., a model that misbehaves when certain inputs are equal would never be exercised with distinct inputs).

The idiomatic approach is to create the generator once per get_dummy_inputs() call:

# tests/models/transformers/test_models_transformer_bria.py, get_dummy_inputs
def get_dummy_inputs(self, batch_size: int = 1) -> dict[str, torch.Tensor]:
    generator = torch.Generator("cpu").manual_seed(0)  # one generator, shared across all calls
    ...
    return {
        "hidden_states": randn_tensor(..., generator=generator, ...),
        "encoder_hidden_states": randn_tensor(..., generator=generator, ...),
        ...
    }

Same applies to BriaFiboTransformerTesterConfig.get_dummy_inputs() (test_models_transformer_bria_fibo.py:69).
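The pitfall is easy to reproduce in isolation. The sketch below simulates the per-access @property pattern with a helper function (the shapes are illustrative, not the actual test config): two same-shaped tensors drawn from independently constructed seed-0 generators come out identical, while sharing one generator gives independent draws.

```python
import torch

# Simulate the @property: every access builds a fresh seed-0 generator.
def make_generator():
    return torch.Generator("cpu").manual_seed(0)

# Two same-shaped "dummy inputs", each drawn from its own fresh generator...
hidden_states = torch.randn(1, 16, 4, generator=make_generator())
encoder_hidden_states = torch.randn(1, 16, 4, generator=make_generator())

# ...are bit-for-bit identical, which is the masking effect described above.
print(torch.equal(hidden_states, encoder_hidden_states))  # True

# Sharing one generator across draws restores distinct tensors.
shared = make_generator()
a = torch.randn(1, 16, 4, generator=shared)
b = torch.randn(1, 16, 4, generator=shared)
print(torch.equal(a, b))  # False
```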

2. TestBriaTransformerLoRAHotSwap.get_dummy_inputs doesn't seed its tensors

The override at the bottom of test_models_transformer_bria.py constructs tensors with bare randn_tensor(..., device=torch_device) — no generator. The LoRA mixin calls torch.manual_seed(0) in some places, but the inputs themselves are non-reproducible, which could cause flakiness. Consider adding a generator here too for consistency.
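One way the seeded override could look. This is a hypothetical sketch using plain torch.randn with made-up keys and shapes; the real override uses diffusers' randn_tensor, the model's actual input shapes, and places tensors on torch_device. The point is only that a per-call seeded generator makes repeated calls return identical inputs.

```python
import torch

def get_dummy_inputs(batch_size: int = 1) -> dict:
    # One seeded generator per call keeps every rerun reproducible.
    generator = torch.Generator("cpu").manual_seed(0)
    return {
        # Keys and shapes here are illustrative placeholders.
        "hidden_states": torch.randn(batch_size, 16, 4, generator=generator),
        "timestep": torch.randint(0, 1000, (batch_size,), generator=generator),
    }

# Two calls now produce identical inputs, so hot-swap comparisons are stable.
first, second = get_dummy_inputs(), get_dummy_inputs()
print(torch.equal(first["hidden_states"], second["hidden_states"]))  # True
```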


Nits

  • The input_shape / output_shape property order is swapped in BriaTransformerTesterConfig vs the original — both still return (16, 4), so no functional change, but a small readability inconsistency.
  • Removing the inline comments (# "ip_adapter" (cross-attention weights), # "image_proj" (ImageProjection layer weights)) from create_bria_ip_adapter_state_dict reduces readability slightly for a reader not familiar with IP-adapter structure.

The uses_custom_attn_processor = True removal is intentional and correct — the new ModelTesterMixin (in testing_utils/common.py) doesn't have test_set_attn_processor_for_determinism, so this flag isn't needed.

Comment on lines +29 to +32
LoraHotSwappingForModelTesterMixin,
LoraTesterMixin,
ModelTesterMixin,
TorchCompileTesterMixin,
Member


I guess we can remove the expensive test suites from here, given the popularity of the model?

from ..testing_utils import (
BaseModelTesterConfig,
ModelTesterMixin,
TorchCompileTesterMixin,
Member


Same comment as above.
