Conversation
@claude review this PR.
Claude finished @sayakpaul's task in 2m 53s

PR Review: [CI] Refactor Bria Transformer Tests
Overall this is a clean refactor to the new ….

Issues

1. In both …, the idiomatic approach is to create the generator once per `get_dummy_inputs` call and share it across all the `randn_tensor` calls:

```python
# tests/models/transformers/test_models_transformer_bria.py, get_dummy_inputs
def get_dummy_inputs(self, batch_size: int = 1) -> dict[str, torch.Tensor]:
    generator = torch.Generator("cpu").manual_seed(0)  # one generator, shared across all calls
    ...
    return {
        "hidden_states": randn_tensor(..., generator=generator, ...),
        "encoder_hidden_states": randn_tensor(..., generator=generator, ...),
        ...
    }
```

Same applies to ….

2. The override at the bottom of ….

Nits
The …
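The generator-sharing pattern from issue 1 can be sketched as a small runnable example. This is a hedged illustration, not the PR's actual code: the shapes and key names are made up, and plain `torch.randn` stands in for diffusers' `randn_tensor`.

```python
# Hypothetical sketch: one seeded generator per get_dummy_inputs call keeps
# the dummy inputs deterministic across calls. Shapes, key names, and the use
# of torch.randn (in place of diffusers' randn_tensor) are assumptions.
import torch

def get_dummy_inputs(batch_size: int = 1) -> dict[str, torch.Tensor]:
    # One generator, created and seeded per call, shared by every draw below.
    generator = torch.Generator("cpu").manual_seed(0)
    return {
        "hidden_states": torch.randn(batch_size, 4, 8, generator=generator),
        "encoder_hidden_states": torch.randn(batch_size, 2, 8, generator=generator),
    }

a, b = get_dummy_inputs(), get_dummy_inputs()
# Reseeding per call makes repeated calls return identical tensors.
assert torch.equal(a["hidden_states"], b["hidden_states"])
assert torch.equal(a["encoder_hidden_states"], b["encoder_hidden_states"])
```

Because the generator is re-created and re-seeded on every call, each call replays the same random stream, so the dummy inputs stay reproducible without seeding global RNG state.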
```python
    LoraHotSwappingForModelTesterMixin,
    LoraTesterMixin,
    ModelTesterMixin,
    TorchCompileTesterMixin,
```
I guess we can remove the expensive test suites from here, given the model's popularity?
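A hedged sketch of what that trimming looks like: the mixin names follow the excerpt above, but the mixin bodies and the test-class name here are made-up stubs, not diffusers code.

```python
# Hypothetical sketch of the suggestion: drop the expensive mixins
# (e.g. TorchCompileTesterMixin) from the test class of a less popular model.
class ModelTesterMixin:            # stand-in stub for the core (cheap) test mixin
    def test_forward(self): ...

class TorchCompileTesterMixin:     # stand-in stub for an expensive test mixin
    def test_compile(self): ...

# Keep only the cheap mixin; the compile tests no longer run for this model.
class BriaTransformerTests(ModelTesterMixin):
    pass

assert hasattr(BriaTransformerTests, "test_forward")
assert not hasattr(BriaTransformerTests, "test_compile")
```

Since test runners collect `test_*` methods through the class's MRO, removing a mixin from the bases is enough to drop its whole suite for this model.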
```python
from ..testing_utils import (
    BaseModelTesterConfig,
    ModelTesterMixin,
    TorchCompileTesterMixin,
```
What does this PR do?
Fixes # (issue)
Before submitting
… documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.