
upsample_size not an argument for AttnUpBlock2D #3249


Closed
will-rice opened this issue Apr 26, 2023 · 3 comments · Fixed by #3275
Labels
bug Something isn't working

Comments

@will-rice
Contributor

Describe the bug

The argument upsample_size is passed to each upsampling block here. However, it looks like AttnUpBlock2D doesn't accept this argument. My code to reproduce is below. Am I using this incorrectly, or should AttnUpBlock2D accept that argument (even if unused) to maintain consistency across upsample blocks?
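
For reference, the mismatch looks roughly like the sketch below (signatures paraphrased from the v0.16.0 sources, not copied verbatim): the UNet forward loop calls every up block with the same keywords, but only the plain block declares upsample_size.

# Paraphrased sketch of the mismatch, not the exact diffusers source.
# unet_2d_condition.py calls every up block the same way:
sample = upsample_block(
    hidden_states=sample, temb=emb, res_hidden_states_tuple=res_samples, upsample_size=upsample_size
)

# UpBlock2D declares the keyword ...
def forward(self, hidden_states, res_hidden_states_tuple, temb=None, upsample_size=None): ...

# ... but AttnUpBlock2D does not, so the call above raises a TypeError:
def forward(self, hidden_states, res_hidden_states_tuple, temb=None): ...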

Reproduction

import torch

from diffusers import UNet2DConditionModel

model = UNet2DConditionModel(
    sample_size=256,  # the target image resolution
    in_channels=6,  # the number of input channels (3 for RGB images; 6 here)
    out_channels=3,  # the number of output channels
    layers_per_block=2,  # how many ResNet layers to use per UNet block
    block_out_channels=(128, 128, 256, 256, 512, 512),  # the number of output channels for each UNet block
    down_block_types=(
        "DownBlock2D",  # a regular ResNet downsampling block
        "DownBlock2D",
        "DownBlock2D",
        "DownBlock2D",
        "AttnDownBlock2D",  # a ResNet downsampling block with spatial self-attention
        "DownBlock2D",
    ),
    up_block_types=(
        "UpBlock2D",  # a regular ResNet upsampling block
        "AttnUpBlock2D",  # a ResNet upsampling block with spatial self-attention
        "UpBlock2D",
        "UpBlock2D",
        "UpBlock2D",
        "UpBlock2D",
    ),
    cross_attention_dim=256
)

outputs = model(torch.randn(8, 6, 256, 256), 1, encoder_hidden_states=torch.randn(8, 4, 256))
outputs[0].shape

Logs

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-5-ea419884a028> in <module>
     26 )
     27 
---> 28 outputs = model(torch.randn(8, 6, 256, 256), 1, encoder_hidden_states=torch.randn(8, 4, 256))
     29 outputs[0].shape

~/.pyenv/versions/notebook-3.8/lib/python3.8/site-packages/torch/nn/modules/module.py in _call_impl(self, *args, **kwargs)
   1499                 or _global_backward_pre_hooks or _global_backward_hooks
   1500                 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1501             return forward_call(*args, **kwargs)
   1502         # Do not call functions when jit is used
   1503         full_backward_hooks, non_full_backward_hooks = [], []

~/.pyenv/versions/notebook-3.8/lib/python3.8/site-packages/diffusers/models/unet_2d_condition.py in forward(self, sample, timestep, encoder_hidden_states, class_labels, timestep_cond, attention_mask, cross_attention_kwargs, down_block_additional_residuals, mid_block_additional_residual, return_dict)
    781                 )
    782             else:
--> 783                 sample = upsample_block(
    784                     hidden_states=sample, temb=emb, res_hidden_states_tuple=res_samples, upsample_size=upsample_size
    785                 )

~/.pyenv/versions/notebook-3.8/lib/python3.8/site-packages/torch/nn/modules/module.py in _call_impl(self, *args, **kwargs)
   1499                 or _global_backward_pre_hooks or _global_backward_hooks
   1500                 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1501             return forward_call(*args, **kwargs)
   1502         # Do not call functions when jit is used
   1503         full_backward_hooks, non_full_backward_hooks = [], []

TypeError: forward() got an unexpected keyword argument 'upsample_size'


System Info

diffusers v0.16.0
@will-rice added the bug (Something isn't working) label on Apr 26, 2023
@sayakpaul
Member

Cc: @williamberman

@patrickvonplaten
Contributor

Good catch @will-rice (nice to see you here as well btw :-))

Do you mind opening a PR? I think we should just add upsample_size to AttnUpBlock2D and AttnDownBlock2D.
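
Roughly something like the sketch below (a paraphrase of the idea only, not the actual patch; the resnets/attentions/upsamplers attribute names follow the existing block layout):

from torch import nn


class AttnUpBlock2D(nn.Module):
    # __init__ and the resnet/attention loop stay as they are; only the
    # forward signature and the upsampler call change.

    def forward(self, hidden_states, res_hidden_states_tuple, temb=None, upsample_size=None):
        # ... existing resnet + attention loop over res_hidden_states_tuple ...

        if self.upsamplers is not None:
            for upsampler in self.upsamplers:
                # pass the size hint through, the same way UpBlock2D already does
                hidden_states = upsampler(hidden_states, upsample_size)

        return hidden_states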

@will-rice
Contributor Author

I don't mind. I'll work on it.
