
[Attention processor] Create SDPA versions of attention processors #3464


Closed · sayakpaul opened this issue May 17, 2023 · 1 comment
Assignees: sayakpaul
Labels: stale (Issues that haven't received updates)

Comments

sayakpaul (Member) commented May 17, 2023

diffusers currently supports the following PT 2.0 variants of attention processors:

  • AttnProcessor => AttnProcessor2_0
  • AttnAddedKVProcessor => AttnAddedKVProcessor2_0

The following are not supported:

  • SlicedAttnProcessor
  • SlicedAttnAddedKVProcessor
  • LoRAAttnProcessor
  • CustomDiffusionAttnProcessor

We should add SDPA (`torch.nn.functional.scaled_dot_product_attention`) versions of the above processors. This would essentially eliminate the need to use xformers.
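
For context, the existing `*2_0` processors are thin wrappers around `torch.nn.functional.scaled_dot_product_attention`. Below is a minimal sketch of that call pattern, not the actual diffusers implementation: the `SDPAAttnProcessorSketch` name is made up, the attention-mask handling is simplified, and the `attn` attributes (`to_q`, `to_k`, `to_v`, `to_out`, `heads`) are assumed to match diffusers' `Attention` module.

```python
import torch.nn.functional as F


class SDPAAttnProcessorSketch:
    """Illustrative SDPA-based processor (not the actual diffusers implementation)."""

    def __call__(self, attn, hidden_states, encoder_hidden_states=None, attention_mask=None):
        # `attn` is assumed to expose to_q/to_k/to_v, a to_out [Linear, Dropout]
        # ModuleList, and a `heads` count, as diffusers' Attention module does.
        batch_size = hidden_states.shape[0]
        context = hidden_states if encoder_hidden_states is None else encoder_hidden_states

        query = attn.to_q(hidden_states)
        key = attn.to_k(context)
        value = attn.to_v(context)

        # Reshape to (batch, heads, seq_len, head_dim), the layout SDPA expects.
        head_dim = query.shape[-1] // attn.heads
        query = query.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2)
        key = key.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2)
        value = value.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2)

        # PyTorch 2.0 fused attention: dispatches to flash / memory-efficient
        # kernels when available, which is what makes xformers unnecessary.
        # NOTE: a real processor would first broadcast attention_mask to
        # (batch, heads, q_len, kv_len); that step is omitted here.
        hidden_states = F.scaled_dot_product_attention(
            query, key, value, attn_mask=attention_mask, dropout_p=0.0
        )

        hidden_states = hidden_states.transpose(1, 2).reshape(batch_size, -1, attn.heads * head_dim)
        hidden_states = attn.to_out[0](hidden_states)  # output projection
        hidden_states = attn.to_out[1](hidden_states)  # dropout
        return hidden_states
```

A processor following this shape could then be registered via `unet.set_attn_processor(...)`; the real sliced/added-KV/LoRA/Custom Diffusion variants would additionally have to fold slicing, the extra KV projections, or the extra weights into this skeleton.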

@sayakpaul sayakpaul changed the title Create SDPA versions of attention processors [Attention processor] Create SDPA versions of attention processors May 17, 2023
@sayakpaul sayakpaul self-assigned this May 17, 2023
github-actions (bot) commented

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

@github-actions github-actions bot added the stale Issues that haven't received updates label Jun 30, 2023
@github-actions github-actions bot closed this as completed Jul 8, 2023