diffusers currently supports the following PT 2.0 variants of attention processors:
AttnProcessor => AttnProcessor2_0
AttnAddedKVProcessor => AttnAddedKVProcessor2_0
The following are not supported:
SlicedAttnProcessor
SlicedAttnAddedKVProcessor
LoRAAttnProcessor
CustomDiffusionAttnProcessor
We should add SDPA (`torch.nn.functional.scaled_dot_product_attention`) versions of the above processors. This would essentially eliminate the need for xformers.
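For context, a minimal sketch of what an SDPA-based processor would wrap (this is an illustration of the PyTorch 2.0 fused-attention call, not diffusers' actual processor implementation; the `sdpa_attention` helper name is hypothetical):

```python
import torch
import torch.nn.functional as F

def sdpa_attention(query, key, value, attention_mask=None):
    # Hypothetical helper: PyTorch 2.0's fused kernel computes
    # softmax(Q K^T / sqrt(d)) V in one call, dispatching to
    # FlashAttention/memory-efficient backends when available,
    # which is what makes xformers largely unnecessary.
    return F.scaled_dot_product_attention(
        query, key, value, attn_mask=attention_mask
    )

# Shapes are (batch, heads, seq_len, head_dim).
q = torch.randn(2, 8, 64, 40)
k = torch.randn(2, 8, 64, 40)
v = torch.randn(2, 8, 64, 40)
out = sdpa_attention(q, k, v)
```

An SDPA variant of each unsupported processor would replace its manual attention-score computation with this single call, keeping the surrounding logic (slicing, added KV projections, LoRA layers, etc.) intact.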
sayakpaul changed the title from "Create SDPA versions of attention processors" to "[Attention processor] Create SDPA versions of attention processors" on May 17, 2023.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.