Commit 949f97b

zheliuyun authored and gazagna-qc committed

Add conditional checks to _check_and_adjust_attn_implementation() (huggingface#41542)

1 parent edff0f9 · commit 949f97b

File tree

1 file changed (+1, −0)


src/transformers/modeling_utils.py

Lines changed: 1 addition & 0 deletions

@@ -2499,6 +2499,7 @@ def _check_and_adjust_attn_implementation(
             and self._supports_flash_attn
             and not (is_flash_attn_2_available() or is_flash_attn_3_available())
             and is_kernels_available()
+            and not is_torch_npu_available()
         ):
             if attn_implementation.endswith("2"):
                 applicable_attn_implementation = "kernels-community/flash-attn"
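The added check keeps the kernels-based flash-attention fallback from being selected on Ascend NPU devices, where the `kernels-community/flash-attn` kernel does not apply. A minimal standalone sketch of the guard's logic follows; the availability helpers are replaced by plain boolean parameters, and the leading `startswith("flash_attention")` condition (which sits above the visible diff hunk) is an assumption, as is the helper name `pick_attn_implementation`:

```python
def pick_attn_implementation(
    attn_implementation,
    supports_flash_attn=True,
    flash_attn_2_available=False,
    flash_attn_3_available=False,
    kernels_available=True,
    torch_npu_available=False,
):
    """Sketch of the fallback condition in _check_and_adjust_attn_implementation.

    The requested implementation is swapped for the community kernel only when
    flash attention itself is not installed, the `kernels` package is available,
    and (per this commit) we are NOT running on a torch NPU device.
    """
    if (
        attn_implementation.startswith("flash_attention")  # assumed leading check
        and supports_flash_attn
        and not (flash_attn_2_available or flash_attn_3_available)
        and kernels_available
        and not torch_npu_available  # new condition added by this commit
    ):
        if attn_implementation.endswith("2"):
            return "kernels-community/flash-attn"
    # Otherwise keep whatever the caller requested.
    return attn_implementation


# Without an NPU, the missing flash-attn is replaced by the community kernel;
# on an NPU, the requested implementation is left untouched.
print(pick_attn_implementation("flash_attention_2"))
print(pick_attn_implementation("flash_attention_2", torch_npu_available=True))
```

With the defaults above (flash-attn absent, `kernels` present, no NPU), `"flash_attention_2"` resolves to `"kernels-community/flash-attn"`; setting `torch_npu_available=True` short-circuits the fallback and the original string is returned.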

0 commit comments