
move mask as sdpa input instead of attribute #3036


Closed
cccclai wants to merge 4 commits

Conversation

@cccclai (Contributor) commented Apr 14, 2024

Stack from ghstack (oldest at bottom):

sdpa (https://pytorch.org/docs/stable/generated/torch.nn.functional.scaled_dot_product_attention.html) takes the attention mask as an input; refactor the SDPA module's inputs to be closer to the sdpa signature.

Differential Revision: D56119739
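
For context, here is a minimal before/after sketch of what moving the mask from an attribute to an input looks like, assuming a simple SDPA wrapper module (the class and argument names below are illustrative, not the actual executorch code):

```python
import torch
import torch.nn.functional as F

# Hypothetical sketch of the refactor; names are illustrative only.

class SDPAMaskAsAttribute(torch.nn.Module):
    """Before: the attention mask is baked into the module at construction."""

    def __init__(self, mask: torch.Tensor):
        super().__init__()
        self.mask = mask  # mask stored as a module attribute

    def forward(self, q, k, v):
        return F.scaled_dot_product_attention(q, k, v, attn_mask=self.mask)


class SDPAMaskAsInput(torch.nn.Module):
    """After: the mask is a forward input, mirroring the
    F.scaled_dot_product_attention signature."""

    def forward(self, q, k, v, mask):
        return F.scaled_dot_product_attention(q, k, v, attn_mask=mask)


q = k = v = torch.randn(1, 4, 8, 16)  # (batch, heads, seq_len, head_dim)
mask = torch.zeros(1, 1, 8, 8)        # additive float mask; zeros = no masking
out = SDPAMaskAsInput()(q, k, v, mask)
```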


@pytorch-bot (bot) commented Apr 14, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/3036

Note: Links to docs will display an error until the docs builds have been completed.

❌ 2 New Failures

As of commit 20854b6 with merge base 075fe40:

NEW FAILURES - The following jobs have failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label Apr 14, 2024
@facebook-github-bot commented: This pull request was exported from Phabricator. Differential Revision: D56119739


@mergennachin self-requested a review April 17, 2024 16:28
@facebook-github-bot commented: This pull request has been merged in b341223.

cccclai added a commit to cccclai/executorch-1 that referenced this pull request Apr 18, 2024
Summary:
Pull Request resolved: pytorch#3036

sdpa (https://pytorch.org/docs/stable/generated/torch.nn.functional.scaled_dot_product_attention.html) takes the attention mask as an input; refactor the SDPA module's inputs to be closer to the sdpa signature.
ghstack-source-id: 222650466
exported-using-ghexport

Reviewed By: mergennachin

Differential Revision: D56119739

fbshipit-source-id: d9adda66e540abc518b7ffb6a5ebd2aab1626b3b
(cherry picked from commit b341223)
guangy10 pushed a commit that referenced this pull request Apr 18, 2024
(cherry picked from commit b341223)
Labels: CLA Signed · fb-exported · Merged