
[torchlib] Fix scaled_dot_product_attention #968


Closed
justinchuby opened this issue Aug 4, 2023 · 1 comment · Fixed by #970
Labels
module: torchlib Related to the torch/aten function lib in development

Comments

@justinchuby
Collaborator

scaled_dot_product_attention no longer segfaults, but the numbers mismatch: https://github.com/microsoft/onnxscript/runs/15634782795

@justinchuby justinchuby added module: torchlib Related to the torch/aten function lib in development bug Something isn't working labels Aug 4, 2023
@justinchuby justinchuby removed the bug Something isn't working label Aug 4, 2023
@justinchuby
Collaborator Author

No bugs. We just need to update the tolerance and enable the tests.
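To illustrate why a relaxed tolerance (rather than a bug fix) can resolve this kind of mismatch, here is a minimal numpy sketch — not the project's actual test code — showing that the same attention computation run through a float32 round-trip (as an exported graph would be) drifts slightly from the float64 reference, while still agreeing within a small rtol/atol:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    # Reference attention: softmax(Q @ K^T / sqrt(d)) @ V
    d = q.shape[-1]
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    scores = scores - scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
q = rng.standard_normal((1, 4, 8))
k = rng.standard_normal((1, 4, 8))
v = rng.standard_normal((1, 4, 8))

# float64 reference result.
expected = scaled_dot_product_attention(q, k, v)

# Simulate the lower-precision path: cast inputs to float32,
# compute, then compare back against the float64 reference.
actual = scaled_dot_product_attention(
    q.astype(np.float32), k.astype(np.float32), v.astype(np.float32)
).astype(np.float64)

# Exact equality fails, but a modestly relaxed tolerance passes.
assert not np.array_equal(actual, expected)
assert np.allclose(actual, expected, rtol=1e-4, atol=1e-5)
```

The specific rtol/atol values here are illustrative; the appropriate tolerance for the torchlib tests depends on the dtypes and shapes exercised by the test suite.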
