[torchlib] Fix scaled_dot_product_attention
#968
Labels
module: torchlib (Related to the torch/aten function lib in development)
`scaled_dot_product_attention` no longer segfaults, but the numbers mismatch: https://github.com/microsoft/onnxscript/runs/15634782795
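For reference while chasing the mismatch, below is a minimal sketch (not the torchlib implementation) of the math the exported op should reproduce, softmax(Q K^T / sqrt(d_k)) V, compared against `torch.nn.functional.scaled_dot_product_attention`. The `sdpa_reference` helper and the tensor shapes are illustrative assumptions, not taken from the failing run:

```python
import math

import torch


def sdpa_reference(query, key, value):
    # Plain-PyTorch reference: softmax(Q @ K^T / sqrt(d_k)) @ V, no mask, no dropout.
    scale = 1.0 / math.sqrt(query.shape[-1])
    attn = torch.softmax(query @ key.transpose(-2, -1) * scale, dim=-1)
    return attn @ value


# Illustrative shapes: (batch, heads, sequence, head_dim).
query, key, value = (torch.randn(2, 4, 8, 16) for _ in range(3))
expected = torch.nn.functional.scaled_dot_product_attention(query, key, value)
actual = sdpa_reference(query, key, value)
print("max abs diff:", (actual - expected).abs().max().item())
```

Running the same inputs through the ONNX decomposition and comparing against this reference (or against the PyTorch builtin) should help localize whether the discrepancy comes from scaling, masking, or the softmax step.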