feat(atenlib): add ops (new_empty, new_empty_strided) #436
xiaowuhu merged 10 commits into microsoft:main from xiaowuhu:xiaowu/addOps(0214)
Conversation
Codecov Report
@@            Coverage Diff             @@
##             main     #436      +/-   ##
==========================================
+ Coverage   71.27%   71.30%   +0.02%
==========================================
  Files         108      108
  Lines       10475    10489      +14
  Branches     1085     1088       +3
==========================================
+ Hits         7466     7479      +13
  Misses       2699     2699
- Partials      310      311       +1
onnxscript/tests/function_libs/torch_aten/ops_correctness_test.py
if dtype == -1:
    result = op.CastLike(result, self)
else:
    result = op.Cast(result, to=dtype)
`dtype` needs to be the same for both branches; this op may need to be trace-only.
I don't get your point.
ONNX requires that both branches of an If node have the same output type. Since we are casting here, the graph violates this constraint. I think Rama is looking at potential solutions, but for now we will need to mark the function trace-only, or else the graph would be invalid.
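A minimal pure-Python sketch of why trace-only sidesteps the constraint: at trace time `dtype` is a plain Python int, so the branch is resolved in Python and only one cast node is ever emitted, rather than an ONNX If node whose two branches would produce different types. The `cast`/`cast_like` helpers below are hypothetical stand-ins for `op.Cast`/`op.CastLike`, not the onnxscript API.

```python
# Hypothetical stand-ins for the ONNX ops; in onnxscript these would be
# op.Cast / op.CastLike nodes emitted into the graph.
def cast(tensor, to):
    return {"op": "Cast", "input": tensor, "to": to}

def cast_like(tensor, other):
    return {"op": "CastLike", "input": tensor, "like": other}

def new_empty_traced(self_tensor, result, dtype=-1):
    """Trace-only version: `dtype` is a plain Python int at trace time,
    so this `if` runs in Python, and exactly ONE Cast or CastLike node
    ends up in the exported graph -- no ONNX If node with mismatched
    branch output types is ever created."""
    if dtype == -1:
        return cast_like(result, self_tensor)
    return cast(result, to=dtype)
```

With a script-mode (non-traced) function, the same `if` would instead become an If node carrying both casts as subgraphs, which is exactly the invalid case described above.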
return result
@torch_op("aten::new_empty_strided")
Potentially trace-only as well.
skip("empty_like", reason="Using zeros_like to simulate empty_like"),
xfail("logcumsumexp", reason="naive implementation not numerically stable"),
xfail("logsumexp", reason="ONNX Runtime 1.13 does not support ReduceLogSumExp-18"),
xfail("new_empty", reason="Using zeros to simulate empty"),
@xiaowuhu I also realized this can succeed unexpectedly. In that case a skip may be a better choice than xfail.
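The distinction matters because `empty` returns uninitialized memory, so a comparison against a zeros-based simulation can pass by luck; an `xfail` that unexpectedly passes is surfaced as XPASS (or a failure under strict xfail), while a `skip` never runs the comparison at all. A stdlib-only sketch of these marker semantics (hypothetical helper, not the actual pytest internals):

```python
def run_case(assertion_passes, marker=None):
    """Model pytest-style markers. `assertion_passes` is True when the
    underlying comparison would succeed. Returns the reported status."""
    if marker == "skip":
        return "SKIPPED"  # test body is never evaluated at all
    if marker == "xfail":
        # A nondeterministic comparison against uninitialized `empty`
        # memory can pass by luck, producing a surprising XPASS report.
        return "XPASS" if assertion_passes else "XFAIL"
    return "PASSED" if assertion_passes else "FAILED"
```

Marking the nondeterministic cases `skip` keeps the report stable regardless of what the uninitialized memory happens to contain.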
No description provided.