Merge 'initializers' into 'TorchScriptGraph' #480
Conversation
Codecov Report
@@           Coverage Diff           @@
##             main     #480   +/-   ##
=======================================
  Coverage   72.45%   72.46%
=======================================
  Files         109      109
  Lines       10602    10607    +5
  Branches     1093     1093
=======================================
+ Hits         7682     7686    +4
- Misses       2613     2614    +1
  Partials      307      307
I think torch-nightly is broken by #460
@titaiwangms thanks, it's hard to figure out which failures are known... Does the CI here depend on code from the torch exporter? This API change requires modifications on both sides; I was planning to merge this first, hoping CI could be all green.
I don't think onnx-script relies on the torch exporter, but what part of this PR needs a torch exporter modification? As far as I know, the experimental branch is broken due to lack of maintenance on the ort-experimental branch, and torch-nightly fails because of the squeeze op. There are two others erroring with ORT that I don't quite understand, but this PR seems to be unrelated...
The API changed,
Fixes #389
@property
def torch_graph(self):
    return self._torch_graph
@property
def initializers(self) -> Mapping[str, torch.Tensor]:
Would a brief docstring for this property help?
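For example, something along these lines might do it (a sketch only; the docstring wording, the `__init__`, and the `_initializers` backing field shown here are assumptions for illustration, not the PR's actual code):

    from typing import Mapping
    import torch

    class TorchScriptGraph:
        def __init__(self):
            # Hypothetical backing store for the property below.
            self._initializers = {}

        @property
        def initializers(self) -> Mapping[str, torch.Tensor]:
            """Mapping from initializer (constant tensor) names to their torch.Tensor values.

            These tensors are emitted as ONNX initializers when the graph is
            exported to a ModelProto.
            """
            return self._initializers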
@@ -17,23 +17,22 @@
@unittest.skipIf(version_utils.torch_older_than("2.0"), "torchscript in 1.13 not supported")
class TestTorchScriptTracingEvaluator(unittest.TestCase):
    def setUp(self):
        # FIXME: Currently this must match with the import line
        # `from onnxscript import opset17 as op`, which restricts opset to be 17 in these
The model opset can actually be different from the local functions' opsets I think. Do you know if that's true?
Yeah, it can be different, under a constraint:
https://github.com/onnx/onnx/blob/4b2d50334914621835cc1e8dadd4fe82b6b9876c/onnx/onnx.in.proto#L824-L828
// The operator sets imported by FunctionProto should be compatible with the ones
// imported by ModelProto. Example, if same operator set say 'A' is imported by FunctionProto
// and ModelProto then versions for the operator set may be different but,
// the operator schema returned for op_type, domain, version combination
// for both the versions should be same.
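For a concrete illustration (a hypothetical sketch built with onnx.helper, not code from this PR; the `MyRelu` function and `custom` domain names are made up), a local function can import the default ONNX domain at opset 17 while the model itself imports it at opset 18, since the Relu schema is identical across those versions:

    import onnx
    from onnx import TensorProto, helper

    # Local function: imports the default ONNX domain at opset 17.
    relu_fn = helper.make_function(
        domain="custom",
        fname="MyRelu",
        inputs=["x"],
        outputs=["y"],
        nodes=[helper.make_node("Relu", ["x"], ["y"])],
        opset_imports=[helper.make_opsetid("", 17)],
    )

    graph = helper.make_graph(
        [helper.make_node("MyRelu", ["x"], ["y"], domain="custom")],
        "example",
        [helper.make_tensor_value_info("x", TensorProto.FLOAT, [1])],
        [helper.make_tensor_value_info("y", TensorProto.FLOAT, [1])],
    )

    # Model: imports the default ONNX domain at opset 18, a different
    # version from the one imported by the local function above.
    model = helper.make_model(
        graph,
        opset_imports=[helper.make_opsetid("", 18), helper.make_opsetid("custom", 1)],
    )
    model.functions.extend([relu_fn])

    print([(o.domain, o.version) for o in model.opset_import])    # [('', 18), ('custom', 1)]
    print([(o.domain, o.version) for o in relu_fn.opset_import])  # [('', 17)]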