
Merge 'initializers' into 'TorchScriptGraph' #480

Merged (5 commits, Mar 1, 2023)

Conversation

BowenBao (Contributor)

Reduce friction.

codecov bot commented Feb 28, 2023

Codecov Report

Merging #480 (dcca04f) into main (5ba4fc7) will increase coverage by 0.00%.
The diff coverage is 90.90%.

@@           Coverage Diff           @@
##             main     #480   +/-   ##
=======================================
  Coverage   72.45%   72.46%           
=======================================
  Files         109      109           
  Lines       10602    10607    +5     
  Branches     1093     1093           
=======================================
+ Hits         7682     7686    +4     
- Misses       2613     2614    +1     
  Partials      307      307           
Impacted Files Coverage Δ
...xscript/function_libs/torch_aten/graph_building.py 69.16% <87.50%> (+0.49%) ⬆️
...pt/function_libs/torch_aten/graph_building_test.py 88.67% <100.00%> (-0.42%) ⬇️

@BowenBao BowenBao marked this pull request as ready for review February 28, 2023 18:19
@BowenBao BowenBao requested a review from titaiwangms February 28, 2023 18:19
@titaiwangms (Contributor)

I think torch-nightly is broken by #460

@BowenBao (Contributor, Author)

@titaiwangms thanks; it's hard to figure out which failures are the known ones...

Does the CI here depend on code from the torch exporter? This API change requires modifications on both sides, so I was planning to merge this first, hoping CI could be all green.

@titaiwangms (Contributor)

> @titaiwangms thanks; it's hard to figure out which failures are the known ones...
>
> Does the CI here depend on code from the torch exporter? This API change requires modifications on both sides, so I was planning to merge this first, hoping CI could be all green.

I don't think onnx-script relies on the torch exporter, but which part of this PR needs a torch exporter modification? As far as I know, the experimental branch is broken due to lack of maintenance of the ort-experimental branch, and torch-nightly fails because of the squeeze op. There are two others erroring with ORT that I don't quite understand, but this PR seems to be unrelated...

@BowenBao (Contributor, Author) commented Mar 1, 2023

The API changed: `initializers` is no longer an argument to `TorchScriptGraph.to_model_proto`, and the PT exporter needs to adjust for it, which is done in pytorch/pytorch#95676. It does seem that the PT exporter is not involved in any of the CI tests here.
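The shape of the refactor can be sketched with a minimal stand-in class. Everything here is illustrative: the class name, `register_initializer`, and the dict-based "model proto" are assumptions for the sketch, not the actual onnx-script API. The point is only that initializers now live on the graph object instead of being passed into `to_model_proto`.

```python
from typing import Dict, Mapping


class TorchScriptGraphSketch:
    """Illustrative stand-in: initializers live on the graph itself."""

    def __init__(self) -> None:
        # The real class stores torch.Tensor values; bytes keep this self-contained.
        self._initializers: Dict[str, bytes] = {}

    @property
    def initializers(self) -> Mapping[str, bytes]:
        return self._initializers

    def register_initializer(self, name: str, value: bytes) -> None:
        self._initializers[name] = value

    def to_model_proto(self) -> dict:
        # Before the change, callers passed `initializers` as an argument here;
        # now the graph already knows them and no argument is needed.
        return {"initializers": dict(self._initializers)}


graph = TorchScriptGraphSketch()
graph.register_initializer("weight", b"\x00\x01")
model = graph.to_model_proto()
```

With this shape, a caller such as the PT exporter no longer has to thread an `initializers` mapping through to the export call, which is the friction the PR description refers to.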

@BowenBao BowenBao merged commit e66ca88 into microsoft:main Mar 1, 2023
@justinchuby (Collaborator)

Fixes #389


@property
def torch_graph(self):
return self._torch_graph

@property
def initializers(self) -> Mapping[str, torch.Tensor]:
Review comment from a Collaborator:

Would a brief docstring for this property help?
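A docstring along these lines would address the suggestion. The wording and the stub class are illustrative, not the merged code; the stub only exists to make the property runnable in isolation.

```python
from typing import Dict, Mapping


class GraphStub:
    """Stub standing in for TorchScriptGraph, just to carry the property."""

    def __init__(self) -> None:
        # Placeholder values; the real class holds torch.Tensor objects.
        self._initializers: Dict[str, float] = {"bias": 0.5}

    @property
    def initializers(self) -> Mapping[str, float]:
        """Mapping from initializer name to its tensor value.

        Initializers are the parameters/buffers that get embedded into
        the ModelProto produced by ``to_model_proto``.
        """
        return self._initializers
```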

@@ -17,23 +17,22 @@
@unittest.skipIf(version_utils.torch_older_than("2.0"), "torchscript in 1.13 not supported")
class TestTorchScriptTracingEvaluator(unittest.TestCase):
def setUp(self):
# FIXME: Currently this must match with the import line
# `from onnxscript import opset17 as op`, which restricts opset to be 17 in these
Review comment from a Collaborator:

The model opset can actually be different from the local functions' opsets I think. Do you know if that's true?

Reply from the author (Contributor):

Yeah, it can be different, under a constraint:
https://github.com/onnx/onnx/blob/4b2d50334914621835cc1e8dadd4fe82b6b9876c/onnx/onnx.in.proto#L824-L828

  // The operator sets imported by FunctionProto should be compatible with the ones
  // imported by ModelProto. Example, if same operator set say 'A' is imported by FunctionProto
  // and ModelProto then versions for the operator set may be different but,
  // the operator schema returned for op_type, domain, version combination
  // for both the versions should be same.
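The rule quoted above can be illustrated with a small helper. This is a simplified sketch, not ONNX library code: the real rule permits a shared domain to appear at different versions only when the op schemas resolved at each version agree, and checking that requires an op schema registry. The sketch below deliberately checks only the weaker condition that every domain the function imports is also imported by the model.

```python
from typing import Mapping


def opset_imports_compatible(
    model_opsets: Mapping[str, int],
    function_opsets: Mapping[str, int],
) -> bool:
    """Simplified sketch of the FunctionProto/ModelProto opset rule.

    Different versions of the same domain are allowed (subject to the
    schema-equality condition in the spec, which this sketch does not
    verify); a domain used by the function but absent from the model's
    imports is flagged as incompatible.
    """
    return all(domain in model_opsets for domain in function_opsets)


# Same default ('' / ai.onnx) domain at different versions: allowed.
ok = opset_imports_compatible({"": 18}, {"": 17})
# A function-only domain would need to be added to the model's imports.
bad = opset_imports_compatible({"": 18}, {"com.example": 1})
```

This is why the FIXME in `setUp` matters less than it first appears: the model's opset import and a local function's opset import need not be the identical version, as long as both resolve to the same schema.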
