Remove legacy optimizer #2180
Conversation
Copilot reviewed 7 out of 7 changed files in this pull request and generated no comments.
Comments suppressed due to low confidence (2)
onnxscript/optimizer/__init__.py:41
- The conversion logic for onnx.ModelProto in the optimize function assumes in-place modification via Clear() and CopyFrom(), which may lead to unintended side effects. Consider reviewing this approach to ensure that modifying the input model in place is safe in all use cases.
assert isinstance(model, onnx.ModelProto)
onnxscript/optimizer/__init__.py:62
- Clearing and subsequently copying back to model_proto may affect external references to the input model. It is recommended to verify that this in-place update is acceptable or switch to a functional model conversion approach.
model_proto.Clear()
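The side effect the reviewer flags can be shown with a toy sketch. This is not the onnx protobuf API, just a minimal stand-in that mimics how `Clear()`/`CopyFrom()` mutate the caller's object, so any external reference observes the change, while a functional alternative leaves the input untouched:

```python
class FakeProto:
    """Toy stand-in for a protobuf message (illustrative only)."""

    def __init__(self, graph=None):
        self.graph = graph

    def Clear(self):
        self.graph = None

    def CopyFrom(self, other):
        self.graph = other.graph


def optimize_in_place(model_proto):
    # Simulates the flagged pattern: optimize, then write the result
    # back into the same proto object via Clear()/CopyFrom().
    optimized = FakeProto(graph="optimized")
    model_proto.Clear()
    model_proto.CopyFrom(optimized)
    return model_proto


def optimize_functional(model_proto):
    # Functional alternative: return a new object; the input is untouched.
    return FakeProto(graph="optimized")


original = FakeProto(graph="original")
alias = original                  # an external reference held elsewhere
optimize_in_place(original)
print(alias.graph)                # the alias observes the mutation: "optimized"

original2 = FakeProto(graph="original")
alias2 = original2
result = optimize_functional(original2)
print(alias2.graph)               # untouched: "original"
```

Whether the in-place behavior is a bug or a feature depends on whether callers expect their input proto to be updated; the functional version is the safer default when that contract is unclear.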
❌ 18 Tests Failed.
Copilot reviewed 12 out of 12 changed files in this pull request and generated no comments.
Comments suppressed due to low confidence (3)
onnxscript/optimizer/_legacy/_optimizer.py:1
- Ensure that removal of these legacy optimizer modules is fully covered by tests for the new IR-based optimizer to prevent any functionality regressions.
# Removed legacy optimizer implementation
onnxscript/optimizer/_function_folding_test.py:47
- [nitpick] Verify that all test assertions have been updated consistently to use the new IR graph interface (e.g. treating 'optimized.graph' as a list of nodes) so that test coverage remains complete.
self.assertEqual(len(optimized.graph), 2)
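The assertion above relies on the IR graph behaving like a sequence of nodes. A minimal sketch of that interface (not the actual `onnxscript.ir.Graph` implementation) shows why `len(optimized.graph)` works:

```python
class Node:
    """Toy IR node holding only an operator type (illustrative only)."""

    def __init__(self, op_type):
        self.op_type = op_type


class Graph:
    """Toy graph that behaves like a sequence of nodes."""

    def __init__(self, nodes):
        self._nodes = list(nodes)

    def __len__(self):
        return len(self._nodes)

    def __iter__(self):
        return iter(self._nodes)


g = Graph([Node("Add"), Node("Mul")])
print(len(g))                      # 2
print([n.op_type for n in g])      # ['Add', 'Mul']
```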
onnxscript/optimizer/__init__.py:62
- [nitpick] Double-check that converting from an ir.Model back to an onnx.ModelProto preserves all necessary metadata, as missing metadata could lead to subtle runtime issues.
def optimize(model: _ModelProtoOrIr, num_iterations: int = 2, *, onnx_shape_inference: bool = True, ...
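One way to read this signature: `optimize` dispatches on the input type so that a `ModelProto` in yields a `ModelProto` out, and an `ir.Model` in yields an `ir.Model` out. The sketch below uses toy stand-in classes and hypothetical helper names (`deserialize`, `serialize`, `_optimize_ir`); it illustrates the dispatch shape, not the actual onnxscript code:

```python
from typing import Union


class IrModel:
    """Stand-in for onnxscript ir.Model (illustrative only)."""


class ModelProto:
    """Stand-in for onnx.ModelProto (illustrative only)."""


def deserialize(proto: ModelProto) -> IrModel:
    return IrModel()          # toy proto -> IR conversion


def serialize(ir_model: IrModel) -> ModelProto:
    return ModelProto()       # toy IR -> proto conversion


def _optimize_ir(ir_model: IrModel, num_iterations: int) -> IrModel:
    return ir_model           # toy no-op optimization


def optimize(model: Union[ModelProto, IrModel], num_iterations: int = 2):
    # Dispatch on input type so the return type matches the argument type,
    # converting functionally rather than mutating the caller's proto.
    if isinstance(model, ModelProto):
        ir_model = deserialize(model)
        return serialize(_optimize_ir(ir_model, num_iterations))
    return _optimize_ir(model, num_iterations)


print(type(optimize(ModelProto())).__name__)   # ModelProto
print(type(optimize(IrModel())).__name__)      # IrModel
```

The metadata concern in the comment applies to the real `serialize`/`deserialize` round-trip: any field the IR does not model would be dropped on the way back to a proto.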
Copilot reviewed 12 out of 12 changed files in this pull request and generated no comments.
Comments suppressed due to low confidence (2)
onnxscript/optimizer/__init__.py:54
- The docstring indicates that 'stop_if_no_change' has no effect yet the parameter remains in the API. Consider either fully supporting this parameter or clarifying its intended use to avoid confusion.
stop_if_no_change: Not supported currently (has no effect).
onnxscript/optimizer/_function_folding_test.py:67
- [nitpick] The use of a tuple as a key to access a function may be unclear to readers; consider defining a constant for this key or adding a clarifying comment to improve readability.
function = optimized.functions[("local", "fun1", "")]
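The tuple key follows ONNX's function identifier triple of (domain, name, overload). One way to address the readability nitpick, sketched here with a hypothetical `FunctionId` type (not part of the onnxscript API), is a `NamedTuple`, which compares and hashes equal to the plain tuple and therefore works as the same dictionary key:

```python
from typing import NamedTuple


class FunctionId(NamedTuple):
    """Names the (domain, name, overload) triple used as a function key."""

    domain: str
    name: str
    overload: str


FUN1_ID = FunctionId(domain="local", name="fun1", overload="")

# Toy registry keyed by the plain tuple, as in the test being reviewed.
functions = {("local", "fun1", ""): "fun1-body"}

# A NamedTuple instance equals the corresponding plain tuple, so the
# lookup succeeds without changing the registry's key type.
print(functions[FUN1_ID])          # fun1-body
```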
Approve to unblock @justinchuby. @gramalingam feel free to follow up on this merged PR when you have time.
Adds an `inline=True` option in `optimize()` to control whether function inlining is done when optimizing. Fix #2185
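A flag like this typically gates one pass in a pipeline. The toy sketch below illustrates that pattern with made-up pass names; it is not the onnxscript optimizer internals:

```python
# Toy passes operating on a list that records which passes ran.
def inline_functions(model):
    return model + ["inlined"]


def fold_constants(model):
    return model + ["folded"]


def remove_unused(model):
    return model + ["pruned"]


def run_passes(model, *, inline=True):
    # The `inline` keyword switches the inlining pass on or off,
    # analogous to the new option described above.
    passes = [fold_constants, remove_unused]
    if inline:
        passes.insert(0, inline_functions)
    for p in passes:
        model = p(model)
    return model


print(run_passes([], inline=True))    # ['inlined', 'folded', 'pruned']
print(run_passes([], inline=False))   # ['folded', 'pruned']
```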