[Rewriter] Add optimizer to fold Pad operators into Conv #2363
base: main
Conversation
Force-pushed from 4b9b69b to 19b0418
Codecov Report
Attention: Patch coverage is

@@ Coverage Diff @@
## main #2363 +/- ##
==========================================
+ Coverage 69.55% 69.83% +0.28%
==========================================
Files 210 212 +2
Lines 26120 26408 +288
Branches 2721 2764 +43
==========================================
+ Hits 18168 18443 +275
- Misses 7017 7024 +7
- Partials 935 941 +6

View full report in Codecov by Sentry.
Force-pushed from 19b0418 to 1604446
Force-push: rebasing on main and fixing conflicts.
@justinchuby I forgot to explain in the previous message that the changes were not ready yet (I was just fixing the rebase with main).
In the last commits I updated the code with all the suggestions.
This work is now ready to be reviewed.
Could you update the PR title and description? Thanks
@justinchuby are the new description and title ok?
Thanks - maybe @gramalingam or @titaiwangms for another review?
Please run "lintrunner -a"
return check_result

class NormalizePadFormatConv(_NormalizePadFormatBase):
I suppose this does not fall back when the rewritten conv still does not match the FusePad rules? Do we still benefit from converting the auto_pad attribute into 'NOTSET' if it's still not a match?
You are right: NormalizePadFormat is always applied, even if FusePad is not.
On the other hand, I consider that having a single pad format is coherent, since it makes the model easier to interpret by different accelerators (e.g. a bug in onnxruntime due to the auto_pad type).
Let me know if you still disagree. What do you think @justinchuby?
cc @gramalingam
Last commit applies @titaiwangms's suggestions.
Pull Request Overview
Adds rewrite rules that fold preceding Pad operators into Conv/ConvInteger nodes and normalize Conv auto_pad attributes to explicit pads.
- Define fuse_pad_into_conv and fuse_pad_into_conv_integer rules to merge Pad into Conv/ConvInteger
- Introduce normalize_pad_format_conv / normalize_pad_format_conv_integer to convert auto_pad into explicit pads
- Add and register comprehensive unit tests for fusion and normalization, and update __init__.py to expose the new rules
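The fusion step described above boils down to simple pad arithmetic. A minimal sketch, assuming the ONNX pads layouts (Pad uses [x1_begin, x2_begin, ..., x1_end, x2_end, ...] over all input axes; Conv pads cover spatial axes only); the function name and the spatial_start parameter are illustrative, not the PR's actual API:

```python
# Hedged sketch: fold a Pad node's pads into a Conv node's pads.
# Assumes Pad pads cover all axes and Conv pads cover spatial axes only.
def fold_pad_into_conv_pads(pad_pads, conv_pads, x_rank, spatial_start=2):
    n_spatial = x_rank - spatial_start
    begins = pad_pads[:x_rank]
    ends = pad_pads[x_rank:]
    # Fusion is only valid if batch/channel axes are not padded.
    for axis in range(spatial_start):
        if begins[axis] != 0 or ends[axis] != 0:
            raise ValueError("cannot fold: a non-spatial axis is padded")
    fused = list(conv_pads)
    for i in range(n_spatial):
        fused[i] += begins[spatial_start + i]            # begin pads
        fused[n_spatial + i] += ends[spatial_start + i]  # end pads
    return fused

# NCHW input (rank 4): Pad adds 1 on each side of H and W,
# Conv already pads W by 2 on each side.
print(fold_pad_into_conv_pads([0, 0, 1, 1, 0, 0, 1, 1], [0, 2, 0, 2], 4))
# → [1, 3, 1, 3]
```

The guard on non-spatial axes mirrors the rule's match conditions: a Pad that touches batch or channel dimensions cannot be expressed as Conv padding.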
Reviewed Changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 2 comments.
File | Description
---|---
onnxscript/rewriter/fuse_pad_into_conv.py | New rewrite rules for fusing Pad into Conv/ConvInteger and normalizing pad formats
onnxscript/rewriter/fuse_pad_into_conv_test.py | Unit tests for both fusion and normalization behaviors
onnxscript/rewriter/__init__.py | Imported and included the new rules in the global rule set
Comments suppressed due to low confidence (2)
onnxscript/rewriter/fuse_pad_into_conv.py:202
- [nitpick] The error message in NotImplementedError has a grammar issue; consider updating it to 'Subclasses must implement this function' for clarity.
raise NotImplementedError("Child have to implement this function")
onnxscript/rewriter/fuse_pad_into_conv_test.py:405
- There are tests for NormalizePadFormatConv but none for NormalizePadFormatConvInteger; consider adding similar tests to cover the ConvInteger normalization rule.
if __name__ == "__main__":
Convert 'auto_pad' attribute into a list of explicit pads.
Fix silent bugs
Force-pushed from 0b166c1 to 7f7d17e
Last force-push: rebasing on main and adding @justinchuby's suggestions.
def check(self, context, x: ir.Value, pad: ir.Value, conv: ir.Value) -> orp.MatchResult:
    check_result = super().check(context, x, pad, conv)
    if check_result.reason:
nit: I suggest if not check_result: — a MatchResult can be treated as a boolean, with True meaning success and False meaning failure.
conv_node = conv.producer()
if (
    apad := conv_node.attributes.get("auto_pad", None)
) and apad.as_string() != "NOTSET":
minor nit: I think this condition can be simplified to conv_node.attributes.get_string("auto_pad", "NOTSET") != "NOTSET".
# Retrieve the padding and axes
x_rank = len(x.shape)
pad_pads = pad_node.inputs[1].const_value.numpy().tolist()
The above line needs to handle various special conditions and error situations. I suggest using onnxscript.rewriter._ir_utils.get_numpy_value(pad_node.inputs[1]), which handles the cases where the input is None or is not a constant. Then check that the value is not None before using it.
Oh, I see, the check is done in the check method below.
axes_list = list(range(x_rank))

# Pad constraints: values
pads_list = fill_pads_with_axes(pads.const_value.numpy(), axes_list, x_rank)
If this is the same as the pad_pads used in the rewrite method, you can save it as self._pads_list and use it in the rewrite method to avoid redoing the entire computation. It involves implicitly passed state, but works fine.
Following #2301, the fuse_pad_into_conv rule set is introduced to reduce the following list of operators:

Additionally, NormalizePadFormat is introduced in order to change the auto_pad Conv attribute into its explicit pads list (ref: https://onnx.ai/onnx/operators/onnx__Conv.html).