
[Rewriter]: fuse successive Relu/Clip nodes #2410


Merged — 9 commits, Jul 1, 2025

Conversation

AyoubMDL
Contributor

This PR adds the following transformation:

  • Relu(Relu(X)) -> Relu
  • Relu(Clip(X)) -> Clip
  • Clip(Relu(X)) -> Clip
  • Clip(Clip(X)) -> Clip
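All four rewrites follow from the fact that composing Relu and Clip is equivalent to a single Clip with tightened bounds (assuming the clip intervals overlap). A small NumPy sketch, illustrating the math rather than the actual rewriter code:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0)

x = np.linspace(-5, 5, 11)

# Relu(Relu(X)) == Relu(X): Relu is idempotent.
assert np.array_equal(relu(relu(x)), relu(x))

# Relu(Clip(X, lo, hi)) == Clip(X, max(lo, 0), hi)
assert np.array_equal(relu(np.clip(x, -2, 3)), np.clip(x, max(-2, 0), 3))

# Clip(Relu(X), lo, hi) == Clip(X, max(lo, 0), hi)
assert np.array_equal(np.clip(relu(x), -2, 3), np.clip(x, max(-2, 0), 3))

# Clip(Clip(X, lo1, hi1), lo2, hi2) == Clip(X, max(lo1, lo2), min(hi1, hi2))
# (valid when the two intervals overlap)
assert np.array_equal(
    np.clip(np.clip(x, -2, 3), -1, 4),
    np.clip(x, max(-2, -1), min(3, 4)),
)
```

Note the caveat on the last identity: if the two Clip intervals are disjoint, combining bounds naively produces min > max, whose behavior differs between implementations.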


codecov bot commented Jun 22, 2025

Codecov Report

Attention: Patch coverage is 98.29545% with 3 lines in your changes missing coverage. Please review.

Project coverage is 69.53%. Comparing base (87d6f11) to head (10a43f9).
Report is 3 commits behind head on main.

Files with missing lines | Patch % | Lines
onnxscript/rewriter/fuse_relus_clips_test.py | 97.64% | 1 Missing and 1 partial ⚠️
onnxscript/rewriter/fuse_relus_clips.py | 98.87% | 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2410      +/-   ##
==========================================
+ Coverage   69.32%   69.53%   +0.20%     
==========================================
  Files         204      206       +2     
  Lines       25854    26034     +180     
  Branches     2696     2715      +19     
==========================================
+ Hits        17923    18102     +179     
+ Misses       7001     7000       -1     
- Partials      930      932       +2     


@justinchuby justinchuby requested a review from Copilot June 22, 2025 15:17
Contributor

@Copilot Copilot AI left a comment


Pull Request Overview

This PR adds graph rewrite rules to fuse consecutive Relu and Clip operations, updates the test harness to control ONNX Runtime’s optimization level, and provides unit tests to validate the new transformations.

  • Introduce four fusion rules (Relu(Relu), Relu(Clip), Clip(Relu), Clip(Clip)) in fuse_relus_clips.py
  • Extend assert_numerically_equal in testing.py to accept an ort_optimization_level argument
  • Add comprehensive tests in fuse_relus_clips_test.py to cover valid and invalid fusion scenarios

Reviewed Changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated no comments.

File | Description
onnxscript/rewriter/fuse_relus_clips.py | Implement new RewriteRule classes and assemble them into a set.
onnxscript/rewriter/testing.py | Update test helper to pass through ONNX Runtime optimization level.
onnxscript/rewriter/fuse_relus_clips_test.py | Add unit tests for each fusion pattern and edge-case validations.
Comments suppressed due to low confidence (2)

onnxscript/rewriter/fuse_relus_clips.py:161

  • The variable name fuse_sucessive_relu_clip_rule has a typo (sucessive vs. successive). Rename it to fuse_successive_relu_clip_rule for consistency with the other rules, and update any references.
fuse_sucessive_relu_clip_rule = FuseSuccessiveReluClip().rule()

onnxscript/rewriter/testing.py:27

  • [nitpick] The Args: section in the docstring does not match the parameter order of the function signature. Consider reordering the entries so they follow (original_model_proto, rewritten_model_proto, args, ort_optimization_level, rtol, atol).
        ort_optimization_level: Onnxruntime optimization level.

Collaborator

@justinchuby justinchuby left a comment


Thank you - I think this can be part of the default rewrite rules

cc @gramalingam

Contributor

@titaiwangms titaiwangms left a comment


Please fix optional lint (it's mainly spelling).

AyoubMDL added 3 commits June 28, 2025 11:34
- Relu(Relu(X)) -> Relu
- Relu(Clip(X)) -> Clip
- Clip(Relu(X)) -> Clip
- Clip(Clip(X)) -> Clip
@AyoubMDL AyoubMDL force-pushed the fuse-sucessive-relus-clips branch from e0f6332 to b05b6c0 on June 29, 2025 10:24
@AyoubMDL AyoubMDL force-pushed the fuse-sucessive-relus-clips branch from b05b6c0 to e30950c on June 30, 2025 15:59
@titaiwangms
Contributor

@justinchuby Is the error "AttributeError: module 'onnx_ir.convenience' has no attribute 'get_const_tensor'" suggesting a newer version of onnx-ir?

@justinchuby
Collaborator

Try merging from main?

@justinchuby
Collaborator

@titaiwangms turns out we also want to update the nox file. It needs to stay pinned because we want to test with the lowest supported version of onnx-ir.

@titaiwangms
Contributor

@AyoubMDL Looks like there is a case failing.

@justinchuby
Collaborator

justinchuby commented Jul 1, 2025

Strange - could you help ensure the tensor min_clip has a proper ir.Shape defined? If not, there may be a bug in ir.tensor() or in the logic of this rewrite rule. @AyoubMDL

@AyoubMDL
Contributor Author

AyoubMDL commented Jul 1, 2025

I'll check

@AyoubMDL
Contributor Author

AyoubMDL commented Jul 1, 2025

Strange - could you help ensure the tensor min_clip has a proper ir.Shape defined? If not, there may be a bug in ir.tensor() or in the logic of this rewrite rule. @AyoubMDL

It was on my side. In compute_clip_min_max for FuseSuccessiveClip, I sometimes didn't return an ir.tensor. Added the fix and test cases in the last commit.
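For context, the bug class described above is easy to hit: a helper that combines the two Clip bounds can return a bare Python float or NumPy scalar on some paths and a tensor on others. A hypothetical sketch of such a bound-combining helper (the name compute_clip_min_max and the use of NumPy arrays as a stand-in for ir.tensor are illustrative assumptions, not the actual onnxscript code):

```python
import numpy as np

def compute_clip_min_max(min1, max1, min2, max2):
    """Combine Clip(Clip(x, min1, max1), min2, max2) into one bound pair.

    None stands for an absent bound (-inf / +inf). Assumes the two clip
    intervals overlap. The key point, mirroring the fix above: always
    return a single consistent type (here, NumPy arrays standing in for
    IR tensors) on every code path, never a bare float.
    """
    mins = [b for b in (min1, min2) if b is not None]
    maxs = [b for b in (max1, max2) if b is not None]
    lo = max(mins) if mins else None
    hi = min(maxs) if maxs else None

    def to_tensor(value):
        # Placeholder for wrapping the scalar in an IR tensor.
        return None if value is None else np.asarray(value, dtype=np.float32)

    return to_tensor(lo), to_tensor(hi)
```

For example, combining Clip bounds (-2, 3) and (-1, None) yields (-1, 3), both returned as arrays so the caller can wrap them uniformly.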

@github-project-automation github-project-automation bot moved this from Todo to Done in ONNX Script Review Board Jul 1, 2025
@justinchuby justinchuby enabled auto-merge (squash) July 1, 2025 18:18
@justinchuby justinchuby merged commit 34dc350 into microsoft:main Jul 1, 2025
26 of 32 checks passed

Collaborator


from __future__ import annotations would be helpful
