Add BiasGelu, Erfgelu and SkipLayerNormalization fusions #2222
Conversation
❌ 4 Tests Failed.
Pull Request Overview
This PR adds new fusion rules for BiasGelu and Erfgelu, and updates the SkipLayerNormalization fusion to support an additional output. Key changes include:
- Updating SkipLayerNormalization to return a new output (skip_sum) and adding a corresponding fusion rule for Add + SkipLayerNormalization.
- Implementing a new BiasGelu fusion along with its test (a sketch of such a rule follows this list).
- Updating the Erfgelu rewrite with two pattern functions for broader matching coverage.
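For concreteness, here is a minimal sketch of what such a BiasGelu rule can look like with onnxscript's pattern-rewriter API. The function names and the exact Gelu variant being matched are illustrative assumptions, not the actual contents of `bias_gelu.py`.

```python
# Hedged sketch of a BiasGelu fusion rule using onnxscript's pattern rewriter.
# Names are illustrative; the real rule lives in
# onnxscript/rewriter/ort_fusions/bias_gelu.py and may match the com.microsoft
# Gelu produced by an earlier fusion instead of the standard ONNX Gelu.
from onnxscript.rewriter import pattern


def _bias_gelu_pattern(op, x, bias):
    # Match: Gelu(Add(x, bias))
    return op.Gelu(op.Add(x, bias))


def _bias_gelu_replacement(op, x, bias):
    # Replace with the fused com.microsoft BiasGelu contrib op.
    return op.BiasGelu(x, bias, _domain="com.microsoft")


bias_gelu_rule = pattern.RewriteRule(_bias_gelu_pattern, _bias_gelu_replacement)
```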
Reviewed Changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated no comments.
Summary per file:

File | Description |
---|---|
`onnxscript/rewriter/ort_fusions/skip_normalization.py` | Modified SkipLayerNormalization to return an extra output and added support for bias fusion. |
`onnxscript/rewriter/ort_fusions/bias_gelu_test.py` | Added a unit test to verify the BiasGelu fusion. |
`onnxscript/rewriter/ort_fusions/bias_gelu.py` | Implemented the BiasGelu fusion rule. |
`onnxscript/rewriter/ort_fusions/_core.py` | Updated the fusion count to include the BiasGelu fusion. |
`onnxscript/rewriter/erfgelu.py` | Introduced two Erfgelu pattern rules to enhance rewrite flexibility. |
`onnxscript/rewriter/__init__.py` | Updated to include the new Erfgelu fusion rule. |
Comments suppressed due to low confidence (1)
onnxscript/rewriter/erfgelu.py:9
- [nitpick] The current names 'erf_gelu_pattern_1' and 'erf_gelu_pattern_2' are not very descriptive. Consider renaming them to indicate the specific matching or transformation behavior they implement.
`def erf_gelu_pattern_1(op, x):`
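Both patterns target subgraphs that compute the exact, erf-based GELU, gelu(x) = 0.5 · x · (1 + erf(x / √2)); presumably the two variants differ only in how the multiplications and constants are associated in the graph. A tiny standalone snippet (not part of the fusion code) to make the matched computation concrete:

```python
# The erf-based GELU that the erf_gelu patterns match:
#   gelu(x) = 0.5 * x * (1 + erf(x / sqrt(2)))
import math


def erf_gelu(x: float) -> float:
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))


for v in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print(f"gelu({v:+.1f}) = {erf_gelu(v):+.6f}")
```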
Just a minor suggestion: use onnxscript to build the test-case model proto.
Compare: 82a891f to d65da68
Add fusion rules to support the optimization of Whisper models. Fusions added:
- Basic Fusions:
  * additional pattern for erfgelu [moved to #2222]
- SkipLayerNorm:
  * #2259
  * Fusion patterns where skip_sum is also an output (see the sketch after this list)
  * Bias + SkipLayerNorm -> SkipLayerNorm (with bias) [moved to #2222]
- BiasGelu Fusion [moved to #2222]
- SDPA:
  * Support for pattern where only q is pre-scaled
- MHA:
  * Patterns with/without past/present keys/values
  * Patterns with non-rotary embeddings
  * Patterns with/without mask
  * Patterns with cross-attention (only for past key/value patterns)
- MHA Bias Fusion:
  * Bias was offloaded to Attention fusion previously; this fusion fixes that
- Attention:
  * Patterns where Q, K and V do not come from slicing

TODO:
- [x] Fix SDPA singular prescale case, due to lost shape information
- [x] Enable check conditions when #2210 is merged
- [x] Improve/rewrite the whisper model test case to be similar to that of smollm (for example)
- [x] Fix failing test cases to account for new patterns
- [x] Add isolated test cases for new fusions like BiasGelu, SkipLayerNorm, etc.
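Regarding the SkipLayerNorm patterns where skip_sum is also an output: ONNX Runtime's com.microsoft SkipLayerNormalization contrib op exposes the pre-normalization sum as an optional fourth output (input_skip_bias_sum), so the fusion can still apply when the Add result is consumed elsewhere in the graph. Below is a hedged sketch of such a rule, assuming the pattern rewriter's support for multi-output patterns and the `_outputs=` hint; all names are illustrative rather than the actual code in `skip_normalization.py`.

```python
# Illustrative sketch: fuse Add + LayerNormalization into SkipLayerNormalization
# while preserving the Add result ("skip_sum") as the contrib op's 4th output.
from onnxscript.rewriter import pattern


def _skip_layer_norm_pattern(op, x, skip, gamma, beta):
    skip_sum = op.Add(x, skip)
    normalized = op.LayerNormalization(skip_sum, gamma, beta)
    # Both the normalized value and the raw sum are used downstream.
    return normalized, skip_sum


def _skip_layer_norm_replacement(op, x, skip, gamma, beta):
    normalized, _mean, _inv_std_var, skip_sum = op.SkipLayerNormalization(
        x, skip, gamma, beta, _domain="com.microsoft", _outputs=4
    )
    return normalized, skip_sum


skip_layer_norm_rule = pattern.RewriteRule(
    _skip_layer_norm_pattern, _skip_layer_norm_replacement
)
```

The real rules also apply check conditions (e.g. validating the normalization axis and dtypes) before rewriting; those are omitted here.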
This pull request introduces new fusion patterns and enhancements to the ONNXScript rewriter module, focusing on optimization and test-coverage improvements. The key changes include adding support for `BiasGelu` and additional `ErfGelu` patterns, extending `SkipLayerNormalization` to handle bias addition, and updating test utilities for better accuracy validation.

New fusion patterns:
- BiasGelu Fusion: Added a new fusion pattern for `BiasGelu` operations, including its implementation in `onnxscript/rewriter/ort_fusions/bias_gelu.py` and integration into the `fuse_xformers` pipeline (see the usage sketch after this list). A corresponding unit test was added to validate the functionality. [1] [2] [3] [4]
- ErfGelu Enhancements: Introduced a second pattern for `ErfGelu` fusion and refactored the corresponding implementation. The file was renamed from `erfgelu.py` to `ort_fusions/erfgelu.py` for consistency. [1] [2] [3] [4]
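As a hedged usage sketch: individual rules like these are bundled by `fuse_xformers` in `ort_fusions/_core.py`, but they can also be applied through the generic rewriter entry point. The snippet assumes the `onnxscript.rewriter.rewrite` API, an illustrative model path, and reuses the `bias_gelu_rule` object from the earlier sketch.

```python
# Hedged usage sketch: applying pattern-rewrite rules to a model. The file paths
# and the bias_gelu_rule object (from the earlier sketch) are placeholders.
import onnx

from onnxscript import rewriter

model = onnx.load("model.onnx")
rewritten = rewriter.rewrite(model, pattern_rewrite_rules=[bias_gelu_rule])
onnx.save(rewritten, "model_fused.onnx")
```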
Enhancements to existing fusions:
- Extended the `SkipLayerNormalization` fusion to support an additional bias term (a sketch of this folding follows below). This includes new patterns and rewrite rules in `onnxscript/rewriter/ort_fusions/skip_normalization.py`.
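Below is a hedged sketch of that bias folding: an Add applying a bias to the input of a SkipLayerNormalization is absorbed into the contrib op's optional fifth input. Whether the actual rule matches the Add on the input or the skip side, and which outputs it preserves, may differ; names are illustrative.

```python
# Illustrative sketch: Bias + SkipLayerNorm -> SkipLayerNorm (with bias).
from onnxscript.rewriter import pattern


def _bias_skip_layer_norm_pattern(op, x, bias, skip, gamma, beta):
    # Match: SkipLayerNormalization(Add(x, bias), skip, gamma, beta)
    return op.SkipLayerNormalization(
        op.Add(x, bias), skip, gamma, beta, _domain="com.microsoft"
    )


def _bias_skip_layer_norm_replacement(op, x, bias, skip, gamma, beta):
    # The com.microsoft contrib op accepts the bias as its optional 5th input.
    return op.SkipLayerNormalization(
        x, skip, gamma, beta, bias, _domain="com.microsoft"
    )


bias_skip_layer_norm_rule = pattern.RewriteRule(
    _bias_skip_layer_norm_pattern, _bias_skip_layer_norm_replacement
)
```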
Test utility updates:
- Updated the `assert_allclose` tolerance to `1e-3` for better handling of numerical discrepancies in tests (a minimal sketch of such a check follows).
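The test helper presumably wraps a numpy allclose-style comparison; the sketch below shows a check at that tolerance (the real `assert_allclose` utility in the test code may differ in signature and reporting).

```python
# Minimal sketch of an allclose-style check at the relaxed 1e-3 tolerance.
import numpy as np


def assert_allclose(outputs, expected, rtol=1e-3, atol=1e-3):
    for i, (actual, reference) in enumerate(zip(outputs, expected)):
        np.testing.assert_allclose(
            actual, reference, rtol=rtol, atol=atol,
            err_msg=f"output {i} differs beyond tolerance",
        )
```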