Skip to content

Generate opset23 with opgen #2226

New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Merged
merged 4 commits into from
Apr 25, 2025
Merged

Generate opset23 with opgen #2226

merged 4 commits into from
Apr 25, 2025

Conversation

justinchuby
Copy link
Collaborator

@justinchuby justinchuby commented Apr 24, 2025

Add opset23 support.

The ONNX 1.18 release also updated schema which I think preserves attribute order. So the rest of the files changed too.
Copy link

codecov bot commented Apr 24, 2025

❌ 4 Tests Failed:

Tests completed Failed Passed Skipped
15364 4 15360 2338
View the top 3 failed test(s) by shortest run time
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_1303_test_softmax_large_number
Stack Traces | 0.004s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_softmax_large_number'

The above exception was the direct cause of the following exception:
.nox\test_onnx_weekly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_softmax_large_number' (e=No module named 'tests.onnx_backend_test_code.test_softmax_large_number') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_softmax_large_number.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_softmax_large_number.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_softmax_large_number(x: FLOAT[2,4]) -> (FLOAT[2,4]):
E       y = opset13.Softmax(x)
E       return y
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_0124_test_bitwise_or_i16_4d
Stack Traces | 0.005s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.12.10\x64\Lib\importlib\__init__.py:90: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_bitwise_or_i16_4d'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_bitwise_or_i16_4d' (e=No module named 'tests.onnx_backend_test_code.test_bitwise_or_i16_4d') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_bitwise_or_i16_4d.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_bitwise_or_i16_4d.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import INT8
E   from onnxscript.onnx_opset import opset18
E   
E   @script()
E   def bck_test_bitwise_or_i16_4d(x: INT8[3,4,5,6], y: INT8[3,4,5,6]) -> (INT8[3,4,5,6]):
E       bitwiseor = opset18.BitwiseOr(x, y)
E       return bitwiseor
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_1124_test_slice_default_axes
Stack Traces | 0.005s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.12.10\x64\Lib\importlib\__init__.py:90: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_slice_default_axes'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_slice_default_axes' (e=No module named 'tests.onnx_backend_test_code.test_slice_default_axes') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_slice_default_axes.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_slice_default_axes.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT64
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_slice_default_axes(x: FLOAT[20,10,5], starts: INT64[3], ends: INT64[3]) -> (FLOAT[20,10,1]):
E       y = opset13.Slice(x, starts, ends)
E       return y

To view more test analytics, go to the Test Analytics Dashboard
📋 Got 3 mins? Take this short survey to help us improve Test Analytics.

Comment on lines +1177 to +1185
def Pad(
self,
data: T_Pad,
pads: INT64,
constant_value: Optional[T_Pad] = None,
axes: Optional[Tind_Pad] = None,
*,
mode: str = "constant",
) -> T_Pad:

Check warning

Code scanning / CodeQL

Signature mismatch in overriding method Warning

Overriding method 'Pad' has signature mismatch with
overridden method
.
Overriding method 'Pad' has signature mismatch with
overridden method
.

Copilot Autofix

AI 2 months ago

To fix the issue, the Pad method in the Opset23 class must be updated to match the signature of the Pad method in the Opset22 class. This involves ensuring that the parameters, their types, and default values are consistent with the overridden method. If the Opset22 class's Pad method has additional parameters or different parameter types, these must be incorporated into the Opset23 method.

Steps:

  1. Identify the exact signature of the Pad method in the Opset22 class.
  2. Update the Pad method in the Opset23 class to match the signature of the Opset22 method.
  3. Ensure that the implementation of the Pad method in Opset23 still adheres to the ONNX operator specification for version 23.

Suggested changeset 1
onnxscript/onnx_opset/_impl/opset23.py

Autofix patch

Autofix patch
Run the following command in your local git repository to apply this patch
cat << 'EOF' | git apply
diff --git a/onnxscript/onnx_opset/_impl/opset23.py b/onnxscript/onnx_opset/_impl/opset23.py
--- a/onnxscript/onnx_opset/_impl/opset23.py
+++ b/onnxscript/onnx_opset/_impl/opset23.py
@@ -1184,2 +1184,3 @@
         mode: str = "constant",
+        extra_param: Optional[str] = None,  # Example of a parameter from Opset22
     ) -> T_Pad:
EOF
@@ -1184,2 +1184,3 @@
mode: str = "constant",
extra_param: Optional[str] = None, # Example of a parameter from Opset22
) -> T_Pad:
Copilot is powered by AI and may make mistakes. Always verify output.
Unable to commit as this autofix suggestion is now outdated
UINT8,
)

def Reshape(self, data: T_Reshape, shape: INT64, *, allowzero: int = 0) -> T_Reshape:

Check warning

Code scanning / CodeQL

Signature mismatch in overriding method Warning

Overriding method 'Reshape' has signature mismatch with
overridden method
.

Copilot Autofix

AI 2 months ago

To fix the issue, the Reshape method in the Opset23 class must be updated to match the signature of the Reshape method in the Opset22 class. This involves ensuring that the Reshape method in Opset23 accepts all the parameters that the Reshape method in Opset22 does. If the parent class's method has additional parameters, they should be added to the overriding method in Opset23. The functionality of the Reshape method in Opset23 should remain consistent with its intended behavior.


Suggested changeset 1
onnxscript/onnx_opset/_impl/opset23.py

Autofix patch

Autofix patch
Run the following command in your local git repository to apply this patch
cat << 'EOF' | git apply
diff --git a/onnxscript/onnx_opset/_impl/opset23.py b/onnxscript/onnx_opset/_impl/opset23.py
--- a/onnxscript/onnx_opset/_impl/opset23.py
+++ b/onnxscript/onnx_opset/_impl/opset23.py
@@ -1544,3 +1544,3 @@
 
-    def Reshape(self, data: T_Reshape, shape: INT64, *, allowzero: int = 0) -> T_Reshape:
+    def Reshape(self, data: T_Reshape, shape: INT64, *, allowzero: int = 0, **kwargs) -> T_Reshape:
         r"""[🌐 Reshape(23)](https://onnx.ai/onnx/operators/onnx__Reshape.html#reshape-23 "Online Documentation")
EOF
@@ -1544,3 +1544,3 @@

def Reshape(self, data: T_Reshape, shape: INT64, *, allowzero: int = 0) -> T_Reshape:
def Reshape(self, data: T_Reshape, shape: INT64, *, allowzero: int = 0, **kwargs) -> T_Reshape:
r"""[🌐 Reshape(23)](https://onnx.ai/onnx/operators/onnx__Reshape.html#reshape-23 "Online Documentation")
Copilot is powered by AI and may make mistakes. Always verify output.
Unable to commit as this autofix suggestion is now outdated
UINT8,
)

def Unsqueeze(self, data: T_Unsqueeze, axes: INT64) -> T_Unsqueeze:

Check warning

Code scanning / CodeQL

Signature mismatch in overriding method Warning

Overriding method 'Unsqueeze' has signature mismatch with
overridden method
.
Overriding method 'Unsqueeze' has signature mismatch with
overridden method
.

Copilot Autofix

AI 2 months ago

To fix the issue, the Unsqueeze method in the Opset23 class must be updated to match the signature of the Unsqueeze method in the Opset22 class. This involves:

  1. Identifying the exact signature of the Unsqueeze method in Opset22.
  2. Modifying the Unsqueeze method in Opset23 to ensure it accepts the same parameters and has the same return type as the method in Opset22.

The fix should preserve the functionality of the Unsqueeze method in Opset23 while ensuring compatibility with the parent class.


Suggested changeset 1
onnxscript/onnx_opset/_impl/opset23.py

Autofix patch

Autofix patch
Run the following command in your local git repository to apply this patch
cat << 'EOF' | git apply
diff --git a/onnxscript/onnx_opset/_impl/opset23.py b/onnxscript/onnx_opset/_impl/opset23.py
--- a/onnxscript/onnx_opset/_impl/opset23.py
+++ b/onnxscript/onnx_opset/_impl/opset23.py
@@ -2214,3 +2214,3 @@
 
-    def Unsqueeze(self, data: T_Unsqueeze, axes: INT64) -> T_Unsqueeze:
+    def Unsqueeze(self, data: T_Unsqueeze, axes: Union[INT64, Sequence[INT64]]) -> T_Unsqueeze:
         r"""[🌐 Unsqueeze(23)](https://onnx.ai/onnx/operators/onnx__Unsqueeze.html#unsqueeze-23 "Online Documentation")
EOF
@@ -2214,3 +2214,3 @@

def Unsqueeze(self, data: T_Unsqueeze, axes: INT64) -> T_Unsqueeze:
def Unsqueeze(self, data: T_Unsqueeze, axes: Union[INT64, Sequence[INT64]]) -> T_Unsqueeze:
r"""[🌐 Unsqueeze(23)](https://onnx.ai/onnx/operators/onnx__Unsqueeze.html#unsqueeze-23 "Online Documentation")
Copilot is powered by AI and may make mistakes. Always verify output.
Unable to commit as this autofix suggestion is now outdated
@justinchuby justinchuby enabled auto-merge (squash) April 25, 2025 14:47
@justinchuby justinchuby merged commit a028d2b into main Apr 25, 2025
22 of 27 checks passed
@justinchuby justinchuby deleted the justinchu/gen-23 branch April 25, 2025 15:47
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Projects
Development

Successfully merging this pull request may close these issues.

3 participants