
[torchlib] Fix layer norm dtype #2100


Merged
justinchuby merged 2 commits into main from justinchu/layer-norm on Mar 12, 2025

Conversation

justinchuby
Collaborator

Fix layer norm dtype mismatch errors

Fixes #2099

justinchuby added the module: torchlib label (Related to the torch/aten function lib in development) on Mar 12, 2025
Copilot AI (Contributor) left a comment

Pull Request Overview

This PR addresses a layer normalization datatype mismatch error by updating how constants are created for weight initialization and removing an unnecessary helper function.

  • Updated weight initialization to ensure that the constant matches the input tensor’s dtype.
  • Removed the bias initialization block and the auxiliary _aten_layer_norm_onnx function, directly calling op.LayerNormalization (the dtype-matching pattern is sketched below).
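
For reference, here is a minimal sketch of the dtype-matching pattern described above, written in onnxscript style. The function name, the argument handling, and the specific use of op.CastLike, op.Expand, and op.Shape are assumptions for illustration and not the exact diff from this PR:

```python
from onnxscript.onnx_opset import opset18 as op


def layer_norm_sketch(input, normalized_shape, weight, bias, eps: float):
    # Normalize over the last len(normalized_shape) dimensions.
    axis = -len(normalized_shape)
    if weight is None:
        # Build the default scale as a scalar 1.0 and cast it to the input's
        # dtype so LayerNormalization's input and scale dtypes agree
        # (the original constant stayed float32 even for float16 inputs).
        one = op.CastLike(op.Constant(value_float=1.0), input)
        weight = op.Expand(one, op.Shape(input, start=axis))
    # ONNX LayerNormalization returns (Y, Mean, InvStdDev); bias is an
    # optional input and may be left as None here.
    result, mean, inv_std_dev = op.LayerNormalization(
        input, weight, bias, axis=axis, epsilon=eps
    )
    return result, mean, inv_std_dev
```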
Comments suppressed due to low confidence (1)

onnxscript/function_libs/torch_lib/ops/core.py:4755

  • The bias initialization block was removed, so if bias is None, op.LayerNormalization will receive None. Please verify that passing None for bias is acceptable or provide a suitable default (one possible guard is sketched below).
result, _, _ = op.LayerNormalization(input, weight, bias, axis=axis, epsilon=eps)
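
If passing None ever turned out to be a problem for a backend, one hypothetical guard (not part of this PR, reusing the op, input, weight, axis, and eps names from the sketch above) would be to synthesize a zero bias with the input's dtype and the scale's shape:

```python
if bias is None:
    # Hypothetical fallback: a zero bias shaped like the scale, cast to the
    # input's dtype, for backends that reject an omitted bias input.
    zero = op.CastLike(op.Constant(value_float=0.0), input)
    bias = op.Expand(zero, op.Shape(weight))
result, _, _ = op.LayerNormalization(input, weight, bias, axis=axis, epsilon=eps)
```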


codecov bot commented Mar 12, 2025

❌ 3 Tests Failed:

Tests completed: 11764 | Failed: 3 | Passed: 11761 | Skipped: 2406
View the top 3 failed test(s) by shortest run time
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_0963_test_resize_downsample_scales_linear_antialias
Stack Traces | 0.004s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_resize_downsample_scales_linear_antialias'

The above exception was the direct cause of the following exception:
.nox\test_torch_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_resize_downsample_scales_linear_antialias' (e=No module named 'tests.onnx_backend_test_code.test_resize_downsample_scales_linear_antialias') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_resize_downsample_scales_linear_antialias.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_resize_downsample_scales_linear_antialias.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset19
E   
E   @script()
E   def bck_test_resize_downsample_scales_linear_antialias(X: FLOAT[1,1,4,4], scales: FLOAT[4]) -> (FLOAT[1,1,2,2]):
E       Y = opset19.Resize(X, None, scales, antialias=1, mode='linear')
E       return Y
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_1241_test_tfidfvectorizer_tf_batch_onlybigrams_skip5
Stack Traces | 0.004s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_tfidfvectorizer_tf_batch_onlybigrams_skip5'

The above exception was the direct cause of the following exception:
.nox\test_torch_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_tfidfvectorizer_tf_batch_onlybigrams_skip5' (e=No module named 'tests.onnx_backend_test_code.test_tfidfvectorizer_tf_batch_onlybigrams_skip5') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_tfidfvectorizer_tf_batch_onlybigrams_skip5.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_tfidfvectorizer_tf_batch_onlybigrams_skip5.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT32
E   from onnxscript.onnx_opset import opset9
E   
E   @script()
E   def bck_test_tfidfvectorizer_tf_batch_onlybigrams_skip5(X: INT32[2,6]) -> (FLOAT[2,7]):
E       Y = opset9.TfIdfVectorizer(X, max_gram_length=2, max_skip_count=5, min_gram_length=2, mode='TF', ngram_counts=[0, 4], ngram_indexes=[0, 1, 2, 3, 4, 5, 6], pool_int64s=[2, 3, 5, 4, 5, 6, 7, 8, 6, 7])
E       return Y
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_0000_test_abs
Stack Traces | 0.005s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_abs'

The above exception was the direct cause of the following exception:
.nox\test_torch_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_abs' (e=No module named 'tests.onnx_backend_test_code.test_abs') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_abs.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_abs.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_abs(x: FLOAT[3,4,5]) -> (FLOAT[3,4,5]):
E       y = opset13.Abs(x)
E       return y

To view more test analytics, go to the Test Analytics Dashboard

justinchuby enabled auto-merge (squash) on March 12, 2025, 21:58
justinchuby merged commit 1da3b9c into main on Mar 12, 2025
20 of 29 checks passed
justinchuby deleted the justinchu/layer-norm branch on March 12, 2025, 21:58
Labels
module: torchlib (Related to the torch/aten function lib in development)

Successfully merging this pull request may close these issues.

LayerNorm node has data type mismatches for input and scale