
Commit c19e893

Relax fp16 tolerance for instance_norm, forgotten in "[ONNX][dynamo_export] Skip instance_norm decomp for export"
Otherwise, instance_norm is decomposed into batch_norm with training set to True, and the downstream exporter has no way to tell that training is not actually needed. ONNX does define an InstanceNormalization operator, but because of the decomposition the model is unnecessarily exported as batch norm plus glue code. Depends on microsoft/onnxscript#1284. [ghstack-poisoned]
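
A minimal sketch (not part of this commit) of how one could observe the behavior described above: export a module containing instance_norm with torch.onnx.dynamo_export and count the operator types in the resulting ONNX graph. The model, input shape, and inspection code are illustrative assumptions, and the exact node layout depends on the exporter version.

import collections

import torch


class Model(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.norm = torch.nn.InstanceNorm2d(8)

    def forward(self, x):
        return self.norm(x)


# Export in eval mode; training is not needed, which is the point of skipping the decomp.
onnx_program = torch.onnx.dynamo_export(Model().eval(), torch.randn(2, 8, 16, 16))

# Count op types in the top-level ONNX graph. With the decomp skipped one would
# expect InstanceNormalization rather than BatchNormalization plus glue ops
# (exact contents vary with exporter version).
op_counts = collections.Counter(
    node.op_type for node in onnx_program.model_proto.graph.node
)
print(op_counts)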
1 parent f19163a commit c19e893

File tree: 1 file changed, +1 −0 lines


test/onnx/test_fx_op_consistency.py

Lines changed: 1 addition & 0 deletions
@@ -1957,6 +1957,7 @@ class TestOnnxModelOutputConsistency(onnx_test_common._TestONNXRuntime):
         "nn.functional.hardsigmoid": [1e-3, 5e-3],
         "nn.functional.hardswish": [1e-3, 5e-3],
         "nn.functional.hinge_embedding_loss": [4e-1, 3e-3],
+        "nn.functional.instance_norm": [1e-2, 1e-3],
         "nn.functional.interpolate": [1e-2, 1e-3],
         "nn.functional.kl_div": [2e-3, 2e-4],
         "nn.functional.multilabel_soft_margin_loss": [4e-2, 5e-3],
