Use storeAssignmentPolicy for casts in DML commands

Follow spark.sql.storeAssignmentPolicy instead of spark.sql.ansi.enabled for casting behaviour in UPDATE and MERGE. By default, this errors out at runtime when an overflow happens.

Closes #1938
GitOrigin-RevId: c960a0521df27daa6ee231e0a1022d8756496785
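To illustrate the behavioural change, here is a minimal sketch, assuming Delta Lake is on the classpath of a local Spark session; the table name `t` and the literal values are made up for illustration. Under the default spark.sql.storeAssignmentPolicy=ANSI, an UPDATE that overflows an INT column is expected to fail at runtime even when spark.sql.ansi.enabled is false; wrapping the input in try_cast or switching the policy to LEGACY are the workarounds named in the new error message.

import scala.util.Try

import org.apache.spark.sql.SparkSession

// Minimal sketch of the new behaviour; assumes Delta Lake is on the classpath.
// The table name and values are illustrative only.
object CastOverflowSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("delta-cast-overflow-sketch")
      .master("local[*]")
      .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
      .config("spark.sql.catalog.spark_catalog",
        "org.apache.spark.sql.delta.catalog.DeltaCatalog")
      .getOrCreate()

    spark.sql("CREATE TABLE t (id INT, value INT) USING delta")
    spark.sql("INSERT INTO t VALUES (1, 10)")

    // With the default spark.sql.storeAssignmentPolicy=ANSI this UPDATE is
    // expected to fail at runtime with DELTA_CAST_OVERFLOW_IN_TABLE_WRITE,
    // even when spark.sql.ansi.enabled is false, because the BIGINT literal
    // overflows the INT column.
    Try(spark.sql("UPDATE t SET value = 3000000000 WHERE id = 1"))
      .failed.foreach(e => println(s"overflow rejected: ${e.getMessage}"))

    // Workarounds named in the error message: tolerate the overflow with
    // try_cast (writes NULL instead of failing), or set
    // spark.sql.storeAssignmentPolicy=LEGACY to bypass the check entirely.
    spark.sql("UPDATE t SET value = try_cast(3000000000 AS INT) WHERE id = 1")

    spark.stop()
  }
}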
spark/src/main/resources/error/delta-error-classes.json
8 lines changed: 8 additions & 0 deletions
@@ -272,6 +272,14 @@
     ],
     "sqlState" : "0A000"
   },
+  "DELTA_CAST_OVERFLOW_IN_TABLE_WRITE" : {
+    "message" : [
+      "Failed to write a value of <sourceType> type into the <targetType> type column <columnName> due to an overflow.",
+      "Use `try_cast` on the input value to tolerate overflow and return NULL instead.",
+      "If necessary, set <storeAssignmentPolicyFlag> to \"LEGACY\" to bypass this error or set <updateAndMergeCastingFollowsAnsiEnabledFlag> to true to revert to the old behaviour and follow <ansiEnabledFlag> in UPDATE and MERGE."
+    ],
+    "sqlState" : "22003"
+  },
   "DELTA_CDC_NOT_ALLOWED_IN_THIS_VERSION" : {
     "message" : [
       "Configuration delta.enableChangeDataFeed cannot be set. Change data feed from Delta is not yet available."