Description
Is there an existing issue for this?
- I have searched the existing issues
Current Behavior
If a table contains column names such as
- "DESC"
- "_AIRBYTE_AB_ID"
- "_AIRBYTE_EMITTED_AT"
- "_AIRBYTE_NORMALIZED_AT"
the dbt-oracle incremental strategy fails with

ORA-01747: invalid user.table.column, table.column, or column specification

for reserved words, for example the column name desc -> "DESC", or with

ORA-00911: invalid character

for the required internal Airbyte system columns, e.g. "_AIRBYTE_AB_ID" or "_AIRBYTE_NORMALIZED_AT".

This happens because the following macro call returns the column names lowercase and unquoted instead of uppercase and quoted, regardless of whether the OracleQuotePolicy for schema and identifier is set to true:
dbt-oracle/dbt/include/oracle/macros/materializations/incremental/helpers.sql
Lines 37 to 38 in 7348c2b
{% macro oracle_incremental_upsert(tmp_relation, target_relation, unique_key=none, statement_name="main") %}
{%- set dest_columns = adapter.get_columns_in_relation(target_relation) -%}
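A minimal sketch of a possible workaround inside the macro, assuming dbt's standard adapter.quote method (the quoted_dest_columns variable is hypothetical and the snippet is untested):

{%- set dest_columns = adapter.get_columns_in_relation(target_relation) -%}
{# hypothetical: uppercase and quote each destination column before rendering the merge #}
{%- set quoted_dest_columns = [] -%}
{%- for col in dest_columns -%}
    {%- do quoted_dest_columns.append(adapter.quote(col.name | upper)) -%}
{%- endfor -%}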
This holds even after setting schema and identifier to true; the defaults in both project files are false:
dbt-oracle/dbt/include/oracle/dbt_project.yml
Lines 6 to 9 in 7348c2b
quoting:
  database: false
  schema: false
  identifier: false
dbt-oracle/dbt_adbs_test_project/dbt_project.yml
Lines 8 to 11 in 7348c2b
quoting:
  database: false
  identifier: false
  schema: false
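For reference, the configuration that was tested (schema and identifier flipped to true) would look like this:

quoting:
  database: false
  schema: true
  identifier: true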
The two for loops that render the column lists for the when matched then update set and the when not matched then insert(...) values(...) clauses cause the issue, as they produce, for example:
merge into staging.ln_tfgld008 target
using staging.o$pt_ln_tfgld008125639 temp
on (temp."_AIRBYTE_AB_ID" = target."_AIRBYTE_AB_ID")
when matched then
update set
target.bloc = temp.bloc,
target.desc = temp.desc, -- desc is not an issue as long as it is quoted "DESC"
target.timestamp = temp.timestamp, -- timestamp surprisingly is not an issue and does not trigger ORA-01747
target._airbyte_ab_id = temp._airbyte_ab_id, -- triggers ORA-00911 without quoting
target._airbyte_emitted_at = temp._airbyte_emitted_at, -- triggers ORA-00911 without quoting
target._airbyte_normalized_at = temp._airbyte_normalized_at, -- triggers ORA-00911 without quoting
target._airbyte_ln_tfgld008_hashid = temp._airbyte_ln_tfgld008_hashid -- triggers ORA-00911 without quoting
when not matched then
insert(bloc, desc, timestamp, _airbyte_ab_id, _airbyte_emitted_at, _airbyte_normalized_at, _airbyte_ln_tfgld008_hashid)
values(
temp.bloc,
temp.desc,
temp.timestamp,
temp._airbyte_ab_id,
temp._airbyte_emitted_at,
temp._airbyte_normalized_at,
temp._airbyte_ln_tfgld008_hashid
)

Expected Behavior
The dbt-oracle incremental upsert strategy should not fail for users who have no control over the column names of their tables.
If a column name uses a reserved keyword or starts with an invalid character, dbt-oracle should by default treat it as quoted and uppercase.
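For example, the merge shown above would then render the affected identifiers quoted and uppercase, roughly like this (an abbreviated sketch of the expected output, not actual macro output):

merge into staging.ln_tfgld008 target
using staging.o$pt_ln_tfgld008125639 temp
on (temp."_AIRBYTE_AB_ID" = target."_AIRBYTE_AB_ID")
when matched then
update set
target.bloc = temp.bloc,
target."DESC" = temp."DESC",
target."_AIRBYTE_EMITTED_AT" = temp."_AIRBYTE_EMITTED_AT",
target."_AIRBYTE_NORMALIZED_AT" = temp."_AIRBYTE_NORMALIZED_AT"
when not matched then
insert(bloc, "DESC", "_AIRBYTE_AB_ID", "_AIRBYTE_EMITTED_AT", "_AIRBYTE_NORMALIZED_AT")
values(temp.bloc, temp."DESC", temp."_AIRBYTE_AB_ID", temp."_AIRBYTE_EMITTED_AT", temp."_AIRBYTE_NORMALIZED_AT")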
Steps To Reproduce
- Set the schema and identifier flags to true in
dbt-oracle/dbt_adbs_test_project/dbt_project.yml
Lines 8 to 11 in 7348c2b
quoting:
  database: false
  identifier: false
  schema: false
- Add one of the above columns that trigger the ORA error to https://github.com/oracle/dbt-oracle/blob/7348c2b21883c156583adc8282876c7a993eb13d/dbt_adbs_test_project/seeds/seed.csv
- Add the column to the incremental materialization test https://github.com/oracle/dbt-oracle/blob/7348c2b21883c156583adc8282876c7a993eb13d/dbt_adbs_test_project/models/us_product_sales_channel_ranking.sql (see the sketch after this list)
- Run the tests as described in https://github.com/oracle/dbt-oracle/blob/7348c2b21883c156583adc8282876c7a993eb13d/dbt_adbs_test_project/README.md
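For illustration, a minimal incremental model exercising the problematic columns could look like this (the model name, unique_key, and literal values are hypothetical):

-- models/quoting_repro.sql (hypothetical)
{{ config(materialized='incremental', unique_key='bloc') }}
select
    1 as bloc,
    'text' as "DESC",                 -- reserved word, fails unquoted with ORA-01747
    sysdate as "_AIRBYTE_EMITTED_AT"  -- leading underscore, fails unquoted with ORA-00911
from dual

Running dbt run twice makes the second run take the incremental path and exercise oracle_incremental_upsert.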
Relevant log output (with the --debug flag enabled)
No response
Environment
- OS: Oracle Linux Server 8.6
- Python: 3.9.13
- dbt: 1.0.6

What Oracle database version are you using dbt with?
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Oracle Database 19c Standard Edition 2 Release 19.0.0.0.0 - Production
Additional Context
No response