[Bug] Same temp table is used for different unit tests #222
Comments
Thanks @afillatre, I'm going to transfer this. Here's where we generate the temp name:

dbt-adapters/dbt/include/global_project/macros/materializations/tests/unit.sql (lines 9 to 10 at a2292c8)

I think the options are:
Thanks @jtcohen6. I had that first solution in mind as well. However, I don't know whether there's a limit on table name length in any of the supported databases.
@afillatre On Postgres the max identifier length is 63 (which is very short!). If this is just a matter of passing both the model name + unit test name into the identifier, something like this could work:

```jinja
{%- set model_test_identifier = model['unique_id'].split('.')[2:] | join('__') -%} -- returns model_name__unit_test_name
{%- set target_relation = this.incorporate(type='table', path={"identifier": model_test_identifier}) -%}
{%- set temp_relation = make_temp_relation(target_relation) -%}
```
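For illustration, here is a minimal Python sketch of what the Jinja snippet above computes. The `unique_id` format shown is an assumption inferred from the snippet (not verified against the dbt source):

```python
# Assumption: a unit test's unique_id looks roughly like
# "unit_test.<project>.<model>.<test_name>".
unique_id = "unit_test.my_project.my_model.test_compute_one_line"

# Drop the resource type and project, join the rest with "__",
# yielding "<model>__<test_name>" as a collision-free identifier.
model_test_identifier = "__".join(unique_id.split(".")[2:])
print(model_test_identifier)  # my_model__test_compute_one_line

# The combined name plus the "__dbt_tmp" suffix still has to fit
# within Postgres's 63-character identifier limit.
combined = model_test_identifier + "__dbt_tmp"
print(len(combined) <= 63)  # True here, but very long model/test names could exceed it
```

The design point is that prefixing the model name makes the temp identifier unique per (model, test) pair rather than per test name alone, which is exactly the collision described in this issue.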
Wonderful, thanks for the tips. I made it work like that (by overriding a macro).
Should I create a PR in the BigQuery connector?
Is this a new bug in dbt-bigquery?
Current Behavior
When unit tests in different models have the same name (like `test_compute_one_line`), some tests can fail depending on race conditions (threads > 1). The temp table name is `<test_name>__dbt_tmp`, so one test can overwrite the contents of another.

Expected Behavior
Unit tests should run in parallel without impacting one another, regardless of their naming.
At the moment I'm forced to run my tests with `--threads 1` to prevent the issue.

Steps To Reproduce
Run unit tests with `--threads` > 1, with at least two models whose unit tests share the same name.
No response
Environment
Additional Context
No response