[Don't Merge] Issue 258 - Update to rank and fix operator #260

Closed · wants to merge 4 commits
16 changes: 8 additions & 8 deletions integration_tests/models/schema_tests/data_test.sql
@@ -1,8 +1,8 @@
 select
     1 as idx,
     '2020-10-21' as date_col,
-    cast(0 as {{ dbt.type_float() }}) as col_numeric_a,
-    cast(1 as {{ dbt.type_float() }}) as col_numeric_b,
+    cast(0 as {{ dbt.type_numeric() }}) as col_numeric_a,
+    cast(1 as {{ dbt.type_numeric() }}) as col_numeric_b,
     'a' as col_string_a,
     'b' as col_string_b,
     cast(null as {{ dbt.type_string() }}) as col_null,
@@ -13,8 +13,8 @@ union all
 select
     2 as idx,
     '2020-10-22' as date_col,
-    1 as col_numeric_a,
-    0 as col_numeric_b,
+    cast(1 as {{ dbt.type_numeric() }}) as col_numeric_a,
Review comment (Contributor):
Is there any database engine that wouldn't automatically cast every subsequent value on insert when the first value has an explicit type? (It doesn't hurt to cast them all; I'm just curious and couldn't find the information online.)
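One way to probe this per engine is a scratch query like the one below (hypothetical, not part of this PR; `numeric(28, 6)` and the table name are arbitrary examples):

```sql
-- Hypothetical probe: does a UNION ALL column typed only in the first branch
-- keep that type, or does the engine resolve a common supertype?
create or replace temporary table union_typing_probe as
select cast(0 as numeric(28, 6)) as col_numeric
union all
select 1        -- bare integer literal
union all
select 0.5;     -- bare decimal literal

-- Inspect the inferred column type (Snowflake syntax; query
-- information_schema.columns on other engines).
describe table union_typing_probe;
```

In general, engines resolve a UNION's column type to a common supertype across all branches rather than simply propagating the first branch's type, which is why casting every branch explicitly is the safer choice.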

+    cast(0 as {{ dbt.type_numeric() }}) as col_numeric_b,
     'b' as col_string_a,
     'ab' as col_string_b,
     null as col_null,
@@ -25,8 +25,8 @@ union all
 select
     3 as idx,
     '2020-10-23' as date_col,
-    0.5 as col_numeric_a,
-    0.5 as col_numeric_b,
+    cast(0.5 as {{ dbt.type_numeric() }}) as col_numeric_a,
+    cast(0.5 as {{ dbt.type_numeric() }}) as col_numeric_b,
     'c' as col_string_a,
     'abc' as col_string_b,
     null as col_null,
@@ -37,8 +37,8 @@ union all
 select
     4 as idx,
     '2020-10-23' as date_col,
-    0.5 as col_numeric_a,
-    0.5 as col_numeric_b,
+    cast(0.5 as {{ dbt.type_numeric() }}) as col_numeric_a,
+    cast(0.5 as {{ dbt.type_numeric() }}) as col_numeric_b,
     'c' as col_string_a,
     'abcd' as col_string_b,
     null as col_null,
14 changes: 14 additions & 0 deletions integration_tests/models/schema_tests/schema.yml
@@ -505,6 +505,20 @@ models:
       value_set: [0.5]
       top_n: 1
       quote_values: false
+      data_type: "{{ dbt.type_numeric() }}"
+  - dbt_expectations.expect_column_most_common_value_to_be_in_set:
+      value_set: [0.5, 0, 1]
+      top_n: 2
+      quote_values: false
+      data_type: "{{ dbt.type_numeric() }}"
+  - dbt_expectations.expect_column_most_common_value_to_be_in_set:
+      value_set: [0.5, 0]
+      top_n: 2
+      quote_values: false
+      data_type: "{{ dbt.type_numeric() }}"
+      config:
+        error_if: "=0"
+        warn_if: "<>1"
   - dbt_expectations.expect_column_values_to_be_increasing:
       sort_column: col_numeric_a
       strictly: false
expect_column_most_common_value_to_be_in_set.sql
@@ -3,7 +3,7 @@
     value_set,
     top_n,
     quote_values=True,
-    data_type="decimal",
+    data_type=None,
     row_condition=None
 ) -%}

@@ -22,6 +22,8 @@
     row_condition
     ) %}
 
+{% set data_type = dbt.type_numeric() if not data_type else data_type %}
Review comment (Contributor Author):
FYI, Snowflake won't correctly compare a float in the database to a value cast to decimal (the previous default for this test), so it makes more sense to use the built-in types across the test to make sure the comparisons actually work.

Reply (Contributor):
From the official documentation and my testing, inserting floating-point data will create a number column type with the smallest precision and scale that fits the values. For example, when creating a table with initial data, if a column contains [0, 1, 0.5], the assigned type will be number(2,1), so no information is lost at that point. However, casting that to a "numeric", "decimal", or "number" type without explicit precision and scale will result in rounding, and casting it to any "float" or "double" type has the usual accuracy problems.

When comparing, Snowflake will happily attempt to compare any numeric types, with the usual issues from floats:

| comparison | result |
| --- | --- |
| `cast(0.5 as number(2,1)) = cast(0.5 as numeric(28,2))` | TRUE |
| `cast(0.5 as float) = cast(0.5 as numeric(28,2))` | TRUE |
| `cast(0.5 as float) = cast(0.5 as numeric)` | FALSE |
| `0.1 + 0.2 = 0.3` | TRUE |
| `(cast(0.1 as float) + cast(0.2 as float)) = cast(0.3 as float)` | FALSE |
| `cast(3.1415926535897932384626433832795028841 as number(38,37)) = cast(3.141592653589793 as float)` | TRUE |

TL;DR: It makes sense to use built-in types everywhere to avoid db-engine-specific issues.
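For anyone who wants to verify, a minimal scratch query (assumption: run directly in a Snowflake worksheet) that reproduces the comparisons above, with the results as reported in this thread:

```sql
select
    cast(0.5 as number(2,1)) = cast(0.5 as numeric(28,2))          as exact_vs_decimal, -- TRUE
    cast(0.5 as float) = cast(0.5 as numeric(28,2))                as float_vs_decimal, -- TRUE
    cast(0.5 as float) = cast(0.5 as numeric)                      as float_vs_bare,    -- FALSE: bare numeric is number(38,0), so 0.5 rounds to 1
    0.1 + 0.2 = 0.3                                                as literal_sum,      -- TRUE: literals stay fixed-point
    (cast(0.1 as float) + cast(0.2 as float)) = cast(0.3 as float) as float_sum;        -- FALSE: classic binary float rounding
```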


+
 with value_counts as (
 
     select
@@ -48,7 +50,7 @@ value_counts_ranked as (
 
     select
         *,
-        row_number() over(order by value_count desc) as value_count_rank
+        rank() over(order by value_count desc) as value_count_rank
     from
         value_counts
 
@@ -60,7 +62,7 @@ value_count_top_n as (
 
     from
         value_counts_ranked
     where
-        value_count_rank = {{ top_n }}
+        value_count_rank <= {{ top_n }}
 
 ),
 set_values as (
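For context on the two changes above: `row_number()` breaks ties between equal value counts arbitrarily, so `value_count_rank = top_n` could silently miss a value tied for most common, while `rank()` gives tied counts the same rank and `<=` turns `top_n` into an inclusive cutoff. A minimal, engine-agnostic sketch with hypothetical toy data:

```sql
-- Toy data: 'a' and 'b' tie as the most common values.
with value_counts as (
    select 'a' as value_field, 2 as value_count
    union all select 'b', 2
    union all select 'c', 1
)
select
    value_field,
    row_number() over (order by value_count desc) as rn,  -- 1, 2, 3: the a/b tie is broken arbitrarily
    rank()       over (order by value_count desc) as rnk  -- 1, 1, 3: tied values share rank 1
from value_counts;
-- With top_n = 1, "rn = 1" keeps only one of the tied values,
-- while "rnk <= 1" keeps both 'a' and 'b', which is what the test intends.
```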