
Releases: snowplow/dbt-snowplow-utils

snowplow-utils v0.14.0

28 Mar 08:59

Summary

This version makes some big changes, deprecating our snowplow_incremental materialization and a few of our other macros, with the goal of providing a simpler usage experience and making the package easier to maintain going forward. We've also added a brand new macro, get_sde_or_context, to aid working with Self Describing Events or Contexts for our Redshift/Postgres users.

🚨 Breaking Changes 🚨

Deprecated snowplow_incremental materialization

We have deprecated the snowplow_incremental materialization and will remove it entirely in a future version. In its place we provide an optimization on top of the built-in incremental materialization. To use this optimization for incrementally materialized models, each model config must have snowplow_optimize=true and the following must be added to the top level of your dbt_project.yml file:

# dbt_project.yml
...
dispatch:
  - macro_namespace: dbt
    search_order: ['snowplow_utils', 'dbt']

For more information see here.
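For example, an incremental model opting into the optimization sets the flag in its config. The sketch below is illustrative only: the model name, unique key, and timestamp column are placeholders.

# models/my_model.sql (illustrative)
{{
  config(
    materialized='incremental',
    unique_key='event_id',
    upsert_date_key='collector_tstamp',
    snowplow_optimize=true
  )
}}

select ... -- your model SQL as usual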

Deprecated macros

The get_cluster_by and get_partition_by macros have also been deprecated and will be removed in a future version. They should be replaced by get_value_by_target_type, which offers the same functionality but more generally.
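As a rough sketch (the warehouse-specific keyword arguments shown here are an assumption; check the macro documentation for the exact signature), a cluster-by config that previously used get_cluster_by might become:

# illustrative only: argument names are assumed, values are placeholders
{{
  config(
    cluster_by=snowplow_utils.get_value_by_target_type(
      bigquery_val=['domain_userid'],
      snowflake_val=['to_date(start_tstamp)']
    )
  )
}}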

type_string and type_max_string

type_string has been entirely removed from the package and should be replaced with direct calls to dbt.type_string() instead. In all cases except Redshift this should be suitable; for Redshift, when you need a column of length greater than 256, we provide type_max_string instead. For all other warehouses this is just a wrapper around dbt.type_string().
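For instance (an illustrative snippet; the column name is a placeholder), a cast that needs to hold long strings on Redshift would now be written as:

-- use dbt.type_string() in general; use snowplow_utils.type_max_string()
-- where Redshift may need more than 256 characters
cast(my_long_column as {{ snowplow_utils.type_max_string() }})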

Features

  • Deprecate get_cluster_by and get_partition_by macros in favor of get_value_by_target_type
  • Remove type_string() and rework type_max_string() to prioritize dbt logic where possible
  • Deprecate old materialization
  • Add new get_sde_or_context macro

Under the hood

  • Remove all internal references to snowplow_incremental materialization
  • Migrate tests to new materialization approach

Docs

  • Update readme

Upgrading

To upgrade, bump the package version in your packages.yml file and follow our migration guide for the breaking changes above.
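For example, following the same packages.yml pattern shown for the release candidates below:

packages:
  - package: snowplow/snowplow_utils
    version: 0.14.0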

snowplow-utils v0.14.0-rc2

10 Mar 11:56
Pre-release

Summary

This is a pre-release version of the package; we believe it to be in working condition, but you may encounter bugs and some features may change before the final release.

This version fixes a few issues from the first release candidate, including a failure when a scratch table contained no data, and deprecates some macros.

As a reminder, users will need to add the following to their dbt_project.yml to benefit from the enhancements:

# dbt_project.yml
...
dispatch:
  - macro_namespace: dbt
    search_order: ['snowplow_utils', 'dbt']

For custom models and more details, please refer to our temporary docs page: https://docs.snowplow.io/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-advanced-usage/dbt-incremental-logic-pre-release/

Features

  • Deprecate get_cluster_by and get_partition_by macros in favor of get_value_by_target_type
  • Remove type_string() and rework type_max_string() to prioritize dbt logic where possible
  • Fix inability to progress when scratch table contained no data
  • Ensure type consistency for the upsert_date_key throughout the query

Under the hood

  • Remove all internal references to snowplow_incremental materialization
  • Migrate tests to new materialization approach

snowplow-utils v0.14.0-rc1

06 Mar 09:51
Pre-release

Summary

This is a pre-release version of the package; we believe it to be in working condition, but you may encounter bugs and some features may change before the final release.

This version of the package begins the migration away from our snowplow_incremental materialization, instead providing an override of the standard incremental materialization that delivers the same performance improvements in a simpler way. We expect users to see little to no performance change from the previous version; please let us know if you see performance degradation for large volumes of data.

Users will need to add the following to their dbt_project.yml to benefit from the enhancements:

# dbt_project.yml
...
dispatch:
  - macro_namespace: dbt
    search_order: ['snowplow_utils', 'dbt']

For custom models and further details, please see our temporary docs page: https://docs.snowplow.io/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-advanced-usage/dbt-incremental-logic-pre-release/

Features

  • Deprecate old materialization
  • Add get_merge_sql for materialization
  • Fix a broken GitHub Action for our GitHub Pages

Installing

To install this version, use the following in your packages.yml file:

packages:
  - package: snowplow/snowplow_utils
    version: 0.14.0-rc1

snowplow-utils v0.13.2

21 Feb 17:43

Summary

This release fixes a compilation error raised if dbt compile is run on a fresh installation of one of the dbt-snowplow packages. Under the hood we also fix the GitHub Pages generation automation and update the PR template.

Features

  • Fix initial dbt compile error (Close #69)
  • Fix utils GitHub Pages generation
  • Update PR template

snowplow-utils v0.13.1

20 Feb 12:42

Summary

This release introduces a new cross-db macro, get_array_to_string, to harmonise array-to-string transformations, and adds more features and optimisations to some of the existing macros. There are also some automations and simplifications under the hood for easier maintenance and clarity.

Features

  • Get string agg optimisations (Close #101)
  • Add get_array_to_string macro
  • Fix unnest macro for Postgres and Snowflake (Close #105)
  • Add delimiter parameter to get_split_to_array (Close #106)
  • Tidy macro inheritance
  • Add warning for no data returned for model limits
  • Document macros in YAML, add new macro, prepare to deprecate
  • Add action for generating docs for pages

snowplow-utils v0.13.0

08 Dec 14:13

Summary

This release bumps the dbt-utils dependency to support v1, and therefore removes all of the deprecation warnings displayed for users on later versions of dbt. This version also requires at least dbt version 1.3.

Features

  • Bump compatibility to dbt-core@1.3 as a minimum (Close #95)
  • Add standard actions and templates + use utils for Databricks connection

snowplow-utils v0.12.3

30 Nov 14:51

Summary

This release shifts useful macros from the media player package to the utils package, and adds a new macro, unnest, to be consumed by multiple Snowplow dbt packages.

Features

  • Add unnest macro #99
  • Add Media Player macros #79

snowplow-utils v0.12.2

26 Oct 12:35

Summary

This release adds compatibility with dbt-utils v0.9.x, and fixes a bug with the incremental materialization present in dbt-core v1.3.0.

Features

  • Bump dbt-utils version for patch fix (Close #92)
  • Fix incremental_strategy default argument bug (Close #97)

snowplow-utils v0.12.1

22 Sep 14:00

Summary

This release adds the capability to exclude unwanted versions from the combine_column_version macro via the new exclude_versions argument, which accepts a list of versions to exclude.

Features

  • Exclude specific entity versions from combine_column_version macro (Close #91) (Thanks to @bgarf)

snowplow-utils v0.12.0

11 Aug 13:46

Summary

This release adds the ability to create indexes in your Postgres model configs when using the snowplow_incremental materialization, and adds support for the dateadd function on Databricks runtimes below 10.4. An illustrative config is shown below.
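For example, indexes can be declared in the model config using dbt's standard indexes syntax for Postgres. This is a sketch only; the unique key, timestamp column, and index columns are placeholders.

# models/my_model.sql (illustrative)
{{
  config(
    materialized='snowplow_incremental',
    unique_key='page_view_id',
    upsert_date_key='start_tstamp',
    indexes=[{'columns': ['page_view_id'], 'unique': true}]
  )
}}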

Features

  • Add ability to create indexes from config (Close #83)
  • Add support for Databricks runtimes below 10.4 (Close #85)