Releases: dbt-labs/dbt-external-tables
dbt-external-tables 0.5.0
🚨 Breaking change — this package now requires dbt v0.18.0
Quality of life:
- Use dbt v0.18.0 functionality, in particular the `adapter.dispatch` macro (#40)
dbt-external-tables 0.4.0
🚨 There is a breaking change in this release — the lower bound of dbt-utils is now 0.4.0. This won't affect most users, since a version of dbt-utils in this range is already required for dbt v0.17.0 compatibility.
Quality of life:
- Change the dbt-utils dependency range to `[>=0.4.0, <0.6.0]` (#37)
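For reference, a bounded range like this is expressed in `packages.yml` as a list of version constraints. A minimal sketch follows; the hub package name shown is an assumption about how dbt-utils was published at the time, so check hub.getdbt.com for the current name:

```yaml
# packages.yml — minimal sketch of a bounded version range;
# the hub package name below is assumed, not taken from this release
packages:
  - package: fishtown-analytics/dbt_utils
    version: [">=0.4.0", "<0.6.0"]
```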
dbt-external-tables 0.3.2
dbt-external-tables 0.3.1
dbt-external-tables 0.3.0
This is a minor release that updates to `config-version: 2` of `dbt_project.yml`.
Breaking changes
- Requires dbt >= v0.17.0
- Access source information from the new `graph.sources` object, instead of `graph.nodes`
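For context, the v2 config format is opted into at the top level of `dbt_project.yml`. A minimal sketch, with the project name and version as placeholders:

```yaml
# dbt_project.yml — minimal sketch; name and version are placeholders
name: my_project
version: "1.0.0"
config-version: 2
require-dbt-version: ">=0.17.0"   # matches this release's dbt requirement
```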
dbt-external-tables 0.2.0
This is a minor release that significantly changes the behavior of the `stage_external_sources` macro to improve its flexibility and performance. Several of the features described below reflect breaking changes compared with the 0.1.x versions of this package.
Features
- On Snowflake, add support for staging external sources via snowpipes (see the configuration sketch after this list).
  - If `external.snowpipe` is configured, the `stage_external_sources` macro will create an empty table with all specified `columns` plus metadata fields, run a full historical backfill via a `copy` statement, and finally create a pipe that wraps the same `copy` statement.
  - If no `columns` are specified, the pipe will instead pull all data into a single variant column. (This behavior does not work for CSV files.)
- During standard runs, the `stage_external_sources` macro will attempt to "partially refresh" external assets. It will not drop, replace, or change any existing external tables or pipe targets. To fully rebuild all external assets, supply the CLI variable `ext_full_refresh: true`.
- The `stage_external_sources` macro now accepts a `select` argument to stage only specific nodes, using syntax similar to `dbt source snapshot-freshness`.
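For orientation, here is a hedged sketch of what a snowpipe-enabled external source definition might look like in a sources YAML file. The source and table names, stage location, file format, and snowpipe keys below are illustrative placeholders rather than a definitive statement of this release's interface:

```yaml
# sources.yml — hedged sketch of a snowpipe-enabled external source;
# names, location, file format, and snowpipe keys are placeholders
version: 2

sources:
  - name: snowplow
    tables:
      - name: event
        external:
          location: "@raw.snowplow.events_stage"   # hypothetical external stage
          file_format: "( type = json )"
          snowpipe:
            auto_ingest: true                       # assumed key; check the package README
            copy_options: "on_error = continue"
        columns:                                    # omit to land all data in one variant column
          - name: app_id
            data_type: varchar(255)
          - name: collector_tstamp
            data_type: timestamp
```

The macro is typically invoked as an operation (`dbt run-operation stage_external_sources`), with the `select` and `ext_full_refresh` values described above supplied via `--vars`.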
Breaking changes
- Properties of the `external` source config which previously accepted string or boolean values, such as `auto_refresh`, now expect boolean values so that the `refresh_external_table` macro can infer whether a table should be refreshed or ignored during partial refresh runs.
- The `refresh_external_table` macro now returns a Jinja list (`[]`) instead of a string.
Quality of life
- Improved info logging of the DDL/DML that the `stage_external_sources` macro is running. This is intended to provide more visibility into multistep operations, such as staging snowpipes.
- Add a sample analysis that, when copied to the root project's `analysis` folder, will print out a "dry run" version of all the SQL that `stage_external_sources` would run as an operation.
- Add GitHub templates and codeowners
dbt-external-tables 0.1.3
Fixes
- The `stage_external_sources` macro logs the source it is staging as `{schema}.{identifier}` (#17)
dbt-external-tables 0.1.2
Fixes
- Wrap the `pattern` argument in single quotes for Snowflake external tables (#16). This was the initially intended behavior.
dbt-external-tables 0.1.1
Minor fixes to reflect updates to Snowflake `create external table` functionality (#15). Namely:
- By default, Snowflake now grabs all metadata automatically after creating an external table, rather than requiring the user to run `alter external table ... refresh`
- Add a `pattern` parameter to enable regex matching of file names housed within the external location
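For illustration, a hedged sketch of how a pattern might appear in an external source definition; the stage location, file format, and regex below are placeholders, not values from this release:

```yaml
# sources.yml — hedged sketch; stage name, file format, and regex are placeholders
version: 2

sources:
  - name: logs
    tables:
      - name: page_views
        external:
          location: "@raw.logs.s3_stage"           # hypothetical external stage
          file_format: "( type = csv )"
          pattern: ".*page_views_2020.*[.]csv"     # regex applied to file names in the stage
```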
dbt-external-tables 0.1.0
Initial release of core external table functionality for Snowflake and Redshift/Spectrum.
Many thanks to prerelease contributors, beta testers, and discussants.
And to all members of the dbt community who have offered their suggestions and support.