Adding support for Amazon Athena #408

Open
wants to merge 4 commits into base: main
Changes from 1 commit
add default test setup for athena connection
brendan-cook-87 committed Nov 24, 2023
commit 231f62b9c5a6c9413681d4399d3441f2b8f5f78f
3 changes: 2 additions & 1 deletion integration_test_project/example-env.sh
@@ -16,7 +16,8 @@ export DBT_ENV_SECRET_DATABRICKS_TOKEN=
export DBT_ENV_SECRET_GCP_PROJECT=
export DBT_ENV_SPARK_DRIVER_PATH= # /Library/simba/spark/lib/libsparkodbc_sbu.dylib on a Mac
export DBT_ENV_SPARK_ENDPOINT= # The endpoint ID from the Databricks HTTP path

export DBT_ENV_ATHENA_S3_STAGING=
export DBT_ENV_ATHENA_S3_DATA=
# dbt environment variables, change these
export DBT_VERSION="1_5_0"
export DBT_CLOUD_PROJECT_ID=
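For local testing, the two new Athena variables might be filled in along these lines (the bucket names below are placeholders, not values from the PR):

```shell
# Placeholder S3 locations -- substitute buckets from your own AWS account.
export DBT_ENV_ATHENA_S3_STAGING=s3://my-dbt-athena-staging/
export DBT_ENV_ATHENA_S3_DATA=s3://my-dbt-athena-data/
```

`DBT_ENV_ATHENA_S3_STAGING` is where Athena writes query results, while `DBT_ENV_ATHENA_S3_DATA` is where table data is stored.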
10 changes: 10 additions & 0 deletions integration_test_project/profiles.yml
@@ -52,3 +52,13 @@ dbt_artifacts:
      dbname: postgres
      schema: public
      threads: 8
    athena:
      type: athena
      s3_staging_dir: "{{ env_var('DBT_ENV_ATHENA_S3_STAGING') }}"
      s3_data_dir: "{{ env_var('DBT_ENV_ATHENA_S3_DATA') }}"
      s3_data_naming: schema_table_unique
      region_name: ap-southeast-2
      schema: public
      database: awsdatacatalog
      seed_s3_upload_args:
        ACL: bucket-owner-full-control
13 changes: 13 additions & 0 deletions tox.ini
@@ -69,6 +69,9 @@ profiles_dir = integration_test_project

[testenv]
passenv =
    AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY
    AWS_DEFAULT_REGION
    DBT_PROFILES_DIR
    GITHUB_SHA_OVERRIDE
    GITHUB_SHA
@@ -85,6 +88,8 @@ passenv =
    DBT_ENV_SECRET_GCP_PROJECT
    DBT_ENV_SPARK_DRIVER_PATH
    DBT_ENV_SPARK_ENDPOINT
    DBT_ENV_ATHENA_S3_STAGING
    DBT_ENV_ATHENA_S3_DATA
    GOOGLE_APPLICATION_CREDENTIALS
    DBT_CLOUD_PROJECT_ID
    DBT_CLOUD_JOB_ID
@@ -265,6 +270,14 @@ commands =
    dbt deps
    dbt build --target bigquery --vars '"my_var": "my value"'

[testenv:integration_test_athena]
changedir = integration_test_project
deps = dbt-athena-community~=1.7.0
commands =
    dbt clean
    dbt deps
    dbt build --target athena --vars '"my_var": "my value"'

# Spark integration test (disabled)
[testenv:integration_spark]
changedir = integration_test_project
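With AWS credentials and the `DBT_ENV_ATHENA_*` variables exported, the new tox environment added in this PR could be run like so (a sketch; assumes `tox` is installed and `dbt-athena-community` installs cleanly):

```shell
# Run only the new Athena integration test environment.
# Requires AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION
# and the DBT_ENV_ATHENA_* variables to be set in the shell.
tox -e integration_test_athena
```

The `passenv` entries added above are what allow these variables to reach the tox-managed environment; without them, tox strips the AWS credentials from the subprocess environment.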