Refactor integration test (#98)
## Changes

Refactors the `test_databricks_connect` integration test: the `debug_env_bugfix` fixture is renamed to `set_shared_cluster`, the shared cluster ID is now read from the `TEST_USER_ISOLATION_CLUSTER_ID` environment variable via `env_or_skip` instead of being hardcoded, and the original `DATABRICKS_CLUSTER_ID` value is restored after the test.

### Linked issues

Resolves #..

### Tests

- [ ] manually tested
- [ ] added unit tests
- [ ] added integration tests
- [ ] verified on staging environment (screenshot attached)
mwojtyczka authored Feb 3, 2025
1 parent a0d2548 commit 0202115
Showing 1 changed file with 6 additions and 5 deletions: tests/integration/fixtures/test_connect.py
```diff
@@ -13,10 +13,11 @@ def serverless_env():
 
 
 @fixture
-def debug_env_bugfix(monkeypatch, debug_env):
-    # This is a workaround to set shared cluster
-    # TODO: Update secret vault for acceptance testing and remove the bugfix
-    monkeypatch.setitem(debug_env, "DATABRICKS_CLUSTER_ID", "1114-152544-29g1w07e")
+def set_shared_cluster(monkeypatch, debug_env, env_or_skip):
+    default_cluster_id = debug_env.get("DATABRICKS_CLUSTER_ID")
+    monkeypatch.setitem(debug_env, "DATABRICKS_CLUSTER_ID", env_or_skip("TEST_USER_ISOLATION_CLUSTER_ID"))
+    yield
+    monkeypatch.setitem(debug_env, "DATABRICKS_CLUSTER_ID", default_cluster_id)
 
 
 @fixture
@@ -30,7 +31,7 @@ def spark_serverless_cluster_id(ws):
     spark_serverless.stop()
 
 
-def test_databricks_connect(debug_env_bugfix, ws, spark):
+def test_databricks_connect(set_shared_cluster, ws, spark):
     rows = spark.sql("SELECT 1").collect()
     assert rows[0][0] == 1
     assert not is_serverless_cluster(spark, ws)
```
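For context, the new `set_shared_cluster` fixture depends on an `env_or_skip` helper that is not part of this diff. A minimal sketch of how such a fixture is commonly implemented, assuming (not confirmed by this commit) that it looks up an environment variable and skips the test when the variable is unset:

```python
import os

import pytest
from pytest import fixture


@fixture
def env_or_skip():
    # Assumed behaviour for illustration; the real fixture lives in the
    # repository's shared test plumbing, not in this diff.
    def lookup(name: str) -> str:
        value = os.environ.get(name)
        if value is None:
            pytest.skip(f"environment variable {name} is not set")
        return value

    return lookup
```

With a helper like this in place, `set_shared_cluster` overrides `DATABRICKS_CLUSTER_ID` for the duration of the test (the `yield` marks the teardown boundary) and then restores the saved value, so later tests see the original `debug_env` contents.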
