
Config Value spark.sql.mapKeyDedupPolicy not Supported by Databricks SQL Warehouse #9275

Closed
ArtnerC opened this issue May 30, 2024 · 3 comments · Fixed by #9830
Comments

@ArtnerC
Contributor

ArtnerC commented May 30, 2024

Getting the error `spark.sql.mapKeyDedupPolicy is not supported by Databricks SQL Warehouses` when using the ibis pyspark backend against a Databricks SQL Warehouse cluster.

See: https://community.databricks.com/t5/data-engineering/spark-settings-in-sql-warehouse/td-p/7959

The config is set in `do_connect`:
https://github.com/ibis-project/ibis/blame/e425ad57899f8ebbea29b57bb53cedb40ebd7193/ibis/backends/pyspark/__init__.py#L180

self._session.conf.set("spark.sql.mapKeyDedupPolicy", "LAST_WIN")

A workaround could be as simple as:

try:
    spark.conf.set("spark.sql.mapKeyDedupPolicy", "LAST_WIN")
except Exception as e:
    if "not available" in str(e):
        print("Likely running in a SQL Warehouse")
    else:
        raise  # re-raise anything unrelated

but I'm not sure what other approaches there might be.

@jcrist
Member

jcrist commented May 31, 2024

Thanks for opening this @ArtnerC! I think we'd happily accept a PR with the fix you suggest if you're interested in submitting one. I think we can skip checking for a specific exception and just ignore any exceptions on that line:

try:
    spark.conf.set("spark.sql.mapKeyDedupPolicy", "LAST_WIN")
except Exception:  # is there a specific exception class we could catch instead of just `Exception`?
    pass

Re: general Databricks support, we don't have any immediate plans to set up a Databricks testing environment (or a Databricks-specific backend, if one turns out to be needed), but if it's possible to make things work with just our existing pyspark backend, we'd happily continue to accept bugfixes toward making that work.

@ArtnerC
Contributor Author

ArtnerC commented Aug 13, 2024

I tried to run this again to capture the exact exception class being thrown, but this time it just worked.

Nevertheless, the Databricks docs state pretty clearly that job runs using unsupported properties will fail (https://docs.databricks.com/en/release-notes/serverless.html#version-202415), so I've submitted the PR.

@ArtnerC
Contributor Author

ArtnerC commented Aug 22, 2024

I was incorrect; the error resurfaced. I was able to track it down to a `SparkConnectGrpcException` and implemented a check for that specific exception, with a fallback to the more generic `PySparkException` since Spark Connect is relatively new. The latest version of the PR has been tested and fixes this error.
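For reference, the tiered exception handling described above can be sketched roughly as follows. The exception names mirror `pyspark.errors.SparkConnectGrpcException` and `pyspark.errors.PySparkException`, but they are stubbed here as stand-in classes so the sketch runs without a Spark installation; the helper names and the simulated `conf.set` call are illustrative assumptions, not the exact PR diff:

```python
# Stand-in exception classes mirroring pyspark.errors (stubbed so this
# sketch runs without a Spark installation).
class PySparkException(Exception):
    pass


class SparkConnectGrpcException(PySparkException):
    pass


def _conf_set(key: str, value: str, *, warehouse: bool = False) -> None:
    """Hypothetical stand-in for ``spark.conf.set``.

    Databricks SQL Warehouses reject unsupported Spark properties,
    which we simulate here by raising a SparkConnectGrpcException.
    """
    if warehouse:
        raise SparkConnectGrpcException(f"Configuration {key} is not available.")


def set_map_key_dedup_policy(*, warehouse: bool = False) -> bool:
    """Try to set the map-key dedup policy, tolerating SQL Warehouse refusal.

    Catch the specific Spark Connect exception first, then fall back to
    the generic PySparkException, per the fix described above.
    """
    try:
        _conf_set("spark.sql.mapKeyDedupPolicy", "LAST_WIN", warehouse=warehouse)
    except (SparkConnectGrpcException, PySparkException):
        # Likely running in a SQL Warehouse; ignore and continue.
        return False
    return True
```

On a regular cluster the call succeeds and the function returns `True`; on a simulated SQL Warehouse the exception is swallowed and it returns `False`.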
