compat: Support dask query-planning, drop Python 3.9 #171

Open
wants to merge 17 commits into main
Conversation

@hoxbro (Member) commented Jan 30, 2025

Resolves #146
Resolves #169

I don't want to support the legacy method, so I'm pinning dask to 2025.1, the first version where dask-expr is part of the dask library itself. That version only supports Python 3.10 and up.

I haven't run any performance benchmarks; this is only to get spatialpandas working with dask going forward.

Reference: https://docs.dask.org/en/stable/dataframe-extend.html
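
For context, the extension mechanism in the linked docs boils down to telling dask which collection class to construct for a given pandas-level type. A rough sketch of that registration pattern, with a simplified `DaskGeoDataFrame`: the import path for `get_collection_type` is an assumption inferred from the dask 2025.1 traceback quoted later in this thread, and the real spatialpandas class does far more than this.

```python
from dask.dataframe.dask_expr._collection import DataFrame, get_collection_type

from spatialpandas import GeoDataFrame


class DaskGeoDataFrame(DataFrame):
    """Simplified stand-in for the spatialpandas dask collection."""


# Dispatch on the type of the meta object: when dask builds a new
# collection whose meta is a spatialpandas GeoDataFrame, it wraps the
# expression in DaskGeoDataFrame instead of a plain dask DataFrame.
@get_collection_type.register(GeoDataFrame)
def _(meta):
    return DaskGeoDataFrame
```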

@hoxbro changed the title from "compat: Support dask query-planning" to "compat: Support dask query-planning, drop Python 3.9" on Jan 31, 2025

codecov bot commented Jan 31, 2025

Codecov Report

Attention: Patch coverage is 85.71429% with 4 lines in your changes missing coverage. Please review.

Project coverage is 77.64%. Comparing base (e58fa82) to head (4ccf675).

Files with missing lines    Patch %    Lines
spatialpandas/dask.py       83.33%     4 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #171      +/-   ##
==========================================
- Coverage   77.74%   77.64%   -0.11%     
==========================================
  Files          50       50              
  Lines        4871     4875       +4     
==========================================
- Hits         3787     3785       -2     
- Misses       1084     1090       +6     

☔ View full report in Codecov by Sentry.

@@ -13,25 +13,22 @@
from dask import delayed
from dask.dataframe.core import get_parallel_type
from dask.dataframe.extensions import make_array_nonempty
from dask.dataframe.partitionquantiles import partition_quantiles
USE_PYGEOS = '0'

[environments]
test-39 = ["py39", "test-core", "test", "test-task", "example", "test-example"]

dask 2025.1 only supports Python 3.10.

)

with warnings.catch_warnings():
    warnings.filterwarnings(

This only happens for the s3 parquet tests, with the following warning:

UserWarning: Dask annotations {'retries': 5} detected. Annotations will be ignored when using query-planning.

Full output
❯ pytest "spatialpandas/tests/test_parquet_s3.py::test_read_parquet_dask_remote[glob_parquet]"
/home/shh/.local/conda/envs/holoviz/lib/python3.12/site-packages/pytest_asyncio/plugin.py:207: PytestDeprecationWarning: The configuration option "asyncio_default_fixture_loop_scope" is unset.
The event loop scope for asynchronous fixtures will default to the fixture caching scope. Future versions of pytest-asyncio will default the loop scope for asynchronous fixtures to function scope. Set the default fixture loop scope explicitly in order to avoid unexpected behavior in the future. Valid fixture loop scopes are: "function", "class", "module", "package", "session"

  warnings.warn(PytestDeprecationWarning(_DEFAULT_FIXTURE_LOOP_SCOPE_UNSET))
===================================================================================================================== test session starts =====================================================================================================================
platform linux -- Python 3.12.8, pytest-8.3.4, pluggy-1.5.0
Using --randomly-seed=2711051231
rootdir: /home/shh/projects/holoviz/repos/spatialpandas
configfile: pyproject.toml
plugins: hypothesis-6.124.7, anyio-4.8.0, xdist-3.6.1, rerunfailures-15.0, randomly-3.15.0, asyncio-0.25.3, base-url-2.1.0, nbval-0.11.0, playwright-0.6.2
asyncio: mode=Mode.STRICT, asyncio_default_fixture_loop_scope=None
collected 1 item

spatialpandas/tests/test_parquet_s3.py E                                                                                                                                                                                                                [100%]

=========================================================================================================================== ERRORS ============================================================================================================================
________________________________________________________________________________________________ ERROR at setup of test_read_parquet_dask_remote[glob_parquet] ________________________________________________________________________________________________

s3_fixture = ('s3://test_bucket', {'anon': False, 'endpoint_url': 'http://127.0.0.1:5555/'}), sdf =                point
0  Point([0.0, 1.0])
1  Point([2.0, 3.0])
2  Point([4.0, 5.0])
3  Point([6.0, 7.0])

    @pytest.fixture(scope="module")
    def s3_parquet_dask(s3_fixture, sdf):
        path, s3so = s3_fixture
        path = f"{path}/test_dask"
        ddf = dd.from_pandas(sdf, npartitions=2)
>       to_parquet_dask(ddf, path, storage_options=s3so)

spatialpandas/tests/test_parquet_s3.py:69:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
spatialpandas/io/parquet.py:189: in to_parquet_dask
    dd_to_parquet(
/home/shh/.local/conda/envs/holoviz/lib/python3.12/site-packages/dask/dataframe/dask_expr/io/parquet.py:634: in to_parquet
    out = new_collection(
/home/shh/.local/conda/envs/holoviz/lib/python3.12/site-packages/dask/dataframe/dask_expr/_collection.py:4816: in new_collection
    return get_collection_type(meta)(expr)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <[RecursionError('maximum recursion depth exceeded') raised in repr()] Scalar object at 0x7db63aa19130>
expr = ToParquet(frame=RenameFrame(frame=ResetIndex(frame=df), columns={'index': '__null_dask_index__'}), path='test_bucket/t...'schema': point: fixed_size_binary[8]
__null_dask_index__: int64, 'index_cols': ['__null_dask_index__']}, append=False)

    def __init__(self, expr):
        global _WARN_ANNOTATIONS
        if _WARN_ANNOTATIONS and (annot := get_annotations()):
            _WARN_ANNOTATIONS = False
>           warnings.warn(
                f"Dask annotations {annot} detected. Annotations will be ignored when using query-planning."
            )
E           UserWarning: Dask annotations {'retries': 5} detected. Annotations will be ignored when using query-planning.

/home/shh/.local/conda/envs/holoviz/lib/python3.12/site-packages/dask/dataframe/dask_expr/_collection.py:308: UserWarning
-------------------------------------------------------------------------------------------------------------------- Captured stdout setup --------------------------------------------------------------------------------------------------------------------
Starting a new Thread with MotoServer running on 127.0.0.1:5555...
--------------------------------------------------------------------------------------------------------------------- Captured log setup ----------------------------------------------------------------------------------------------------------------------
INFO     aiobotocore.credentials:credentials.py:567 Found credentials in environment variables.
INFO     werkzeug:_internal.py:97 127.0.0.1 - - [31/Jan/2025 10:43:08] "GET /test_bucket?list-type=2&max-keys=1&encoding-type=url HTTP/1.1" 404 -
INFO     werkzeug:_internal.py:97 127.0.0.1 - - [31/Jan/2025 10:43:08] "GET /test_bucket?location HTTP/1.1" 404 -
INFO     werkzeug:_internal.py:97 127.0.0.1 - - [31/Jan/2025 10:43:08] "PUT /test_bucket HTTP/1.1" 200 -
INFO     werkzeug:_internal.py:97 127.0.0.1 - - [31/Jan/2025 10:43:08] "GET /test_bucket?list-type=2&max-keys=1&encoding-type=url HTTP/1.1" 200 -
INFO     werkzeug:_internal.py:97 127.0.0.1 - - [31/Jan/2025 10:43:08] "GET /test_bucket?list-type=2&max-keys=1&encoding-type=url HTTP/1.1" 200 -
=================================================================================================================== short test summary info ===================================================================================================================
ERROR spatialpandas/tests/test_parquet_s3.py::test_read_parquet_dask_remote[glob_parquet] - UserWarning: Dask annotations {'retries': 5} detected. Annotations will be ignored when using query-planning.
====================================================================================================================== 1 error in 1.91s =======================================================================================================================
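
For reference, the suppression added in the hunk above can look roughly like this. This is a sketch only: the diff truncates the actual `filterwarnings` arguments, and the message regex here is assumed to match the warning text shown in the log.

```python
import warnings

with warnings.catch_warnings():
    # "message" is a regex matched against the start of the warning
    # text, so this only silences the query-planning annotations warning.
    warnings.filterwarnings(
        "ignore",
        message="Dask annotations .* detected",
        category=UserWarning,
    )
    ...  # the wrapped dask.dataframe to_parquet call goes here
```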

@hoxbro (Member, Author) commented Jan 31, 2025

I have tested a dev release with HoloViews and Datashader, and both test suites passed.

@hoxbro marked this pull request as ready for review on January 31, 2025, 16:15
@@ -86,6 +81,9 @@ test-unit = 'pytest spatialpandas/tests -n logical --dist loadgroup'
[feature.test-example.dependencies]
nbval = "*"

[feature.test-example.activation.env]
DASK_SCHEDULER = "single-threaded"

This is to try to avoid this kind of error (which also happens on main):

[screenshot of the error]
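
For anyone reproducing this outside pixi: dask loads `DASK_`-prefixed environment variables into its config, so `DASK_SCHEDULER=single-threaded` maps to the `scheduler` config key. The equivalent in code, assuming the notebook failures come from the default threaded scheduler:

```python
import dask

# Force the synchronous scheduler inside the context; equivalent to
# exporting DASK_SCHEDULER=single-threaded before running the tests.
with dask.config.set(scheduler="single-threaded"):
    ...  # run the example notebooks' dask computations here
```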

-    'pandas',
-    'pyarrow >=10',
+    'pandas >=2.0',
+    'pyarrow >=14.0.1',

These two are part of dask's minimum dependencies.

Successfully merging this pull request may close these issues:

- Cannot import spatialpandas.dask.DaskGeoDataFrame
- Support dask-expr DataFrame