Prepare for v0.1.16 Flamingos Release 🦩 #379

Merged: 29 commits, Aug 26, 2024
Commits
baa3033  Update whats-new.rst (gmaze, Aug 19, 2024)
0c227bf  Update setup.py (gmaze, Aug 19, 2024)
e52a9ea  Codespell (gmaze, Aug 19, 2024)
755de3c  black + flake8 (gmaze, Aug 19, 2024)
c1fc7c0  Delete utilities.py (gmaze, Aug 19, 2024)
7a98bc1  Add support for python 3.10, drop 3.8 (gmaze, Aug 19, 2024)
a3df885  Improve show_versions following support for py 3.10 (gmaze, Aug 20, 2024)
f4fb383  More 3.10 CI tests support (gmaze, Aug 20, 2024)
844b415  fix 3.10 env (gmaze, Aug 20, 2024)
860f3ec  Update casting.py (gmaze, Aug 20, 2024)
36708d5  xarray < 2024.3 (gmaze, Aug 20, 2024)
90e6625  xarray < 2024.3 in pinned env (gmaze, Aug 20, 2024)
c062ee0  Update rtd env (gmaze, Aug 20, 2024)
b937fbd  Update impact.rst (gmaze, Aug 20, 2024)
3c25411  Update readthedocs.yml (gmaze, Aug 20, 2024)
cfe17db  Remove py 3.8 env files (gmaze, Aug 20, 2024)
808e3fe  Update requirements.txt (gmaze, Aug 20, 2024)
c859335  Update requirements.txt (gmaze, Aug 21, 2024)
2554876  Upgrade py3.9 pinned versions (gmaze, Aug 21, 2024)
98cd7b6  Misc (gmaze, Aug 21, 2024)
74b54a5  Update install.rst (gmaze, Aug 21, 2024)
e7b210d  Merge branch 'master' into releasev0.1.16 (gmaze, Aug 21, 2024)
3011406  Merge branch 'master' into releasev0.1.16 (gmaze, Aug 22, 2024)
a11d0ca  Merge branch 'master' into releasev0.1.16 (gmaze, Aug 22, 2024)
94e17e3  Merge branch 'master' into releasev0.1.16 (gmaze, Aug 22, 2024)
d8a25d3  Restore use of mocked server (gmaze, Aug 22, 2024)
8c75ef4  Merge branch 'master' into releasev0.1.16 (gmaze, Aug 23, 2024)
17e2b52  Pin xarray < 2024.3 (gmaze, Aug 23, 2024)
e09eeb2  Merge branch 'master' into releasev0.1.16 (gmaze, Aug 23, 2024)
2 changes: 1 addition & 1 deletion .codespellrc
@@ -1,5 +1,5 @@
[codespell]
skip = *.nc,*.ipynb,./local_work,./float_source,./binder,./.github,*.log,./.git,./docs/_build,./docs/_static,./argopy/tests/test_data,./build,./docs/mycache_folder,./docs/examples/cache_bgc
skip = *.nc,*.ipynb,./local_work,./float_source,./binder,./.github,*.log,./.git,./docs/_build,./docs/_static,./argopy/tests/test_data,./build,./docs/mycache_folder,./docs/examples/cache_bgc,./argopy/static/assets/*.json
count =
quiet-level = 3
ignore-words-list = PRES, pres, idel
3 changes: 2 additions & 1 deletion .flake8
@@ -10,7 +10,8 @@ ignore =
E501,
# line break before binary operator
W503

# whitespace before ':' (https://black.readthedocs.io/en/stable/guides/using_black_with_other_tools.html#e203)
E203
exclude =
# No need to traverse our git directory
.git,
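For context on the new E203 entry, here is a minimal, standalone illustration (not part of the diff) of the conflict it resolves: black inserts a space before ":" when a slice bound is a compound expression, which flake8 would otherwise report as E203.

```python
# Illustrative snippet only: black formats complex slice bounds with spaces
# around ":", which flake8 flags as "E203 whitespace before ':'" unless ignored.
lower, upper, offset = 2, 8, 1
ham = list(range(20))

# black's preferred formatting of a compound slice expression:
piece = ham[lower + offset : upper + offset]  # flake8 would report E203 here
print(piece)  # [3, 4, 5, 6, 7, 8]
```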
4 changes: 2 additions & 2 deletions .github/workflows/pytests-upstream.yml
@@ -74,7 +74,7 @@ jobs:
strategy:
fail-fast: true
matrix:
python-version: ["3.8", "3.9"]
python-version: ["3.9", "3.10"]
os: ["ubuntu-latest", "macos-latest", "windows-latest"]

steps:
@@ -200,7 +200,7 @@ jobs:
strategy:
fail-fast: true
matrix:
python-version: ["3.8", "3.9"]
python-version: ["3.9", "3.10"]
os: ["ubuntu-latest", "macos-latest", "windows-latest"]

steps:
4 changes: 2 additions & 2 deletions .github/workflows/pytests.yml
@@ -52,7 +52,7 @@ jobs:
max-parallel: 12
fail-fast: false
matrix:
python-version: ["3.8", "3.9"]
python-version: ["3.9", "3.10"]
os: ["ubuntu-latest", "windows-latest", "macos-latest"]
experimental: [false]

@@ -174,7 +174,7 @@ jobs:
max-parallel: 12
fail-fast: false
matrix:
python-version: ["3.8", "3.9"]
python-version: ["3.9", "3.10"]
os: ["ubuntu-latest", "macos-latest", "windows-latest"]
experimental: [false]

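The matrix changes above drop CPython 3.8 and add 3.10, so 3.9 becomes the minimum tested interpreter. A hypothetical runtime guard (illustrative only, not code from this PR) would look like:

```python
# Illustrative only: not from the argopy code base.
import sys

if sys.version_info < (3, 9):
    raise RuntimeError("This release is tested on Python 3.9 and 3.10 only")
```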
2 changes: 1 addition & 1 deletion CODE_OF_CONDUCT.md
@@ -6,7 +6,7 @@ In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, sex characteristics, gender identity and expression,
level of experience, education, socio-economic status, nationality, personal
level of experience, education, socioeconomic status, nationality, personal
appearance, race, religion, or sexual identity and orientation.

## Our Standards
2 changes: 0 additions & 2 deletions argopy/__init__.py
@@ -29,7 +29,6 @@

# Other Import
# from . import utils # noqa: E402
from . import utilities # noqa: E402 # being deprecated until 0.1.15, then remove
from . import stores # noqa: E402
from . import errors # noqa: E402
from . import plot # noqa: E402
@@ -72,7 +71,6 @@
"ArgoDOI", # Class

# Submodules:
"utilities", # being deprecated until 0.1.15, then remove
# "utils",
"errors",
"plot",
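Since the `Delete utilities.py` commit removes the long-deprecated submodule, imports of it now fail. A defensive pattern for downstream code (illustrative only; the replacement module name is an assumption based on the commented `utils` entries above) could be:

```python
# Illustrative only: argopy.utilities is removed in this release.
try:
    from argopy import utilities  # noqa: F401  (gone as of v0.1.16)
except ImportError:
    # The helpers moved elsewhere in the package (argopy.utils in recent
    # versions); adjust downstream imports accordingly.
    utilities = None
```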
74 changes: 38 additions & 36 deletions argopy/data_fetchers/argovis_data.py
@@ -1,9 +1,3 @@
#!/bin/env python
# -*coding: UTF-8 -*-
#
# Argo data fetcher for Argovis.
#

import numpy as np
import pandas as pd
import xarray as xr
@@ -23,7 +17,7 @@
access_points = ["wmo", "box"]
exit_formats = ["xarray"]
dataset_ids = ["phy"] # First is default
api_server = "https://argovis-api.colorado.edu"
api_server = "https://argovis-api.colorado.edu"
api_server_check = "https://argovis-api.colorado.edu/ping"

log = logging.getLogger("argopy.argovis.data")
@@ -58,7 +52,7 @@ def __init__(
chunks: str = "auto",
chunks_maxsize: dict = {},
api_timeout: int = 0,
**kwargs
**kwargs,
):
"""Instantiate an Argovis Argo data loader

@@ -95,7 +89,7 @@ def __init__(
"cachedir": cachedir,
"timeout": timeout,
# "size_policy": "head", # deprecated
"client_kwargs": {"headers": {'x-argokey': OPTIONS['argovis_api_key']}},
"client_kwargs": {"headers": {"x-argokey": OPTIONS["argovis_api_key"]}},
}
self.fs = kwargs["fs"] if "fs" in kwargs else httpstore(**self.store_opts)

@@ -134,9 +128,12 @@ def __repr__(self):
summary = ["<datafetcher.argovis>"]
summary.append("Name: %s" % self.definition)
summary.append("API: %s" % api_server)
api_key = self.fs.fs.client_kwargs['headers']['x-argokey']
if api_key == DEFAULT['argovis_api_key']:
summary.append("API KEY: '%s' (get a free key at https://argovis-keygen.colorado.edu)" % api_key)
api_key = self.fs.fs.client_kwargs["headers"]["x-argokey"]
if api_key == DEFAULT["argovis_api_key"]:
summary.append(
"API KEY: '%s' (get a free key at https://argovis-keygen.colorado.edu)"
% api_key
)
else:
summary.append("API KEY: '%s'" % api_key)
summary.append("Domain: %s" % format_oneline(self.cname()))
@@ -286,24 +283,32 @@ def json2dataframe(self, profiles):
for profile in data:
# construct metadata dictionary that will be repeated for each level
metadict = {
'date': profile['timestamp'],
'date_qc': profile['timestamp_argoqc'],
'lat': profile['geolocation']['coordinates'][1],
'lon': profile['geolocation']['coordinates'][0],
'cycle_number': profile['cycle_number'],
'DATA_MODE': profile['data_info'][2][0][1],
'DIRECTION': profile['profile_direction'],
'platform_number': profile['_id'].split('_')[0],
'position_qc': profile['geolocation_argoqc'],
'index': 0
"date": profile["timestamp"],
"date_qc": profile["timestamp_argoqc"],
"lat": profile["geolocation"]["coordinates"][1],
"lon": profile["geolocation"]["coordinates"][0],
"cycle_number": profile["cycle_number"],
"DATA_MODE": profile["data_info"][2][0][1],
"DIRECTION": profile["profile_direction"],
"platform_number": profile["_id"].split("_")[0],
"position_qc": profile["geolocation_argoqc"],
"index": 0,
}
# construct a row for each level in the profile
for i in range(len(profile['data'][profile['data_info'][0].index('pressure')])):
for i in range(
len(profile["data"][profile["data_info"][0].index("pressure")])
):
row = {
'temp': profile['data'][profile['data_info'][0].index('temperature')][i],
'pres': profile['data'][profile['data_info'][0].index('pressure')][i],
'psal': profile['data'][profile['data_info'][0].index('salinity')][i],
**metadict
"temp": profile["data"][
profile["data_info"][0].index("temperature")
][i],
"pres": profile["data"][profile["data_info"][0].index("pressure")][
i
],
"psal": profile["data"][profile["data_info"][0].index("salinity")][
i
],
**metadict,
}
rows.append(row)
df = pd.DataFrame(rows)
@@ -375,8 +380,8 @@ def to_xarray(self, errors: str = "ignore"):
ds.attrs["Fetched_from"] = self.server
try:
ds.attrs["Fetched_by"] = getpass.getuser()
except:
ds.attrs["Fetched_by"] = 'anonymous'
except: # noqa: E722
ds.attrs["Fetched_by"] = "anonymous"
ds.attrs["Fetched_date"] = pd.to_datetime("now", utc=True).strftime("%Y/%m/%d")
ds.attrs["Fetched_constraints"] = self.cname()
ds.attrs["Fetched_uri"] = self.uri
@@ -435,9 +440,9 @@ def init(self, WMO=[], CYC=None, **kwargs):
def get_url(self, wmo: int, cyc: int = None) -> str:
"""Return path toward the source file of a given wmo/cyc pair"""
if cyc is None:
return f'{self.server}/argo?platform={str(wmo)}&data=pressure,temperature,salinity'
return f"{self.server}/argo?platform={str(wmo)}&data=pressure,temperature,salinity"
else:
return f'{self.server}/argo?id={str(wmo)}_{str(cyc).zfill(3)}&data=pressure,temperature,salinity'
return f"{self.server}/argo?id={str(wmo)}_{str(cyc).zfill(3)}&data=pressure,temperature,salinity"

@property
def uri(self):
@@ -488,10 +493,7 @@ def init(self, box: list, **kwargs):

def get_url(self):
"""Return the URL used to download data"""
shape = [
[self.BOX[0], self.BOX[2]], # ll
[self.BOX[1], self.BOX[3]] # ur
]
shape = [[self.BOX[0], self.BOX[2]], [self.BOX[1], self.BOX[3]]] # ll # ur
strShape = str(shape).replace(" ", "")
url = self.server + "/argo?data=pressure,temperature,salinity&box=" + strShape
url += "&startDate={}".format(
@@ -558,4 +560,4 @@ def uri(self):
for box in boxes:
urls.append(Fetch_box(box=box, ds=self.dataset_id).get_url())

return self.url_encode(urls)
return self.url_encode(urls)
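To make the reworked `json2dataframe` indexing above easier to follow, here is a self-contained sketch of the same transformation applied to a single, fabricated Argovis-style record (field names mirror the code in the diff; values and the float/cycle identifiers are made up for illustration, and the record is trimmed to the fields actually read):

```python
import pandas as pd

# Fabricated, trimmed record; field names mirror the Argovis fields that
# json2dataframe reads, the values are illustrative only.
profile = {
    "_id": "6902746_034",
    "timestamp": "2024-06-01T00:00:00Z",
    "geolocation": {"coordinates": [-30.0, 45.0]},  # [lon, lat]
    "cycle_number": 34,
    "data_info": [["pressure", "temperature", "salinity"]],
    "data": [[5.0, 10.0], [12.3, 12.1], [35.1, 35.2]],  # one list per variable
}

names = profile["data_info"][0]  # variable names, in the same order as "data"
metadict = {
    "date": profile["timestamp"],
    "lat": profile["geolocation"]["coordinates"][1],
    "lon": profile["geolocation"]["coordinates"][0],
    "cycle_number": profile["cycle_number"],
    "platform_number": profile["_id"].split("_")[0],
}
rows = [
    {
        "pres": profile["data"][names.index("pressure")][i],
        "temp": profile["data"][names.index("temperature")][i],
        "psal": profile["data"][names.index("salinity")][i],
        **metadict,
    }
    for i in range(len(profile["data"][names.index("pressure")]))  # one row per level
]
print(pd.DataFrame(rows))  # 2 pressure levels -> 2 rows
```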
36 changes: 23 additions & 13 deletions argopy/data_fetchers/erddap_data.py
@@ -27,8 +27,10 @@
from ..utils.format import format_oneline
from ..stores import httpstore
from ..errors import ErddapServerError, DataNotFound
from ..stores import indexstore_pd as ArgoIndex # make sure we work with the Pandas index store
from ..utils import is_list_of_strings, to_list,Chunker
from ..stores import (
indexstore_pd as ArgoIndex,
) # make sure we work with the Pandas index store
from ..utils import is_list_of_strings, to_list, Chunker
from .proto import ArgoDataFetcherProto


@@ -159,11 +161,15 @@ def __init__( # noqa: C901
# This will be used to:
# - retrieve the list of BGC variables to ask the erddap server
# - get <param>_data_mode information because we can't get it from the server
self.indexfs = kwargs['indexfs'] if 'indexfs' in kwargs else ArgoIndex(
index_file='argo_synthetic-profile_index.txt', # the only available in the erddap
cache=kwargs['cache_index'] if 'cache_index' in kwargs else cache,
cachedir=cachedir,
timeout=timeout,
self.indexfs = (
kwargs["indexfs"]
if "indexfs" in kwargs
else ArgoIndex(
index_file="argo_synthetic-profile_index.txt", # the only available in the erddap
cache=kwargs["cache_index"] if "cache_index" in kwargs else cache,
cachedir=cachedir,
timeout=timeout,
)
)

# To handle bugs in the erddap server, we need the list of parameters on the server:
@@ -615,7 +621,9 @@ def getNfromncHeader(url):
return N


def post_process(self, this_ds, add_dm: bool = True, URI: list = None): # noqa: C901
def post_process(
self, this_ds, add_dm: bool = True, URI: list = None
): # noqa: C901
"""Post-process a xarray.DataSet created from a netcdf erddap response

This method can also be applied on a regular dataset to re-enforce format compliance
@@ -667,8 +675,8 @@ def post_process(self, this_ds, add_dm: bool = True, URI: list = None): # noqa:

# In the case of a parallel download, this is a trick to preserve the chunk uri in the chunk dataset:
# (otherwise all chunks have the same list of uri)
Fetched_url = this_ds.attrs.get('Fetched_url', False)
Fetched_constraints = this_ds.attrs.get('Fetched_constraints', False)
Fetched_url = this_ds.attrs.get("Fetched_url", False)
Fetched_constraints = this_ds.attrs.get("Fetched_constraints", False)

# Finally overwrite erddap attributes with those from argopy:
raw_attrs = this_ds.attrs.copy()
@@ -691,12 +699,14 @@ def post_process(self, this_ds, add_dm: bool = True, URI: list = None): # noqa:
this_ds.attrs["Fetched_from"] = self.erddap.server
try:
this_ds.attrs["Fetched_by"] = getpass.getuser()
except:
this_ds.attrs["Fetched_by"] = 'anonymous'
except: # noqa: E722
this_ds.attrs["Fetched_by"] = "anonymous"
this_ds.attrs["Fetched_date"] = pd.to_datetime("now", utc=True).strftime(
"%Y/%m/%d"
)
this_ds.attrs["Fetched_constraints"] = self.cname() if not Fetched_constraints else Fetched_constraints
this_ds.attrs["Fetched_constraints"] = (
self.cname() if not Fetched_constraints else Fetched_constraints
)
this_ds.attrs["Fetched_uri"] = URI if not Fetched_url else Fetched_url
this_ds = this_ds[np.sort(this_ds.data_vars)]

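The `indexfs` change above reworks a kwargs-or-default idiom. As a standalone sketch of that idiom (simplified and flattened, not argopy's actual signature; the `cachedir` default and the helper name are assumptions):

```python
from argopy.stores import indexstore_pd as ArgoIndex  # same store as in the diff


def resolve_index(cache=False, cachedir="", timeout=60, **kwargs):
    """Return a caller-supplied index store, or build the default one lazily."""
    if "indexfs" in kwargs:
        return kwargs["indexfs"]
    return ArgoIndex(
        index_file="argo_synthetic-profile_index.txt",  # the only index the erddap serves
        cache=kwargs.get("cache_index", cache),
        cachedir=cachedir,
        timeout=timeout,
    )
```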
4 changes: 2 additions & 2 deletions argopy/data_fetchers/gdacftp_data.py
@@ -283,7 +283,7 @@ def _preprocess_multiprof(self, ds):
ds.attrs["Fetched_from"] = self.server
try:
ds.attrs["Fetched_by"] = getpass.getuser()
except:
except: # noqa: E722
ds.attrs["Fetched_by"] = 'anonymous'
ds.attrs["Fetched_date"] = pd.to_datetime("now", utc=True).strftime("%Y/%m/%d")
ds.attrs["Fetched_constraints"] = self.cname()
@@ -352,7 +352,7 @@ def to_xarray(self, errors: str = "ignore"):
ds.attrs["Fetched_from"] = self.server
try:
ds.attrs["Fetched_by"] = getpass.getuser()
except:
except: # noqa: E722
ds.attrs["Fetched_by"] = 'anonymous'
ds.attrs["Fetched_date"] = pd.to_datetime("now", utc=True).strftime("%Y/%m/%d")
ds.attrs["Fetched_constraints"] = self.cname()
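The two hunks above only append `# noqa: E722` to bare `except:` clauses guarding `getpass.getuser()`. As a self-contained illustration of the attribute-stamping pattern being silenced (shown here with `except Exception` as the stricter alternative; the argopy code itself keeps the bare `except`):

```python
import getpass

import pandas as pd

attrs = {}
try:
    # getuser() can raise on systems with no resolvable login name
    attrs["Fetched_by"] = getpass.getuser()
except Exception:  # the argopy code uses a bare `except:` with `# noqa: E722`
    attrs["Fetched_by"] = "anonymous"
attrs["Fetched_date"] = pd.to_datetime("now", utc=True).strftime("%Y/%m/%d")
print(attrs)
```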