Commit

deploy: cc5ce8d
shanicetbailey committed Jun 20, 2024
1 parent b2a70fc commit 6016e2e
Showing 5 changed files with 155 additions and 137 deletions.
38 changes: 24 additions & 14 deletions _sources/notebooks/cfc-ocean.ipynb
Original file line number Diff line number Diff line change
Expand Up @@ -40,7 +40,9 @@
},
{
"cell_type": "markdown",
"metadata": {},
"metadata": {
"jp-MarkdownHeadingCollapsed": true
},
"source": [
"### Prerequisites\n",
"Some relavent knowledge on how to use certain packages (e.g. `xarray`, `matplotlib`) would be helpful to you in understanding this tutorial. If you need a refresher on how to employ these packages please refer to the [Pythia Foundations](https://foundations.projectpythia.org/landing-page.html) page. Also, please refer to the Project Pythia's [CMIP6 cookbook](https://projectpythia.org/cmip6-cookbook/README.html) page to familiarize yourself on how to ingest CMIP6 data.\n",
Expand Down Expand Up @@ -96,6 +98,8 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"**We will need Dask to load in our data into memory to be able to run the rest of the notebook smoothly. Let's initiate our Dask Client here.**\n",
"\n",
"To initiate your ```Client```, navigate to the Dask ![<Dask>](dask_icon_v2.ico \"Dask logo\") tab in the far-left sidebar and click on the `+New` button in the lower left of the sidebar window (you can `Scale` to 8 workers to make the computation faster). Once the cluster is active, a `<>` button will appear left of the `Scale` button . Click on it and it will paste a cell with the Client code (like the one below) - run that and delete this old cell. Click on the `Launch dashboard in JupyterLab` button and the Task Stream, Workers Memory and Progress windows will open. In order to get them to render you have to replace `tcp://127.0.0.1:` with `proxy` in the searchbar at the top of the sidebar window"
]
},
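The sidebar steps above can also be done programmatically; here is a minimal in-process sketch (the worker and thread counts are illustrative, not the notebook's settings):

```python
from dask.distributed import Client

# Start an in-process cluster rather than using the JupyterLab Dask tab;
# worker/thread counts below are hypothetical, for demonstration only.
client = Client(processes=False, n_workers=1, threads_per_worker=2)
print(client.dashboard_link)  # open this to see the Task Stream windows
client.close()
```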
Expand Down Expand Up @@ -1739,7 +1743,7 @@
}
],
"source": [
"# get the path to a specific zarr store (the first one from the dataframe above)\n",
"# get the path to a specific zarr store (the last row from the dataframe above)\n",
"zstore = df_tos['zstore'].values[-1]\n",
"print(zstore)\n",
"\n",
Expand Down Expand Up @@ -1785,7 +1789,7 @@
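The `values[-1]` indexing grabs the path in the last catalog row; a toy sketch with a hypothetical two-row dataframe (the store paths below are made up) shows the pattern:

```python
import pandas as pd

# Hypothetical mini-catalog mimicking the CMIP6 dataframe above
df_tos = pd.DataFrame({
    "source_id": ["GFDL-CM4", "GFDL-CM4"],
    "zstore": ["gs://bucket/run1/tos/", "gs://bucket/run2/tos/"],
})
zstore = df_tos["zstore"].values[-1]  # path in the last row
print(zstore)
```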
}
],
"source": [
"# load ocean basin data\n",
"# load ocean basin data to use for masking purposes in the cells below\n",
"basins = regionmask.defined_regions.natural_earth_v4_1_0.ocean_basins_50\n",
"basins.plot(add_ocean=False, add_label=False)"
]
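Under the hood, a basin mask is just an integer field, so selecting one basin reduces to an equality test; a numpy sketch with made-up basin IDs:

```python
import numpy as np

# Synthetic 3x3 basin mask: 0 = North Atlantic, 1 = Pacific (hypothetical IDs)
basin_mask = np.array([[0, 0, 1],
                       [0, 1, 1],
                       [0, 0, 1]])
sst = np.full((3, 3), 28.0)
natl_sst = np.where(basin_mask == 0, sst, np.nan)  # keep Atlantic points only
```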
Expand Down Expand Up @@ -2390,6 +2394,8 @@
"metadata": {},
"outputs": [],
"source": [
"#Mask the SST to keep only the Caribbean/Gulf of Mexico region -\n",
"#this excludes the Pacific ocean side of our boxed region\n",
"natl = 0\n",
"sst = ds.tos.where(basin_mask==natl)\n",
"carib = dict(x=slice(-98, -55), y=slice(8, 31), time=slice('1980', '2015'))\n",
Expand All @@ -2402,6 +2408,7 @@
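The `.where` / `.sel(slice)` combination above can be exercised on a tiny synthetic grid (all coordinates and values below are made up):

```python
import numpy as np
import xarray as xr

# Synthetic SST field on a coarse lon(x)/lat(y) grid
x = np.arange(-100, -50, 10)
y = np.arange(0, 40, 10)
sst = xr.DataArray(28.0 * np.ones((len(y), len(x))),
                   coords={"y": y, "x": x}, dims=("y", "x"))
basin_mask = xr.zeros_like(sst)      # 0 = Atlantic everywhere ...
basin_mask[:, 0] = 1                 # ... except a pretend-Pacific west column
masked = sst.where(basin_mask == 0)  # NaN outside the Atlantic
carib = masked.sel(x=slice(-98, -55), y=slice(8, 31))
```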
"metadata": {},
"outputs": [],
"source": [
"#manipulated chunking to optimize run\n",
"csst = csst_unchunked.chunk({'time':-1, 'x':1,'y':1}).load()"
]
},
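Rechunking to `{'time': -1, 'x': 1, 'y': 1}` puts each grid point's entire time series into a single chunk, which suits per-point operations along time; a dask.array sketch with made-up sizes:

```python
import dask.array as da

# Synthetic (time, y, x) array, initially chunked along time
arr = da.zeros((100, 4, 4), chunks=(10, 4, 4))
rechunked = arr.rechunk({0: -1, 1: 1, 2: 1})  # -1 means one chunk on that axis
print(rechunked.chunks)
```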
Expand Down Expand Up @@ -3091,6 +3098,7 @@
"metadata": {},
"outputs": [],
"source": [
"#Create a trend line to visualize the increase in SST over time\n",
"from scipy.stats import linregress\n",
"lr = linregress(np.arange(0,12775), csst.mean(['x', 'y']).fillna(0.))\n",
"trend = (lr[0]*np.arange(0, 12775) + lr[1])"
Expand Down Expand Up @@ -4303,6 +4311,7 @@
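`linregress` fits a slope and intercept against an integer day index; a self-contained sketch on a synthetic series with a known trend (the values below are hypothetical, not model output):

```python
import numpy as np
from scipy.stats import linregress

# Synthetic daily SST series with a known warming trend
t = np.arange(365)
csst_mean = 26.0 + 0.001 * t          # 0.001 degC/day upward trend
lr = linregress(t, csst_mean)
trend = lr.slope * t + lr.intercept   # same construction as the cell above
```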
"metadata": {},
"outputs": [],
"source": [
"#create a land mask for plotting purposes\n",
"land_mask = ds.tos.sel(**carib).isel(time=0).isnull()"
]
},
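`isnull()` turns the NaN-over-land convention into a boolean mask; a two-by-two sketch with made-up values:

```python
import numpy as np
import xarray as xr

# SST snapshot where land points are NaN (hypothetical values)
tos = xr.DataArray([[27.5, np.nan],
                    [28.1, 27.9]], dims=("y", "x"))
land_mask = tos.isnull()  # True where there is no ocean value
```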
Expand Down Expand Up @@ -4350,36 +4359,36 @@
"source": [
"### El Niño 3.4 index\n",
"\n",
"This section is borrowed from one of Project Pythia's [Intro to Pandas](https://foundations.projectpythia.org/core/pandas/pandas.html#quick-plots-of-your-data) notebook"
"This section is borrowed from one of Project Pythia's [Intro to Pandas](https://foundations.projectpythia.org/core/pandas/pandas.html#) notebook"
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {
"collapsed": true,
"jupyter": {
"outputs_hidden": true
}
},
"execution_count": 1,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Requirement already satisfied: pythia_datasets in /srv/conda/envs/notebook/lib/python3.11/site-packages (2021.9.21)\n",
"Collecting pythia_datasets\n",
" Using cached pythia_datasets-2021.9.21-py3-none-any.whl.metadata (5.3 kB)\n",
"Requirement already satisfied: pooch in /srv/conda/envs/notebook/lib/python3.11/site-packages (from pythia_datasets) (1.8.1)\n",
"Requirement already satisfied: platformdirs>=2.5.0 in /srv/conda/envs/notebook/lib/python3.11/site-packages (from pooch->pythia_datasets) (4.2.2)\n",
"Requirement already satisfied: packaging>=20.0 in /srv/conda/envs/notebook/lib/python3.11/site-packages (from pooch->pythia_datasets) (24.0)\n",
"Requirement already satisfied: requests>=2.19.0 in /srv/conda/envs/notebook/lib/python3.11/site-packages (from pooch->pythia_datasets) (2.32.1)\n",
"Requirement already satisfied: charset-normalizer<4,>=2 in /srv/conda/envs/notebook/lib/python3.11/site-packages (from requests>=2.19.0->pooch->pythia_datasets) (3.3.2)\n",
"Requirement already satisfied: idna<4,>=2.5 in /srv/conda/envs/notebook/lib/python3.11/site-packages (from requests>=2.19.0->pooch->pythia_datasets) (3.7)\n",
"Requirement already satisfied: urllib3<3,>=1.21.1 in /srv/conda/envs/notebook/lib/python3.11/site-packages (from requests>=2.19.0->pooch->pythia_datasets) (1.26.18)\n",
"Requirement already satisfied: certifi>=2017.4.17 in /srv/conda/envs/notebook/lib/python3.11/site-packages (from requests>=2.19.0->pooch->pythia_datasets) (2024.2.2)\n"
"Requirement already satisfied: certifi>=2017.4.17 in /srv/conda/envs/notebook/lib/python3.11/site-packages (from requests>=2.19.0->pooch->pythia_datasets) (2024.2.2)\n",
"Using cached pythia_datasets-2021.9.21-py3-none-any.whl (8.7 kB)\n",
"Installing collected packages: pythia_datasets\n",
"Successfully installed pythia_datasets-2021.9.21\n"
]
}
],
"source": [
"# Install the ENSO data from Pythia's data library\n",
"! pip install pythia_datasets"
]
},
Expand Down Expand Up @@ -4624,6 +4633,7 @@
"metadata": {},
"outputs": [],
"source": [
"#Select the Nino3.4 data column\n",
"oni_time_unmatched = df['Nino34anom']\n",
"oni_pd = oni_time_unmatched.loc['1982':'2014']\n",
"oni = oni_pd.to_xarray()"
Expand Down Expand Up @@ -4773,7 +4783,7 @@
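Selecting the column, trimming the years with label-based `.loc`, and converting to xarray can be sketched on a hypothetical monthly series (values and dates below are made up):

```python
import pandas as pd

# Hypothetical monthly anomaly series indexed by time
idx = pd.date_range("1981-12-01", periods=5, freq="MS")
df = pd.DataFrame({"Nino34anom": [0.1, -0.2, 0.3, 0.0, 0.5]}, index=idx)
oni_pd = df["Nino34anom"].loc["1982":"1982"]  # keep only year 1982
oni = oni_pd.to_xarray()
```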
"metadata": {},
"source": [
"### Summary\n",
"In this notebook we used CMIP6 NOAA/GFDL CM4 data to identify the 99th percentile heat extremes that persisted for $\\geq$10 days. We also plotted the Nino3.4 index and overlaid that timeseries with the extreme SST timeseries. (Future work includes running basic correlation analysis to discern any relationship with ENSO and the extreme SSTs.)\n",
"In this notebook we used CMIP6 NOAA/GFDL CM4 data to identify the 99th percentile heat extremes that persisted for $\\geq$10 days. We also plotted the Nino3.4 index and overlaid that timeseries with the extreme SST timeseries. (Future work includes running basic correlation analysis to discern any relationship with ENSO and the extreme SSTs)\n",
"\n",
"#### What's next?\n",
"Next, we will look at environmental changes in the CMIP6 HighResMIP simulations and calculate changes in genesis potential index, an indicator of tropical cyclone activity for the Caribbean region."
Expand Down
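The extremes definition in the summary (99th-percentile exceedance lasting $\geq$10 days) can be sketched with a run-length scan over a synthetic series (the threshold and spell length follow the notebook; the data are made up):

```python
import numpy as np

# Synthetic daily SSTs with one implanted 15-day warm spell
rng = np.random.default_rng(0)
sst = rng.normal(28.0, 0.5, size=10_000)
sst[200:215] += 3.0                     # days 200-214 run hot

hot = (sst > np.percentile(sst, 99)).astype(np.int8)
edges = np.diff(np.concatenate(([0], hot, [0])))
starts = np.where(edges == 1)[0]        # first day of each exceedance run
ends = np.where(edges == -1)[0]         # one past the last day of each run
events = [(s, e) for s, e in zip(starts, ends) if e - s >= 10]
```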