Merge pull request #94 from COSIMA/separate-ic-and-boundary
Separate ic and boundary
ashjbarnes authored Feb 16, 2024
2 parents 2904cb2 + 3f8f0ed commit 3e98442
Showing 10 changed files with 653 additions and 309 deletions.
12 changes: 11 additions & 1 deletion .github/workflows/testing.yml
@@ -17,6 +17,7 @@ jobs:
testing:
needs: formatting
runs-on: ubuntu-latest
container: ghcr.io/cosima/regional-test-env:updated
defaults:
run:
shell: bash -el {0}
@@ -29,6 +30,7 @@ jobs:
- name: Set up Python ${{ matrix.python-version }}
uses: conda-incubator/setup-miniconda@v2
with:
miniconda-version: "latest"
auto-update-conda: true
auto-activate-base: false
environment-file: environment-ci.yml
@@ -38,14 +40,22 @@
python -m pip install .
- name: Install pytest
run: |
python -m pip install pytest pytest-cov
python -m pip install pytest pytest-cov nbval
- name: Test with pytest
run: |
if [[ "${{ matrix.python-version }}" == "3.10" ]]; then
python -m pytest --cov=regional_mom6 --cov-report=xml tests/
else
python -m pytest tests/
fi
- name: Test the example notebook
run: |
ln -s /data demos/PATH_TO_GLORYS_DATA
ln -s /data demos/PATH_TO_GEBCO_FILE
ln -s /build/FRE-NCtools/tools demos/PATH_TO_FRE_TOOLS
ln -s ../ demos/PATH_TO_REGIONAL_MOM6_CODE
python -m pytest --nbval demos/reanalysis-forced.ipynb --nbval-current-env --cov=regional_mom6 --cov-report=xml tests/
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v3
if: ${{ matrix.python-version == '3.10' }}
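The new "Test the example notebook" step symlinks the test container's data and tool locations onto the placeholder paths that the demo notebook hard-codes, then runs the notebook under `nbval`. A minimal sketch of the same linking pattern, using illustrative `/tmp` paths rather than the container's real `/data` and `/build` locations:

```python
import os
from pathlib import Path

# Stand-in for the container's real data location (illustrative path)
data = Path("/tmp/nbval_demo/data")
data.mkdir(parents=True, exist_ok=True)
(data / "ic_unprocessed.nc").write_text("dummy")

# Link the placeholder path the notebook references onto the real data,
# mirroring `ln -s /data demos/PATH_TO_GLORYS_DATA` in the workflow
demos = Path("/tmp/nbval_demo/demos")
demos.mkdir(parents=True, exist_ok=True)
link = demos / "PATH_TO_GLORYS_DATA"
if link.is_symlink():
    link.unlink()
link.symlink_to(data)

# The notebook can now resolve its hard-coded placeholder path
print((link / "ic_unprocessed.nc").read_text())  # dummy
```

This keeps the notebook itself unchanged between local runs and CI; only the symlink targets differ.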
1 change: 1 addition & 0 deletions .gitignore
@@ -7,3 +7,4 @@ regional_mom6.egg-info
.pytest_cache
.env
env
docker
9 changes: 9 additions & 0 deletions README.md
@@ -39,6 +39,15 @@ Source](https://earthsystemmodeling.org/esmpy_doc/release/latest/html/install.html)
instructions to do this in a Conda-free way. With `esmpy` available, you can then install
`regional_mom6` via pip. If your environment doesn't yet have pip, then `conda install pip` should do the job.

It's recommended that you get started with the example notebooks. For these you'll want to clone the entire repository so that you have access to the `demos` folder, which contains template configuration files for MOM6 and example notebooks that walk you through the package.

```bash
git clone git@github.com:COSIMA/regional-mom6.git
cd regional-mom6
pip install .
```

Alternatively, you can install just the Python package. Some functionality will be missing in this case, namely the `setup_run_directory` function, since it relies on copying across and modifying the template files.

```bash
pip install git+https://github.com/COSIMA/regional-mom6.git
```
Empty file removed demos/README.md
27 changes: 20 additions & 7 deletions demos/access_om2-forced.ipynb
@@ -770,9 +770,8 @@
"metadata": {},
"outputs": [],
"source": [
"expt.ocean_forcing(\n",
" tmpdir, ## Path to ocean foring files\n",
" {\"time\":\"time\",\n",
"# Define a mapping from the MOM5 B grid variables and dimensions to the MOM6 C grid ones\n",
"ocean_varnames = {\"time\":\"time\",\n",
" \"yh\":\"yt_ocean\",\n",
" \"xh\":\"xt_ocean\",\n",
" \"xq\":\"xu_ocean\",\n",
@@ -781,10 +780,24 @@
" \"eta\":\"eta_t\",\n",
" \"u\":\"u\",\n",
" \"v\":\"v\",\n",
" \"tracers\":{\"salt\":\"salt\",\"temp\":\"temp\"}},\n",
" boundaries = [\"south\",\"north\",\"west\",\"east\"],\n",
" \"tracers\":{\"salt\":\"salt\",\"temp\":\"temp\"}}\n",
"\n",
"# Set up the initial condition\n",
"expt.initial_condition(\n",
" tmpdir, # The directory where the unprocessed initial condition is stored, as defined earlier\n",
" ocean_varnames,\n",
" gridtype=\"B\"\n",
" )"
" )\n",
"\n",
"# Now iterate through our four boundaries \n",
"for i,orientation in enumerate([\"south\",\"north\",\"west\",\"east\"]):\n",
" expt.rectangular_boundary(\n",
" tmpdir / (orientation + \"_unprocessed.nc\"),\n",
" ocean_varnames,\n",
" orientation, # Needs to know the cardinal direction of the boundary\n",
" i + 1, # Just a number to identify the boundary. Indexes from 1 \n",
" gridtype=\"B\"\n",
" )"
]
},
{
@@ -802,7 +815,7 @@
"metadata": {},
"outputs": [],
"source": [
"expt.FRE_tools((10,10))\n"
"expt.FRE_tools(layout = (10,10))\n"
]
},
{
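The updated notebook cell above is the heart of this PR: the single `ocean_forcing` call is split into one `initial_condition` call plus a `rectangular_boundary` call per edge, each tagged with a 1-based segment number. The orientation/segment pairing on its own looks like the sketch below (only the loop logic is shown; the `expt.rectangular_boundary` calls themselves need the package and the forcing files):

```python
# Pair each cardinal boundary with its 1-based segment number,
# matching the `i + 1` indexing in the notebook's boundary loop
boundaries = ["south", "north", "west", "east"]
segments = [(orientation, i + 1) for i, orientation in enumerate(boundaries)]
print(segments)  # [('south', 1), ('north', 2), ('west', 3), ('east', 4)]
```

Numbering from 1 matters because the segment number feeds MOM6's open-boundary segment naming, which indexes from 1.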
5 changes: 0 additions & 5 deletions demos/premade_run_directories/common_files/MOM_override
@@ -1,9 +1,4 @@
## Add override files here

!#override OBC_SEGMENT_001 = "J=0,I=0:N,ORLANSKI" !
!#override OBC_SEGMENT_002 = "J=0,I=0:N,ORLANSKI" !
!#override OBC_SEGMENT_003 = "J=0,I=0:N,ORLANSKI" !
!#override OBC_SEGMENT_004 = "J=0,I=0:N,ORLANSKI" !

#override DT=50
#override DT_THERM=300
61 changes: 42 additions & 19 deletions demos/reanalysis-forced.ipynb
@@ -37,8 +37,7 @@
"import regional_mom6 as rm\n",
"from pathlib import Path\n",
"from dask.distributed import Client\n",
"client = Client()\n",
"client"
"client = Client() # Start a dask cluster"
]
},
{
@@ -67,13 +66,13 @@
"daterange = [\"2003-01-01 00:00:00\", \"2003-01-05 00:00:00\"] ## 2003 is a good compromise for GLORYS and JRA forcing as they overlap. JRA ends in 2012, GLORYS starts in 1993\n",
"\n",
"## Place where all your input files go \n",
"inputdir = Path(f\"YOUR_PATH/mom6_inputdirs/{expt_name}/\")\n",
"inputdir = Path(f\"mom6_input_directories/{expt_name}/\")\n",
"\n",
"## Directory where you'll run the experiment from\n",
"rundir = Path(f\"YOUR_PATH/mom6_rundirs/{expt_name}/\")\n",
"rundir = Path(f\"mom6_run_directories/{expt_name}/\")\n",
"\n",
"## Directory where fre tools are stored \n",
"toolpath = Path(\"PATH_TO_COMPILED_FRE_TOOLS\") ## Compiled tools needed for construction of mask tables\n",
"toolpath = Path(\"PATH_TO_FRE_TOOLS\") ## Compiled tools needed for construction of mask tables\n",
"\n",
"## Path to where your raw ocean forcing files are stored\n",
"glorys_path = Path(\"PATH_TO_GLORYS_DATA\" )\n",
@@ -197,7 +196,12 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"tags": [
"nbval-ignore-output",
"nbval-skip"
]
},
"outputs": [],
"source": [
"expt.topog.depth.plot()"
@@ -222,9 +226,8 @@
"metadata": {},
"outputs": [],
"source": [
"expt.ocean_forcing(\n",
" glorys_path, ## Path to ocean foring files\n",
" {\"time\":\"time\",\n",
"# Define a mapping from the GLORYS variables and dimensions to the MOM6 ones\n",
"ocean_varnames = {\"time\":\"time\",\n",
" \"y\":\"latitude\",\n",
" \"x\":\"longitude\",\n",
" \"zl\":\"depth\",\n",
@@ -234,10 +237,24 @@
" \"tracers\":{\"salt\":\"so\",\n",
" \"temp\":\"thetao\"\n",
" }\n",
" },\n",
" boundaries = [\"south\",\"north\",\"west\",\"east\"],\n",
" gridtype=\"A\" ## Grid type. This is an Arakawa A grid sice velocities and tracers are all on the same points\n",
")"
" }\n",
"\n",
"# Set up the initial condition\n",
"expt.initial_condition(\n",
" glorys_path / \"ic_unprocessed.nc\", # The file containing the unprocessed initial condition, in the directory defined earlier\n",
" ocean_varnames,\n",
" gridtype=\"A\"\n",
" )\n",
"\n",
"# Now iterate through our four boundaries \n",
"for i,orientation in enumerate([\"south\",\"north\",\"west\",\"east\"]):\n",
" expt.rectangular_boundary(\n",
" glorys_path / (orientation + \"_unprocessed.nc\"),\n",
" ocean_varnames,\n",
" orientation, # Needs to know the cardinal direction of the boundary\n",
" i + 1, # Just a number to identify the boundary. Indexes from 1 \n",
" gridtype=\"A\"\n",
" )"
]
},
{
@@ -255,7 +272,7 @@
"metadata": {},
"outputs": [],
"source": [
"expt.FRE_tools((10,10)) ## Here the tuple defines the processor layout\n"
"expt.FRE_tools(layout=(10,10)) ## Here the tuple defines the processor layout\n"
]
},
{
@@ -289,7 +306,11 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"tags": [
"nbval-skip"
]
},
"outputs": [],
"source": [
"expt.setup_era5(\"PATH_TO_ERA5_DATA/era5/single-levels/reanalysis\")"
@@ -301,7 +322,9 @@
"source": [
"## Step 8: Modify the default input directory to make a (hopefully) runnable configuration out of the box\n",
"\n",
"This step copies the default directory, and modifies the `MOM_layout` files to match your experiment by inserting the right number of x,y points and cpu layout. If you use Payu to run mom6, set the `using_payu` flag to `True` and an example `config.yaml` file will be copied to your run directory. This still needs to be modified manually to work with your projects, executable etc."
"This step copies the default directory, and modifies the `MOM_layout` files to match your experiment by inserting the right number of x,y points and cpu layout. If you use Payu to run mom6, set the `using_payu` flag to `True` and an example `config.yaml` file will be copied to your run directory. This still needs to be modified manually to work with your projects, executable etc.\n",
"\n",
"You also need to pass the path to where you git cloned the regional-mom6 code. This just ensures that the function can find the premade run directories."
]
},
{
Expand All @@ -310,7 +333,7 @@
"metadata": {},
"outputs": [],
"source": [
"expt.setup_run_directory(surface_forcing = \"era5\",using_payu = False)"
"expt.setup_run_directory(\"PATH_TO_REGIONAL_MOM6_CODE\",surface_forcing = \"era5\",using_payu = False)"
]
},
{
@@ -331,9 +354,9 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python [conda env:analysis3-unstable]",
"display_name": "Python [conda env:analysis3-23.10] *",
"language": "python",
"name": "conda-env-analysis3-unstable-py"
"name": "conda-env-analysis3-23.10-py"
},
"language_info": {
"codemirror_mode": {
1 change: 0 additions & 1 deletion pyproject.toml
@@ -15,7 +15,6 @@ dependencies = [
"scipy>=1.2.0",
"xarray",
"xesmf>=0.8",
"PyYAML>=6.0.1",
"f90nml>=1.4.1",
]
