Adding regression test notebooks (on hold) #67
Currently the environment.yml file sets the following environment variables:
Are you not seeing these variables when in the JWST_validation_notebook environment?
Yes, sorry, thanks Brian! I dumped my notes on things to check before actually checking them. So we can cross the TEST_BIGDATA item off.
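(For anyone else checking this: recent versions of conda record per-environment variables, so they can be listed from the activated environment. A quick sketch; the environment name is an assumption, and the variable shown is just the one discussed in this thread.)

conda activate jwst_validation_notebooks   # environment name assumed
conda env config vars list                 # show variables attached to the environment (e.g. via environment.yml)
echo "$TEST_BIGDATA"                       # or check one variable directly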
With respect to the other issues:
So, if it meets with your approval, I'll start by making changes to my repository that address 4 and 5 above and, if they pass testing, I'll open a PR. Then, from that base, I can make myself a copy of your regression notebooks and see what happens when I run convert.py on them locally. If they work that way, you can add the notebooks to a new PR, and we'll mark it as closing this issue once it's accepted and merged. Sound good?
Yeah, I was trying to think about this. I'm worried because accessing and running the unit and regression test scripts in the JWST pipeline repo might be a little different from accessing the normal pipeline software for our validation testing notebooks (the description is here: https://github.com/spacetelescope/jwst#unit-tests and the following sections). The pytests run based off the conftest.py file in the JWST pipeline repo: https://github.com/spacetelescope/jwst/blob/master/jwst/conftest.py, so they seem to assume there's a local copy of the jwst software to use. If you want to try some things, I'm totally fine with it; I just had to think a little more about how this might work. I submitted a PR to update the pytest code in the pipeline software, so hopefully that will be worked out soon.
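For reference, the linked README sections describe running the tests from a local checkout; roughly, that looks like the sketch below. The jwst/regtest path and the --bigdata option (provided by the ci_watson pytest plugin) are my assumptions here and should be checked against the README rather than taken as-is.

git clone https://github.com/spacetelescope/jwst.git   # local copy that conftest.py expects
cd jwst
pytest                          # unit tests (test dependencies must already be installed)
pytest jwst/regtest --bigdata   # regression tests; need TEST_BIGDATA pointing at Artifactory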
So, just out of curiosity, do any of the above notebooks directly use pytest? Currently nothing in the notebook validation repo uses it at all, either in our CI code (which just runs nbpages.check_nbs) or our Jenkins code (which runs the repo's convert.py file).
The normal notebooks that we have in the jwst_validation_notebook repo don't use pytest. But the "unit test" notebooks and "regression test" notebooks that I wanted to add pull in tests from the jwst pipeline repo, which rely on pytest. I'm not sure whether the unit test or regression test notebooks would work in our infrastructure, but I wanted to explore whether we could make them work (as long as it's not a huge time sink for all of us!).
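For what it's worth, a regression-test notebook that pulls tests in from the pipeline repo would presumably end up invoking pytest itself, e.g. through a shell escape in a cell or pytest.main() from Python. A purely illustrative sketch of such a cell; the test module name and the --bigdata option are hypothetical, not taken from the staged notebooks, and the HTML report flags come from pytest-html (one of the extra packages mentioned below):

# hypothetical notebook cell, run with a leading "!" in Jupyter:
pytest jwst/regtest/test_nircam_image.py -v --bigdata --html=report.html --self-contained-html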
Regression test notebooks are staged here: /grp/jwst/wit/nircam/canipe/validation_notebook_staging/
There are a couple of outstanding questions to investigate, so I'm not submitting a PR yet.
1. New environment variable needed (DONE, per the discussion above):
export TEST_BIGDATA=https://bytesalad.stsci.edu/artifactory
2. These access the pipeline Artifactory instance. I'm not sure whether this will work with our infrastructure.
3. They take a long time to run.
4. These depend on the same pytest update as the unit tests: https://jira.stsci.edu/browse/JP-1881
5. These also require the same extra packages as the unit tests (see PR #63); a combined setup sketch follows this list.
ipython
pytest-xdist
pytest-html
pip install -e .[test,docs]
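Putting items 1 and 5 together, the extra local setup would look roughly like this; the conda environment name and the location of a local jwst checkout are assumptions:

conda activate jwst_validation_notebooks      # environment name assumed
export TEST_BIGDATA=https://bytesalad.stsci.edu/artifactory
pip install ipython pytest-xdist pytest-html  # extra packages from item 5
cd /path/to/jwst                              # local clone of the pipeline repo (path assumed)
pip install -e .[test,docs]                   # editable install with the test/docs extras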