
Adding regression test notebooks on hold #67

Open
aliciacanipe opened this issue Jan 27, 2021 · 6 comments
aliciacanipe commented Jan 27, 2021

Regression test notebooks are staged here: /grp/jwst/wit/nircam/canipe/validation_notebook_staging/

There are a couple of outstanding questions to investigate, so I'm not submitting a PR yet.

1. New environment variable needed: export TEST_BIGDATA=https://bytesalad.stsci.edu/artifactory (DONE)
2. These access the pipeline Artifactory instance. I'm not sure if this will work with our infrastructure.
3. They take a long time to run.
4. These depend on the same pytest update as the unit tests: https://jira.stsci.edu/browse/JP-1881
5. These also require the same extra packages as the unit tests (see PR #63):
  • ipython
  • pytest-xdist
  • pytest-html
  • pip install -e .[test,docs]
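For anyone setting up locally, the environment variable and package steps from the list above can be sketched as a shell snippet. The export value and package names are taken directly from the list; the install commands are left commented out so the snippet is safe to source without a local clone or network access:

```shell
# Item 1: point the regression-test machinery at the pipeline Artifactory instance
export TEST_BIGDATA=https://bytesalad.stsci.edu/artifactory

# Item 5: extra packages the unit/regression test notebooks need.
# Run these from a local clone of the repository:
# pip install ipython pytest-xdist pytest-html
# pip install -e ".[test,docs]"

echo "TEST_BIGDATA is set to: $TEST_BIGDATA"
```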

@aliciacanipe aliciacanipe self-assigned this Jan 27, 2021
@aliciacanipe aliciacanipe linked a pull request Jan 27, 2021 that will close this issue
york-stsci commented

Currently the environment.yml file sets the following environment variables:

Are you not seeing these variables when in the JWST_validation_notebook environment?
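For reference, conda's environment.yml supports a variables: section that exports environment variables on activation. A minimal sketch using the TEST_BIGDATA value discussed above; the environment name and dependencies here are placeholders, and the repository's actual file may differ:

```yaml
# Hedged sketch, not the repository's actual environment.yml.
name: jwst_validation_notebooks   # placeholder name
channels:
  - conda-forge
dependencies:
  - python                        # placeholder dependency list
variables:
  TEST_BIGDATA: https://bytesalad.stsci.edu/artifactory
```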

aliciacanipe commented

Yes, sorry, thanks Brian! I dumped my notes for things to check without checking yet. So we can cross the TEST_BIGDATA item off.

york-stsci commented

With respect to the other issues:

  • I can test accessing the pipeline instance. If it doesn't work, would it be practical to put copies of the required data into
    our own instance?
  • We already have notebooks that take up to about half an hour to run. What does 'a long time' mean here?
  • The repository doesn't currently have a conftest.py file that I can find. If you like, I can test locally what happens when I
    create such a file, and give it the content in that JIRA link. If it doesn't break anything that's currently there, then I don't
    see a problem with adding it.
  • Adding those packages to the environment.yml file should be relatively easy, and I don't see the existence of new
    packages as having a significant chance of breaking things that already work (although I would, of course, test that).

So, if it meets with your approval, I'll start out by making changes to my repository that address 4 and 5 above and, if they pass testing, I'll make a PR. Then, from that base, I can try making myself a copy of your regression notebooks, and seeing what happens when I run convert.py on them locally. If they work that way, you can add the notebooks to a new PR, and we'll mark it as closing this issue once accepted and merged.

Sound good?

aliciacanipe commented

Yeah, I was trying to think about this. I'm worried because accessing and running the unit and regression test scripts in the JWST pipeline repo might be a little different from accessing the normal pipeline software for our validation testing notebooks (the description is here: https://github.com/spacetelescope/jwst#unit-tests and the following sections). The pytests run based off the conftest.py file in the JWST pipeline repo: https://github.com/spacetelescope/jwst/blob/master/jwst/conftest.py, so they seem to assume there's a local copy of the jwst software to use. If you want to try some things, I'm totally fine with it; I just had to think a little more about how this might work. I submitted a PR to update the pytest code in the pipeline software, so hopefully that will be worked out soon.

york-stsci commented

So, just out of curiosity, do any of the above notebooks directly use pytest? Currently nothing in the notebook validation repo uses it at all, either in our CI code (which just runs nbpages.check_nbs) or our Jenkins code (which runs the repo's convert.py file).

aliciacanipe commented

The normal notebooks that we have in the jwst_validation_notebook repo don't use pytest. But the "unit test" notebooks and "regression test" notebooks that I wanted to add pull in tests from the jwst pipeline repo, which rely on pytest. I'm not sure if the unit test or regression test notebooks would work in our infrastructure, but I wanted to try to explore whether we could make them work (as long as it's not a huge time sink for all of us!)
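For context on the mechanics being discussed: a notebook cell can invoke pytest programmatically via pytest.main, which is how a "unit test" style notebook could pull in tests rather than shelling out. This is a hedged sketch, not code from the staged notebooks; a real notebook would point at a test directory inside a local clone of the jwst repository, while here an empty temporary directory stands in purely to show the call shape:

```python
import tempfile
import pytest

# In a real notebook, `target` would be something like a local
# jwst repository's test directory (hypothetical example).
with tempfile.TemporaryDirectory() as target:
    # pytest.main returns an exit code: 0 = tests passed,
    # 5 = no tests collected (expected here, the directory is empty).
    exit_code = pytest.main(["-q", "--collect-only", target])

print(int(exit_code))
```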
