From 6e5ae60d81d53fe325c5be16f4bfdb5edfac22fc Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Mon, 21 Oct 2024 20:21:48 +0530 Subject: [PATCH] Update docs with regression data (#2842) * Update comment with git regression data * update hyperlink * Update file name * Update running tests page * Removing outdated updating test data procedures * Remove azure pipelines from tardis docs * remove integration tests section * Remove integration tests mention from docs * Update docs as per review * Update docs as per review * Fix hyperlink on homepage * update team page hyperlink * update testing pipeline description * Remove manual procedure heading * Change tarids into upper case letters * Add commonly followed steps in current tardis pipelines * Fix headings * Change sunday from all caps to only initial letter as capital * Update testing pipeline description * update group email --- CODE_OF_CONDUCT.md | 2 +- GOVERNANCE.md | 2 +- docs/contributing/development/azure_links.inc | 15 - .../development/continuous_integration.rst | 315 ++---------------- docs/contributing/development/index.rst | 2 +- docs/contributing/development/links.inc | 1 - .../development/running_tests.rst | 8 +- .../development/update_refdata.rst | 56 ---- .../development/update_regression_data.rst | 27 ++ docs/index.rst | 2 +- docs/multiindex_isotope_decay_data.ipynb | 2 +- docs/quickstart.ipynb | 2 +- 12 files changed, 61 insertions(+), 373 deletions(-) delete mode 100644 docs/contributing/development/azure_links.inc delete mode 100644 docs/contributing/development/update_refdata.rst create mode 100644 docs/contributing/development/update_regression_data.rst diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md index e904a91ac74..8562360134f 100644 --- a/CODE_OF_CONDUCT.md +++ b/CODE_OF_CONDUCT.md @@ -15,4 +15,4 @@ As members of the community, This code of conduct has been adapted from the Astropy Code of Conduct, which in turn uses parts of the PSF code of conduct. 
-**To report any violations of the code of conduct, please contact a member of the [TARDIS core team](https://tardis-sn.github.io/team/community_roles/index.html) (the email tardis.supernova.code@gmail.com is monitored by the core team) or the Ombudsperson (see the [team page](https://tardis-sn.github.io/team/community_roles/index.html); who is outside of the TARDIS collaboration and will treat reports confidentially).** +**To report any violations of the code of conduct, please contact a member of the [TARDIS core team](https://tardis-sn.github.io/people/core/) (the email tardiscollaboration@gmail.com is monitored by the core team) or the Ombudsperson (see the [team page](https://tardis-sn.github.io/people/core/); who is outside of the TARDIS collaboration and will treat reports confidentially).** diff --git a/GOVERNANCE.md b/GOVERNANCE.md index d0563d3d05e..7b9f508473c 100644 --- a/GOVERNANCE.md +++ b/GOVERNANCE.md @@ -1,3 +1,3 @@ # TARDIS Collaboration Governance -Please visit our website to learn more about the [TARDIS Governance](https://tardis-sn.github.io/team/governance/). +Please visit our website to learn more about the [TARDIS Governance](https://tardis-sn.github.io/people/governance/). diff --git a/docs/contributing/development/azure_links.inc b/docs/contributing/development/azure_links.inc deleted file mode 100644 index af352a44a42..00000000000 --- a/docs/contributing/development/azure_links.inc +++ /dev/null @@ -1,15 +0,0 @@ - -.. azure stuff -.. _azure DevOps: http://azure.microsoft.com/en-us/services/devops/?nav=mi -.. _azure documentation section on triggers: https://docs.microsoft.com/en-us/azure/devops/pipelines/repos/github?view=azure-devops&tabs=yaml#ci-triggers -.. _azure documentation section on variables: https://docs.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch -.. 
_azure documentation section on jobs: https://docs.microsoft.com/en-us/azure/devops/pipelines/process/phases?view=azure-devops&tabs=yaml -.. _azure documentation section on templates: https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops -.. _checking out multiple repositories: https://github.com/microsoft/azure-pipelines-yaml/blob/master/design/multi-checkout.md#behavior-changes-in-multi-checkout-mode -.. general stuff -.. _continuous integration: https://en.wikipedia.org/wiki/Continuous_integration -.. _update it manually: https://docs.microsoft.com/en-us/azure/devops/repos/git/import-git-repository?view=azure-devops#what-if-my-source-repository-contains-git-lfs-objects -.. _error 413: https://developercommunity.visualstudio.com/content/problem/867488/git-lfs-push-got-413-error.html - -.. vim: ft=rstS - diff --git a/docs/contributing/development/continuous_integration.rst b/docs/contributing/development/continuous_integration.rst index b8b181b8c7d..45db365ccf1 100644 --- a/docs/contributing/development/continuous_integration.rst +++ b/docs/contributing/development/continuous_integration.rst @@ -10,269 +10,45 @@ or a change is merged into the *master* branch, a service will clone the repository, checkout to the current commit and execute all the TARDIS tests. This helps us to detect bugs immediately. - -Azure Repos ------------ - -Azure Repos is just another service to store Git repositories. -Currently, we use Azure Repos to mirror ``tardis-refdata`` -repository since Azure does not impose limits on LFS bandwidth -nor storage. - -**To clone this repository:** - -.. code-block:: bash - - git clone https://tardis-sn@dev.azure.com/tardis-sn/TARDIS/_git/tardis-refdata - -**To download a LFS file through HTTPS:** - -.. 
code-block:: none - - https://dev.azure.com/tardis-sn/TARDIS/_apis/git/repositories/tardis-refdata/items?path=atom_data/kurucz_cd23_chianti_H_He.h5&resolveLfs=true - -This mirror is automatically synced by `a GitHub workflow`. If you want -to `update it manually`_, remember to set ``git config http.version HTTP/1.1`` -to avoid `error 413`_ while pushing large files. - - -Azure Pipelines & GitHub Actions --------------------------------- - -Currently, we use the `Azure DevOps`_ service to run most of our -pipelines and GitHub Actions for some others (called "workflows"). The -following sections explains briefly the different components of a -pipeline/workflow, mostly focused on the Azure service. +GitHub Actions +-------------- A pipeline (or a workflow) is essentially a :term:`YAML` configuration file with different sections such as variables, jobs and steps. These files run commands or tasks when they are triggered by some event, like a commit being pushed to a certain branch. -Pipelines on Azure must be created through the web UI for the first time. -Then, making changes to an existing pipeline is as easy as making a pull -request. To create a new workflow on GitHub, just create a new YAML file -in ``.github/workflows``. - - -Triggers --------- - -First thing to do is telling the pipeline when it should run. In -Azure, *trigger* (also known as the CI trigger) sets up the pipeline -to run every time changes are pushed to a branch. - -.. code-block:: yaml - - trigger: - - master - -If some trigger is not specified then the default configuration -is assumed. - -.. code-block:: yaml - - trigger: - branches: - include: - - '*' - - pr: - branches: - include: - - '*' - -This means the pipeline will start running every time changes are -merged to any branch of the repository, or someone pushes new -commits to a pull request. - -If you want to run a pipeline only manually set both triggers to -*none*. - -.. 
code-block:: yaml - - trigger: none - - pr: none - -Notice that you can test changes in a pipeline by activating the PR -trigger on a new pull request, even if that trigger is disabled on -the YAML file present in the *master* branch. - -On GitHub Actions these triggers are named ``push`` and ``pull_request``, -and works mostly in the same way. - -.. warning:: Triggers also can be set on the Azure's web interface - too, but this action is discouraged, since it overrides - any trigger specified in the YAML file and could lead to - confusing situations. - -There are more useful triggers such as the *cron* trigger, see the -`Azure documentation section on triggers`_ for more information. - - -Variables ---------- - -Variable syntax -=============== - -Azure Pipelines supports three different ways to reference variables: -*macro*, *template expression*, and *runtime expression*. Each syntax -can be used for a different purpose and has some limitations. - -.. image:: images/variables.png - :align: center - -**What syntax should I use?** Use *macro syntax* if you are providing -input for a task. Choose a *runtime expression* if you are working with -conditions and expressions. If you are defining a variable in a template, -use a *template expression*. - - -Define variables -================ - -Usually, we define variables at the top of the YAML file. - -.. code-block:: yaml - - variables: - my.var: 'foo' - - steps: - - bash: | - echo $(my.var) - -When a variable is defined at the top of a YAML, it will be available -to all jobs and stages in the pipeline as a *global variable*. -Variables at the *stage* level override variables at the *root* level, -while variables at the *job* level override variables at the *root* -and *stage* level. - -Also, variables are available to scripts through environment variables. -The name is upper-cased and ``.`` is replaced with ``_``. For example - -.. 
code-block:: yaml - - variables: - my.var: 'foo' - - steps: - - bash: | - echo $MY_VAR - -To set a variable from a script task, use the ``task.setvariable`` logging -command. - -.. code-block:: yaml - - steps: - - - bash: | - echo "##vso[task.setvariable variable=my.var]foo" - - - bash: | - echo $(my.var) - -See the `Azure documentation section on variables`_ for more information. - - -Predefined variables --------------------- - -The most important (and confusing) predefined variables are the ones related -to paths in Azure: - -* All folders for a given pipeline are created under ``Agent.BuildDirectory`` - variable, alias ``Pipeline.Workspace``. This includes subdirectories like - ``/s`` for sources or ``/a`` for artifacts. - -* Path to source code varies depending on how many repositories we fetch. - For example, source code is located under the ``Build.Repository.LocalPath`` - variable (alias ``Build.SourcesDirectory``) when fetching a single repository, - but after fetching a second repository code is moved automatically to - ``Build.Repository.LocalPath/repository-name``. - -See the Azure documentation to learn more about `checking out multiple repositories`_. - - -Jobs ----- - -You can organize your pipeline into jobs. Every pipeline has at least one job. -A job is a series of steps that run sequentially as a unit. In other words, -a job is the smallest unit of work that can be scheduled to run. - - -.. code-block:: yaml - - jobs: - - job: myJob - - pool: - vmImage: 'ubuntu-latest' - - steps: - - bash: echo "Hello world" - -Jobs can run in parallel (for example: run the same job on multiple OSes) or -depend on a previous job. - -See the `Azure documentation section on jobs`_ for more information. - +Currently, we use GitHub Actions to run all of our pipelines. Making changes to an existing +pipeline is as easy as making a pull request. To create a new GitHub Action workflow, +just create a new YAML file in ``.github/workflows``. 
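The paragraph above can be made concrete with a minimal workflow sketch. The file name, workflow name, and step contents below are illustrative placeholders, not an actual TARDIS workflow:

.. code-block:: yaml

    # .github/workflows/example.yml (hypothetical file name)
    name: example-workflow

    # Run on pushes to master and on every pull request
    on:
      push:
        branches:
          - master
      pull_request:

    jobs:
      test:
        runs-on: ubuntu-latest
        steps:
          # Check out the repository at the triggering commit
          - uses: actions/checkout@v4
          - name: Say hello
            run: echo "Hello from GitHub Actions"

Committing a file like this under ``.github/workflows`` is all that is needed for GitHub to pick it up; no web-UI registration step is required.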
TARDIS Pipelines ----------------- - -Brief description of pipelines already implemented on Azure or GitHub Actions. - - -The default template -==================== - -Templates let you define reusable content, logic, and parameters. It functions -like an include directive in many programming languages (content from one file -is inserted into another file). - -The common set of steps used across most TARDIS pipelines resides in the -"default" template: +================ -- Force ``set -e`` on all Bash steps. -- Set TARDIS custom variables. -- Fetch TARDIS main repository. -- Fetch TARDIS reference data repository from mirror (optional). -- Configure Anaconda for Linux and macOS agents. -- Install Mamba package manager (optional). -- Install TARDIS environment (optional). -- Build and install TARDIS (optional). +Brief description of pipelines already implemented on TARDIS -It was written to make pipelines easier to create and maintain. For example, -to start a new pipeline use:: +Streamlined Steps for TARDIS Pipelines +======================================== - steps: - - template: templates/default.yml - parameters: - useMamba: true +We have a common set of steps which are utilized in TARDIS pipelines to streamline the process: -**List of template parameters:** +Common Steps +------------ -- ``fetchDepth`` (*int*): the depth of commits to fetch from ``tardis`` repository, - default is ``0`` (no limit). -- ``fetchRefdata`` (*bool*): fetch the ``tardis-refdata`` repository from Azure Repos, - default is ``false``. -- ``refdataRepo`` (*option*): source of the ``tardis-refdata`` repository, - options are ``azure`` (default) or ``github``. -- ``useMamba`` (*bool*): use the ``mamba`` package manager instead of ``conda``, - default is ``false``. -- ``tardisEnv`` (*bool*): setup the TARDIS environment, default is ``true``. +1. 
**Use `setup_lfs` Action** + - If you need access to regression or atomic data, incorporate the `setup_lfs` action to ensure proper handling of large file storage. -**List of predefined custom variables:** +2. **Use `setup_env` Action** + - To configure your environment effectively, utilize the `setup_env` action. This will help establish the necessary variables and settings for your pipeline. -- ``tardis.dir`` is equivalent to ``$(Build.SourcesDirectory)/tardis``. -- ``refdata.dir`` is equivalent to ``$(Build.SourcesDirectory)/tardis-refdata``. +3. **Run Configuration** + - Ensure that your pipeline runs with the appropriate shell settings. You can define this in your YAML configuration as follows: -See the `Azure documentation section on templates`_ for more information. + .. code-block:: yaml + + defaults: + run: + shell: bash -l {0} Documentation build pipeline @@ -291,9 +67,7 @@ See the :ref:`Documentation Preview ` section for more information. Testing pipeline ================ -The `testing pipeline`_ (CI) consists basically in the same job running twice -in parallel (one for each OS) with the steps from the default template, plus -extra steps to run the tests and upload the coverage results. +The `testing pipeline`_ (CI) comprises multiple concurrent jobs. Each of these jobs runs tests across two distinct categories—continuum and rpacket tracking—and supports two different operating systems. Additionally, there are extra steps involved in executing the tests and uploading the coverage results Authors pipeline @@ -315,44 +89,7 @@ In the near future we want to auto-update the citation guidelines in the Release pipeline ================ -Publishes a new release of TARDIS every sunday at 00:00 UTC. - - -Compare reference data pipeline -=============================== - -This pipeline compares two versions of the reference data. 
It's triggered manually via -the Azure Pipelines web UI, or when a TARDIS contributor leaves the following comment -on a pull request: -:: - - /AzurePipelines run compare-refdata - -For brevity, you can comment using ``/azp`` instead of ``/AzurePipelines``. - -By default, generates new reference data for the ``HEAD`` of the pull request. Then, -compares against latest reference data stored in ``tardis-refdata`` repository. If -you want to compare two different labels (SHAs, branches, tags, etc.) uncomment and -set the ``ref1.hash`` and ``ref2.hash`` variables in -``.github/workflows/compare-refdata.yml`` on your pull request. For example: -.. code-block:: yaml - - ref1.hash: 'upstream/pr/11' - ref2.hash: 'upstream/master' - -The web UI also allows to compare any version of the reference data by providing those -variables at runtime, but the access to the dashboard is restricted to a small group -of developers. - -.. warning:: If using the Azure dashboard, do not define ``ref1.hash`` and ``ref2.hash`` - between quotation marks or **the pipeline will fail**. This does not apply for - the YAML file. - -Finally, the report is uploaded to the -`OpenSupernova.org server `_ -following the ``/`` folder structure. If the pipeline fails, also a report is -generated, but not necessarily gives useful debug information (depends on which step the -pipeline has failed). +Publishes a new release of TARDIS every Sunday at 00:00 UTC. TARDIS Carsus Compatibility Check @@ -371,4 +108,4 @@ The Save Atomic Files Workflow The Save Atomic Files workflow runs every week but can also be triggered manually. It runs the "Bridge" and sends an artifact containing the generated atomic data file and the comparison notebook to Moria. This workflow has a separate job to indicate if the -bridge has failed. +bridge has failed. 
\ No newline at end of file diff --git a/docs/contributing/development/index.rst b/docs/contributing/development/index.rst index 53fde2cc824..4b72e603677 100644 --- a/docs/contributing/development/index.rst +++ b/docs/contributing/development/index.rst @@ -30,6 +30,6 @@ the core team (active maintainers) of TARDIS. :maxdepth: 2 continuous_integration - update_refdata + update_regression_data matterbridge debug_numba diff --git a/docs/contributing/development/links.inc b/docs/contributing/development/links.inc index 517b1d39948..f8b6bfd0061 100644 --- a/docs/contributing/development/links.inc +++ b/docs/contributing/development/links.inc @@ -2,5 +2,4 @@ .. include:: known_projects.inc .. include:: this_project.inc .. include:: git_links.inc -.. include:: azure_links.inc .. include:: matterbridge.inc \ No newline at end of file diff --git a/docs/contributing/development/running_tests.rst b/docs/contributing/development/running_tests.rst index 8143ac6c36f..f9438d1dbca 100644 --- a/docs/contributing/development/running_tests.rst +++ b/docs/contributing/development/running_tests.rst @@ -4,13 +4,9 @@ Running tests ************* -There are two basic categories of tests in TARDIS: 1) the unit -tests, and 2) the integration tests. Unit tests check the outputs of individual functions, -while the integration tests check entire runs for different setups of TARDIS. +In TARDIS, we focus primarily on unit tests. These tests check the outputs of individual functions, ensuring that each component behaves as expected. -The unit tests run very quickly and thus are executed after every suggested change -to TARDIS. The integration tests are much more costly and thus are only executed -every few days on a dedicated server. +Unit tests run quickly and are executed after every suggested change to TARDIS, allowing for immediate feedback and maintaining code quality. All of them are based on the excellent ``astropy-setup-helpers`` package and `pytest `_. 
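The unit-test workflow described in ``running_tests.rst`` above can be sketched with a couple of commands. These are illustrative; the module path is a placeholder and exact flags depend on your local setup:

.. code-block:: bash

    # Run the full TARDIS unit-test suite from the repository root
    pytest tardis

    # Run a single test module (placeholder path), stopping at the first failure
    pytest tardis/path/to/test_module.py -x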
diff --git a/docs/contributing/development/update_refdata.rst b/docs/contributing/development/update_refdata.rst deleted file mode 100644 index a3d53c3e0fe..00000000000 --- a/docs/contributing/development/update_refdata.rst +++ /dev/null @@ -1,56 +0,0 @@ -.. _update refdata: - -************************* -Update the Reference Data -************************* - -A special kind of tests are executed only when ``pytest`` is called alongside the ``--refdata`` flag. -These tests compares the output of the TARDIS code (mostly arrays) against the information stored -in the reference data files. - -TARDIS stores reference data in the `tardis-refdata `_ -repository. This repository also has a mirror hosted in Azure Pipelines (synchronized automatically by a -GitHub workflow) since this Microsoft service does not have limitations in bandwidth nor storage. - -Sometimes, this data needs to be updated. The procedure to update these files manually is not trivial -and has been automated recently thanks to the `NumFOCUS `_ support. - - -================= -Default Procedure -================= - -Imagine you are working on a new feature (or fix) for TARDIS, you have opened a pull request and the -reference data tests are failing in the testing pipeline. This could happen for many reasons: - -A. There's a problem in your code. -B. Your code is OK, but the reference data is outdated. -C. The pipeline is broken. - -If you think your could be dealing with scenario B, then: - -#. Write ``/azp run compare-refdata`` in a comment on your PR. -#. Analyze the results and discuss if the reference data effectively requires an update. -#. Update the reference data by writing ``/azp run update-refdata`` on a new comment. - -.. note:: - - - If you don't have enough privileges to run the pipelines, tag a TARDIS developer capable of doing so. - - If any of these two pipelines fail, please tag a `TARDIS team member `_ responsible for CI/CD. 
- -If everything went well, the reference data will have been updated by the TARDIS bot and the commit -message should include the pull request number that triggered the update. - -================ -Manual Procedure -================ - -The manual procedure is documented for debugging purposes and should not be used in general. - -#. Activate the ``tardis`` environment. -#. Fork and clone the ``tardis-refdata`` repository. -#. Follow the instructions at the top of the notebook ``tardis-refdata/notebooks/ref_data_compare.ipynb``. -#. Go to your local ``tardis`` repository and make sure you are working on the branch you want to generate new reference data from. -#. Generate new reference data with ``pytest tardis --refdata=/path/to/tardis-refdata --generate-reference``. -#. Run the ``ref_data_compare.ipynb`` notebook and check the results. -#. Make a new branch in ``tardis-refdata``, push your new reference data and open a pull request. diff --git a/docs/contributing/development/update_regression_data.rst b/docs/contributing/development/update_regression_data.rst new file mode 100644 index 00000000000..078726e71c0 --- /dev/null +++ b/docs/contributing/development/update_regression_data.rst @@ -0,0 +1,27 @@ +.. _update regression-data: + +************************* +Update the Regression Data +************************* + +A special kind of tests are executed only when ``pytest`` is called alongside the ``--regression-data`` flag. These tests compare the output of the TARDIS code (mostly arrays) against the information stored in the regression data files. + +TARDIS stores regression data in the `tardis-regression-data `_ repository. Sometimes, this data needs to be updated. The procedure to update these files has been simplified, allowing for a more straightforward process. + +Imagine you are working on a new feature (or fix) for TARDIS, and you have opened a pull request. If the regression data tests are failing, this could happen for various reasons: + +A. 
There's a problem in your code. +B. Your code is OK, but the regression data is outdated. +C. The pipeline is broken. + +If you suspect scenario B, please follow these instructions: + +#. Activate the ``tardis`` environment. +#. Fork and clone the ``tardis-regression-data`` repository. +#. Follow any necessary instructions within your local copy. +#. Go to your local ``tardis`` repository and ensure you are working on the branch from which you want to generate new regression data. +#. Generate new regression data with ``pytest tardis --regression-data=/path/to/tardis-regression-data --generate-reference``. +#. Check your results and ensure everything is correct. +#. Make a new branch in ``tardis-regression-data``, push your new regression data, and open a pull request. + +If any issues arise during this process, please tag a `TARDIS team member `_ responsible for CI/CD. \ No newline at end of file diff --git a/docs/index.rst b/docs/index.rst index 76a77a0246a..9fd09123640 100644 --- a/docs/index.rst +++ b/docs/index.rst @@ -19,7 +19,7 @@ TARDIS Core Package Documentation TARDIS is an open-source Monte Carlo radiative-transfer spectral synthesis code for 1D models of supernova ejecta. It is designed for rapid spectral modelling of supernovae. It is developed and maintained by a -`multi-disciplinary team `_ +`multi-disciplinary team `_ including software engineers, computer scientists, statisticians, and astrophysicists. 
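The numbered regression-data update steps in ``update_regression_data.rst`` above can be sketched as shell commands. The fork URL, local paths, and branch name are illustrative placeholders:

.. code-block:: bash

    # Clone your fork of the regression data (URL is a placeholder)
    git clone https://github.com/<your-username>/tardis-regression-data.git

    # From your local tardis checkout, on the branch under test,
    # regenerate the regression data
    cd /path/to/tardis
    pytest tardis --regression-data=/path/to/tardis-regression-data --generate-reference

    # Push the refreshed data on a new branch and open a pull request
    cd /path/to/tardis-regression-data
    git checkout -b update-regression-data
    git add .
    git commit -m "Update regression data"
    git push origin update-regression-data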
diff --git a/docs/multiindex_isotope_decay_data.ipynb b/docs/multiindex_isotope_decay_data.ipynb index 9cf75e41ff0..f633bf4f320 100644 --- a/docs/multiindex_isotope_decay_data.ipynb +++ b/docs/multiindex_isotope_decay_data.ipynb @@ -67,7 +67,7 @@ } ], "source": [ - "# Download the atom data file from tardis-refdata repo to run this cell.\n", + "# Download the atom data file from tardis-regression-data repo to run this cell.\n", "download_atom_data('kurucz_cd23_chianti_H_He')\n", "atom_data_file = 'kurucz_cd23_chianti_H_He.h5'\n", "atom_data = AtomData.from_hdf(atom_data_file)" diff --git a/docs/quickstart.ipynb b/docs/quickstart.ipynb index 87c9ad0d23f..7a9625d3a23 100644 --- a/docs/quickstart.ipynb +++ b/docs/quickstart.ipynb @@ -43,7 +43,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "You can also obtain a copy of the atomic data from the [tardis-refdata](https://github.com/tardis-sn/tardis-refdata/tree/master/atom_data) repository." + "You can also obtain a copy of the atomic data from the [tardis-regression-data](https://github.com/tardis-sn/tardis-regression-data/tree/main/atom_data) repository." ] }, {