Update docs with regression data (#2842)
* Update comment with git regression data

* update hyperlink

* Update file name

* Update running tests page

* Removing outdated updating test data procedures

* Remove azure pipelines from tardis docs

* remove integration tests section

* Remove integration tests mention from docs

* Update docs as per review

* Update docs as per review

* Fix hyperlink on homepage

* update team page hyperlink

* update testing pipeline description

* Remove manual procedure heading

* Change tardis into upper case letters

* Add commonly followed steps in current tardis pipelines

* Fix headings

* Change sunday from all caps to only initial letter as capital

* Update testing pipeline description

* update group email
KasukabeDefenceForce authored Oct 21, 2024
1 parent df65feb commit 6e5ae60
Showing 12 changed files with 61 additions and 373 deletions.
2 changes: 1 addition & 1 deletion CODE_OF_CONDUCT.md
@@ -15,4 +15,4 @@ As members of the community,

This code of conduct has been adapted from the Astropy Code of Conduct, which in turn uses parts of the PSF code of conduct.

**To report any violations of the code of conduct, please contact a member of the [TARDIS core team](https://tardis-sn.github.io/team/community_roles/index.html) (the email tardis.supernova.code@gmail.com is monitored by the core team) or the Ombudsperson (see the [team page](https://tardis-sn.github.io/team/community_roles/index.html); who is outside of the TARDIS collaboration and will treat reports confidentially).**
**To report any violations of the code of conduct, please contact a member of the [TARDIS core team](https://tardis-sn.github.io/people/core/) (the email tardiscollaboration@gmail.com is monitored by the core team) or the Ombudsperson (see the [team page](https://tardis-sn.github.io/people/core/); who is outside of the TARDIS collaboration and will treat reports confidentially).**
2 changes: 1 addition & 1 deletion GOVERNANCE.md
@@ -1,3 +1,3 @@
# TARDIS Collaboration Governance

Please visit our website to learn more about the [TARDIS Governance](https://tardis-sn.github.io/team/governance/).
Please visit our website to learn more about the [TARDIS Governance](https://tardis-sn.github.io/people/governance/).
15 changes: 0 additions & 15 deletions docs/contributing/development/azure_links.inc

This file was deleted.

315 changes: 26 additions & 289 deletions docs/contributing/development/continuous_integration.rst
@@ -10,269 +10,45 @@ or a change is merged into the *master* branch, a service will
clone the repository, check out the current commit, and execute
all the TARDIS tests. This helps us detect bugs immediately.


Azure Repos
-----------

Azure Repos is just another service to store Git repositories.
Currently, we use Azure Repos to mirror ``tardis-refdata``
repository since Azure does not impose limits on LFS bandwidth
nor storage.

**To clone this repository:**

.. code-block:: bash

    git clone https://tardis-sn@dev.azure.com/tardis-sn/TARDIS/_git/tardis-refdata

**To download an LFS file through HTTPS:**

.. code-block:: none

    https://dev.azure.com/tardis-sn/TARDIS/_apis/git/repositories/tardis-refdata/items?path=atom_data/kurucz_cd23_chianti_H_He.h5&resolveLfs=true

This mirror is automatically synced by `a GitHub workflow`_. If you want
to `update it manually`_, remember to set ``git config http.version HTTP/1.1``
to avoid `error 413`_ while pushing large files.


Azure Pipelines & GitHub Actions
--------------------------------

Currently, we use the `Azure DevOps`_ service to run most of our
pipelines and GitHub Actions for some others (called "workflows"). The
following sections briefly explain the different components of a
pipeline/workflow, mostly focused on the Azure service.

GitHub Actions
--------------

A pipeline (or a workflow) is essentially a :term:`YAML` configuration file
with different sections such as variables, jobs and steps. These files
run commands or tasks when they are triggered by some event, like a
commit being pushed to a certain branch.

Pipelines on Azure must be created through the web UI for the first time.
Then, making changes to an existing pipeline is as easy as making a pull
request. To create a new workflow on GitHub, just create a new YAML file
in ``.github/workflows``.


Triggers
--------

The first thing to do is tell the pipeline when it should run. In
Azure, *trigger* (also known as the CI trigger) sets up the pipeline
to run every time changes are pushed to a branch.

.. code-block:: yaml

    trigger:
      - master

If no trigger is specified, the default configuration
is assumed.

.. code-block:: yaml

    trigger:
      branches:
        include:
          - '*'

    pr:
      branches:
        include:
          - '*'

This means the pipeline will start running every time changes are
merged to any branch of the repository, or someone pushes new
commits to a pull request.

If you want to run a pipeline only manually, set both triggers to
*none*.

.. code-block:: yaml

    trigger: none
    pr: none

Notice that you can test changes in a pipeline by activating the PR
trigger on a new pull request, even if that trigger is disabled on
the YAML file present in the *master* branch.

On GitHub Actions these triggers are named ``push`` and ``pull_request``,
and they work mostly in the same way.
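
For instance, a minimal sketch of the equivalent GitHub Actions triggers
(the branch names here are an assumption):

.. code-block:: yaml

    on:
      push:
        branches:
          - master
      pull_request:
        branches:
          - '*'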

.. warning:: Triggers can also be set in Azure's web interface,
             but this is discouraged, since it overrides any
             trigger specified in the YAML file and could lead to
             confusing situations.

There are more useful triggers such as the *cron* trigger, see the
`Azure documentation section on triggers`_ for more information.


Variables
---------

Variable syntax
===============

Azure Pipelines supports three different ways to reference variables:
*macro*, *template expression*, and *runtime expression*. Each syntax
can be used for a different purpose and has some limitations.

.. image:: images/variables.png
:align: center

**What syntax should I use?** Use *macro syntax* if you are providing
input for a task. Choose a *runtime expression* if you are working with
conditions and expressions. If you are defining a variable in a template,
use a *template expression*.
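
For reference, a short sketch of the three syntaxes applied to the same
variable:

.. code-block:: yaml

    variables:
      my.var: 'foo'
      # runtime expression: evaluated just before the job runs
      copy: $[ variables['my.var'] ]

    steps:
      # macro syntax: replaced when the task runs
      - script: echo $(my.var)
      # template expression: evaluated at compile time
      - script: echo ${{ variables['my.var'] }}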


Define variables
================

Usually, we define variables at the top of the YAML file.

.. code-block:: yaml

    variables:
      my.var: 'foo'

    steps:
      - bash: |
          echo $(my.var)

When a variable is defined at the top of a YAML file, it will be available
to all jobs and stages in the pipeline as a *global variable*.
Variables at the *stage* level override variables at the *root* level,
while variables at the *job* level override variables at the *root*
and *stage* levels.

Also, variables are available to scripts through environment variables.
The name is upper-cased and ``.`` is replaced with ``_``. For example:

.. code-block:: yaml

    variables:
      my.var: 'foo'

    steps:
      - bash: |
          echo $MY_VAR

To set a variable from a script task, use the ``task.setvariable`` logging
command.

.. code-block:: yaml

    steps:
      - bash: |
          echo "##vso[task.setvariable variable=my.var]foo"
      - bash: |
          echo $(my.var)

See the `Azure documentation section on variables`_ for more information.


Predefined variables
====================

The most important (and confusing) predefined variables are the ones related
to paths in Azure:

* All folders for a given pipeline are created under ``Agent.BuildDirectory``
variable, alias ``Pipeline.Workspace``. This includes subdirectories like
``/s`` for sources or ``/a`` for artifacts.

* Path to source code varies depending on how many repositories we fetch.
For example, source code is located under the ``Build.Repository.LocalPath``
variable (alias ``Build.SourcesDirectory``) when fetching a single repository,
but after fetching a second repository code is moved automatically to
``Build.Repository.LocalPath/repository-name``.

See the Azure documentation to learn more about `checking out multiple repositories`_.
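
A hedged sketch of fetching a second repository, assuming a GitHub service
connection named ``github-connection``:

.. code-block:: yaml

    resources:
      repositories:
        - repository: refdata          # alias used by the checkout step
          type: github
          name: tardis-sn/tardis-refdata
          endpoint: github-connection  # assumed service connection name

    steps:
      - checkout: self     # sources land in $(Build.SourcesDirectory)/tardis
      - checkout: refdata  # second repository lands in .../tardis-refdata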


Jobs
----

You can organize your pipeline into jobs. Every pipeline has at least one job.
A job is a series of steps that run sequentially as a unit. In other words,
a job is the smallest unit of work that can be scheduled to run.


.. code-block:: yaml

    jobs:
      - job: myJob
        pool:
          vmImage: 'ubuntu-latest'
        steps:
          - bash: echo "Hello world"

Jobs can run in parallel (for example: run the same job on multiple OSes) or
depend on a previous job.

See the `Azure documentation section on jobs`_ for more information.

Currently, we use GitHub Actions to run all of our pipelines. Making changes to an existing
pipeline is as easy as making a pull request. To create a new GitHub Action workflow,
just create a new YAML file in ``.github/workflows``.
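
As a sketch, a minimal workflow file could look like the following (the file
name and job content are illustrative only):

.. code-block:: yaml

    # .github/workflows/example.yml (hypothetical file name)
    name: example

    on:
      push:
        branches:
          - master

    jobs:
      build:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
          - run: echo "Hello from GitHub Actions"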

TARDIS Pipelines
----------------

Brief description of pipelines already implemented on Azure or GitHub Actions.


The default template
====================

Templates let you define reusable content, logic, and parameters. It functions
like an include directive in many programming languages (content from one file
is inserted into another file).

The common set of steps used across most TARDIS pipelines resides in the
"default" template:

- Force ``set -e`` on all Bash steps.
- Set TARDIS custom variables.
- Fetch TARDIS main repository.
- Fetch TARDIS reference data repository from mirror (optional).
- Configure Anaconda for Linux and macOS agents.
- Install Mamba package manager (optional).
- Install TARDIS environment (optional).
- Build and install TARDIS (optional).

It was written to make pipelines easier to create and maintain. For example,
to start a new pipeline use::

    steps:
      - template: templates/default.yml
        parameters:
          useMamba: true

**List of template parameters:**

- ``fetchDepth`` (*int*): the depth of commits to fetch from the ``tardis`` repository,
  default is ``0`` (no limit).
- ``fetchRefdata`` (*bool*): fetch the ``tardis-refdata`` repository from Azure Repos,
  default is ``false``.
- ``refdataRepo`` (*option*): source of the ``tardis-refdata`` repository,
  options are ``azure`` (default) or ``github``.
- ``useMamba`` (*bool*): use the ``mamba`` package manager instead of ``conda``,
  default is ``false``.
- ``tardisEnv`` (*bool*): set up the TARDIS environment, default is ``true``.

**List of predefined custom variables:**

- ``tardis.dir`` is equivalent to ``$(Build.SourcesDirectory)/tardis``.
- ``refdata.dir`` is equivalent to ``$(Build.SourcesDirectory)/tardis-refdata``.

See the `Azure documentation section on templates`_ for more information.

Brief description of pipelines already implemented in TARDIS.

Streamlined Steps for TARDIS Pipelines
========================================

We have a common set of steps which are utilized in TARDIS pipelines to
streamline the process:

Common Steps
------------

1. **Use ``setup_lfs`` Action**

   - If you need access to regression or atomic data, incorporate the
     ``setup_lfs`` action to ensure proper handling of large file storage
     (see the sketch after this list).

2. **Use ``setup_env`` Action**

   - To configure your environment effectively, utilize the ``setup_env``
     action. This will help establish the necessary variables and settings
     for your pipeline (see the sketch after this list).

3. **Run Configuration**

   - Ensure that your pipeline runs with the appropriate shell settings.
     You can define this in your YAML configuration as follows:

     .. code-block:: yaml

         defaults:
           run:
             shell: bash -l {0}
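
Putting these together, a hedged sketch of how a job might combine these
steps (the local action paths are assumptions about the repository layout):

.. code-block:: yaml

    jobs:
      tests:
        runs-on: ubuntu-latest
        defaults:
          run:
            shell: bash -l {0}
        steps:
          - uses: actions/checkout@v4
          # assumed to be local composite actions under .github/actions/
          - uses: ./.github/actions/setup_lfs   # only if regression/atomic data is needed
          - uses: ./.github/actions/setup_env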
Documentation build pipeline
@@ -291,9 +67,7 @@ See the :ref:`Documentation Preview <doc-preview>` section for more information.
Testing pipeline
================

The `testing pipeline`_ (CI) basically consists of the same job running twice
in parallel (one for each OS) with the steps from the default template, plus
extra steps to run the tests and upload the coverage results.
The `testing pipeline`_ (CI) comprises multiple concurrent jobs. Each job runs
the tests across two distinct categories (continuum and rpacket tracking) and
on two different operating systems, with additional steps to execute the tests
and upload the coverage results.
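
A hedged sketch of what such a job matrix could look like (the names and the
test invocation are illustrative, not the exact workflow):

.. code-block:: yaml

    jobs:
      tests:
        strategy:
          matrix:
            os: [ubuntu-latest, macos-latest]
            category: [continuum, rpacket_tracking]
        runs-on: ${{ matrix.os }}
        steps:
          - uses: actions/checkout@v4
          - run: pytest tardis -m ${{ matrix.category }}  # hypothetical marker usage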


Authors pipeline
@@ -315,44 +89,7 @@ In the near future we want to auto-update the citation guidelines in the
Release pipeline
================

Publishes a new release of TARDIS every sunday at 00:00 UTC.


Compare reference data pipeline
===============================

This pipeline compares two versions of the reference data. It's triggered manually via
the Azure Pipelines web UI, or when a TARDIS contributor leaves the following comment
on a pull request:
::

    /AzurePipelines run compare-refdata

For brevity, you can comment using ``/azp`` instead of ``/AzurePipelines``.

By default, the pipeline generates new reference data for the ``HEAD`` of the pull
request and then compares it against the latest reference data stored in the
``tardis-refdata`` repository. If you want to compare two different labels
(SHAs, branches, tags, etc.), uncomment and set the ``ref1.hash`` and
``ref2.hash`` variables in ``.github/workflows/compare-refdata.yml`` on your
pull request. For example:

.. code-block:: yaml

    ref1.hash: 'upstream/pr/11'
    ref2.hash: 'upstream/master'

The web UI also allows comparing any version of the reference data by providing
those variables at runtime, but access to the dashboard is restricted to a small
group of developers.

.. warning:: If using the Azure dashboard, do not define ``ref1.hash`` and ``ref2.hash``
             between quotation marks or **the pipeline will fail**. This does not apply to
             the YAML file.

Finally, the report is uploaded to the
`OpenSupernova.org server <http://opensupernova.org/~azuredevops/files/refdata-results/>`_
following the ``<pr>/<commit>`` folder structure. If the pipeline fails, a report
is still generated, but it does not necessarily give useful debug information
(depending on which step the pipeline failed).
Publishes a new release of TARDIS every Sunday at 00:00 UTC.
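
In GitHub Actions terms, a weekly Sunday 00:00 UTC run corresponds to a
``schedule`` trigger like the following sketch:

.. code-block:: yaml

    on:
      schedule:
        - cron: '0 0 * * 0'  # every Sunday at 00:00 UTC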


TARDIS Carsus Compatibility Check
@@ -371,4 +108,4 @@ The Save Atomic Files Workflow
The Save Atomic Files workflow runs every week but can also be triggered manually.
It runs the "Bridge" and sends an artifact containing the generated atomic data file
and the comparison notebook to Moria. This workflow has a separate job to indicate if the
bridge has failed.
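
A weekly run that can also be started by hand combines ``schedule`` with
``workflow_dispatch``; a minimal sketch (the exact day is an assumption):

.. code-block:: yaml

    on:
      schedule:
        - cron: '0 0 * * 0'   # weekly; the exact day is an assumption
      workflow_dispatch:      # allows manual runs from the Actions tab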
2 changes: 1 addition & 1 deletion docs/contributing/development/index.rst
@@ -30,6 +30,6 @@ the core team (active maintainers) of TARDIS.
:maxdepth: 2

continuous_integration
update_refdata
update_regression_data
matterbridge
debug_numba
1 change: 0 additions & 1 deletion docs/contributing/development/links.inc
@@ -2,5 +2,4 @@
.. include:: known_projects.inc
.. include:: this_project.inc
.. include:: git_links.inc
.. include:: azure_links.inc
.. include:: matterbridge.inc