Execute new commondata generation in CI #2099

Open · wants to merge 10 commits into master
34 changes: 34 additions & 0 deletions .github/workflows/check_newcd.yml
@@ -0,0 +1,34 @@
# A CI workflow (GitHub Actions) to test the new commondata
name: Test new commondata

on:
  push:
  workflow_dispatch:

jobs:
  test-commondata:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4
        with:
          fetch-tags: true
          fetch-depth: 0
      - name: Install NNPDF data package 🐍
        run: pip install ./nnpdf_data/'[filter]'
      - name: Run the filters 📦
        shell: bash -l {0}
        run: |
          here=$PWD
          readarray -d '' array < <(find ./nnpdf_data/nnpdf_data/new_commondata -name "filter.py" -print0)
          for datname in "${array[@]}"; do dirpath=${datname%/*}; cd $dirpath; python filter.py || exit $?; cd $here; done
      - name: Check for modified files 🛎️
        uses: tj-actions/verify-changed-files@v20
        id: verify-changed-files
        with:
          fail-if-changed: "false"
      - name: List all changed tracked and untracked files 🛎️
        env:
          CHANGED_FILES: ${{ steps.verify-changed-files.outputs.changed_files }}
        run: |
          echo "Changed files: $CHANGED_FILES"
@@ -5,6 +5,17 @@
import pandas as pd
import yaml

# Definition of various topologies used in Polarized Dijets
# NOTE: the observable is symmetric for jet1 and jet2,
# so 1 and 2 are not ordered in pT.
TOPO_DEF = {
    "A": {"abs_eta1_min": 0.3, "abs_eta1_max": 0.9, "abs_eta2_min": 0.3, "abs_eta2_max": 0.9},
    "B": {"abs_eta1_min": 0, "abs_eta1_max": 0.3, "abs_eta2_min": 0.3, "abs_eta2_max": 0.9},
    "C": {"abs_eta1_min": 0, "abs_eta1_max": 0.3, "abs_eta2_min": 0, "abs_eta2_max": 0.3},
    "D": {"abs_eta1_min": 0.3, "abs_eta1_max": 0.9, "abs_eta2_min": 0.3, "abs_eta2_max": 0.9},
    "I": {"abs_eta_min": 0, "abs_eta_max": 0.9},
}
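An illustration (not part of this diff) of how a filter could translate one of these topology labels into the corresponding |eta| bounds when building a kinematic bin; TOPO_DEF is taken from the diff above, while the helper function and its usage are hypothetical:

def dijet_eta_bounds(topology: str) -> dict:
    # Hypothetical helper: look up the |eta| ranges for a dijet topology label.
    try:
        return TOPO_DEF[topology]
    except KeyError as err:
        raise ValueError(f"Unknown dijet topology: {topology}") from err

# Topology "B" pairs one jet with |eta| < 0.3 and one with 0.3 < |eta| < 0.9.
bounds = dijet_eta_bounds("B")
assert bounds["abs_eta1_max"] == 0.3 and bounds["abs_eta2_min"] == 0.3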


def read_central_values(path: Path) -> np.ndarray:
    """Read the central values from the theory predictions.
@@ -2,7 +2,7 @@

import numpy as np

-from nnpdf_data.filter_utils.eic_utils import (
+from nnpdf_data.filter_utils.poldata_utils import (
    fluctuate_data,
    read_central_values,
    read_excel,
@@ -2,7 +2,7 @@

import numpy as np

-from nnpdf_data.filter_utils.eic_utils import read_excel, write_data
+from nnpdf_data.filter_utils.poldata_utils import read_excel, write_data

np.random.seed(1234567890)

@@ -2,7 +2,7 @@

import numpy as np

-from nnpdf_data.filter_utils.eic_utils import (
+from nnpdf_data.filter_utils.poldata_utils import (
    fluctuate_data,
    read_central_values,
    read_excel,
@@ -2,7 +2,7 @@

import numpy as np

-from nnpdf_data.filter_utils.eic_utils import (
+from nnpdf_data.filter_utils.poldata_utils import (
    fluctuate_data,
    read_central_values,
    read_excel,
@@ -2,7 +2,7 @@

import numpy as np

-from nnpdf_data.filter_utils.eic_utils import (
+from nnpdf_data.filter_utils.poldata_utils import (
    fluctuate_data,
    read_central_values,
    read_excel,