Merged
Changes from all commits
30 commits
69de2c1
quick fix of emulators
corentinravoux Dec 3, 2025
9f0a9dd
Fixing all ruff linting warnings except __init__
corentinravoux Dec 3, 2025
d2ed309
Regeneration of all docstrings from Copilot with Google format
corentinravoux Dec 3, 2025
e55c754
action correction test
corentinravoux Dec 3, 2025
403fd89
fix
corentinravoux Dec 3, 2025
7a735af
correct one messy pydoc
corentinravoux Dec 4, 2025
68b7392
installing cosmoprimo for the tests
corentinravoux Dec 4, 2025
6421db8
Correcting a bug introduced by BASTIEEEEEEENNNN :p
corentinravoux Dec 5, 2025
70dc396
fixing bugs
corentinravoux Dec 11, 2025
c7668cd
changing flip data to include more realistic velocity and density fie…
corentinravoux Dec 18, 2025
08d7201
recreate all the tests
corentinravoux Dec 18, 2025
f8abc4f
fix tests
corentinravoux Dec 19, 2025
e705e7b
Reorganization of the package to start the inclusion of other submodu…
corentinravoux Dec 19, 2025
cf6e545
fix
corentinravoux Dec 19, 2025
30ca043
correcting all the scripts and fixing the bugs introduced
corentinravoux Dec 19, 2025
b11f12a
fix install in tests
corentinravoux Dec 19, 2025
2966259
test removing conftest
corentinravoux Jan 8, 2026
8bc866f
moving all the models into analytical, cleaner structure
corentinravoux Jan 8, 2026
f6fac2e
correcting the main notebooks
corentinravoux Jan 8, 2026
a453e1e
moving secret logo call
corentinravoux Jan 15, 2026
af4f0f3
fix
corentinravoux Jan 19, 2026
54f1798
fix tests
corentinravoux Jan 19, 2026
b3bfa66
retiring Fisher vectors, as the regular vectors work, and the Fisher …
corentinravoux Jan 19, 2026
dc065a4
beginning of fisher debugging, rcrk24 seems bugged right now, to check
corentinravoux Jan 19, 2026
26ec6a8
small fix for fisher and zdep models
corentinravoux Jan 23, 2026
9b83996
fix all notebooks
corentinravoux Jan 23, 2026
6e94637
fix one notebook
corentinravoux Jan 23, 2026
7d2c31d
fix bug
corentinravoux Jan 23, 2026
ab947d6
copilot reviews
corentinravoux Jan 23, 2026
55124bf
revert shitty AI comments
corentinravoux Jan 23, 2026
4 changes: 2 additions & 2 deletions .github/workflows/bumpversion.yml
@@ -14,7 +14,7 @@ jobs:
steps:
- uses: actions/checkout@v4
with:
token: ${{ secrets.PAT_MW_TOKEN_ACTION }}
token: ${{ secrets.GITHUB_TOKEN }}

- name: setup git
run: |
@@ -46,5 +46,5 @@ jobs:
- name: Push changes
uses: ad-m/github-push-action@master
with:
github_token: ${{ secrets.PAT_MW_TOKEN_ACTION }}
github_token: ${{ secrets.GITHUB_TOKEN }}
tags: true
9 changes: 2 additions & 7 deletions .github/workflows/python-package.yml
@@ -27,14 +27,9 @@ jobs:
- name: Install dependencies
run: |
python -m pip install --upgrade pip
python -m pip install flake8 pytest
python -m pip install pytest fastparquet sympy
python -m pip install git+https://github.com/cosmodesi/cosmoprimo
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
- name: Lint with flake8
run: |
# stop the build if there are Python syntax errors or undefined names
flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
# exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
- name: Test with pytest
run: |
pytest
12 changes: 6 additions & 6 deletions docs/DataVector.rst
@@ -3,7 +3,7 @@ DataVector class

FLIP includes a :py:class:`~flip.data_vector.basic.DataVector` abstract class that is used to build different classes
that make it easy to handle the data you want to use and to pass it to one of the likelihoods
implemented in the :py:mod:`flip.likelihood` module.
implemented in the :py:mod:`flip.covariance.likelihood` module.


Using the :py:class:`~flip.data_vector.basic.DataVector` class you can obtain data and variance / covariance:
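The code-block that belongs here is collapsed in this capture. As a placeholder, here is a minimal, hedged sketch of building a density data vector from the example file, using only the constructor that appears later on this page (the accessors for the data and its variance / covariance are not visible here and are therefore omitted):

import pandas as pd
from flip import data_vector

# Load the example density grid shipped with flip and match the expected column names.
grid = pd.read_parquet("flip/flip/data/data_density.parquet")
grid.rename(columns={'density_err': 'density_error', 'rcom': 'rcom_zobs'}, inplace=True)

# Build the density DataVector; the data and its variance / covariance are then
# obtained through the DataVector interface (collapsed in this diff).
DataDensity = data_vector.Dens(grid.to_dict(orient='list'))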
@@ -44,7 +44,7 @@ The :py:class:`~flip.data_vector.basic.Dens` class is used on example data as:
import pandas as pd
from flip import data_vector

grid = pd.read_parquet("flip/flip/data/density_data.parquet")
grid = pd.read_parquet("flip/flip/data/data_density.parquet")
grid.rename(columns={'density_err': 'density_error',
'rcom': 'rcom_zobs'}, inplace=True)
DataDensity = data_vector.Dens(grid.to_dict(orient='list'))
@@ -65,7 +65,7 @@ The :py:class:`~flip.data_vector.basic.DirectVel` class is used on example data
import numpy as np
from flip import data_vector

data_velocity = pd.read_parquet("flip/flip/data/velocity_data.parquet")
data_velocity = pd.read_parquet("flip/flip/data/data_velocity.parquet")
data_velocity.rename(columns={'vpec': 'velocity'}, inplace=True)
data_velocity["velocity_error"] = np.zeros(len(data_velocity["velocity"]))

@@ -108,7 +108,7 @@ The DataVector is initialised as:
import pandas as pd
from flip import data_vector

data_velocity = pd.read_parquet("flip/flip/data/velocity_data.parquet")
data_velocity = pd.read_parquet("flip/flip/data/data_velocity.parquet")
DataVel = data_vector.snia_vectors.VelFromSALTfit(
data_velocity.to_dict(orient='list'),
velocity_estimator='full'
Expand Down Expand Up @@ -142,13 +142,13 @@ It is initialised as:
import pandas as pd
from flip import data_vector

grid = pd.read_parquet("flip/flip/data/density_data.parquet")
grid = pd.read_parquet("flip/flip/data/data_density.parquet")
grid.rename(columns={'density_err': 'density_error',
'rcom': 'rcom_zobs'}, inplace=True)

DataDensity = data_vector.Dens(grid.to_dict(orient='list'))

data_velocity = pd.read_parquet("flip/flip/data/velocity_data.parquet")
data_velocity = pd.read_parquet("flip/flip/data/data_velocity.parquet")
DataVel = data_vector.snia_vectors.VelFromSALTfit(
data_velocity.to_dict(orient='list'),
velocity_estimator='full'
14 changes: 2 additions & 12 deletions flip/__init__.py
@@ -5,18 +5,8 @@
from flip.utils import create_log

log = create_log()
from . import (
covariance,
data,
data_vector,
fisher,
fitter,
gridding,
likelihood,
power_spectra,
utils,
)
from .plot_utils import __secret_logo__
from . import covariance, data, data_vector, power_spectra, utils
from .utils import __secret_logo__

try:
import jax
File renamed without changes.
16 changes: 7 additions & 9 deletions flip/covariance/__init__.py
@@ -1,16 +1,14 @@
"""Init file of the flip.covariance package."""

from . import (
adamsblake17,
adamsblake17plane,
adamsblake20,
carreres23,
analytical,
contraction,
cov_utils,
emulators,
lai22,
ravouxcarreres,
ravouxnoanchor25,
rcrk24,
fisher,
fitter,
generator,
likelihood,
symbolic,
)
from .covariance import CovMatrix

11 changes: 11 additions & 0 deletions flip/covariance/analytical/__init__.py
@@ -0,0 +1,11 @@
from . import (
adamsblake17,
adamsblake17plane,
adamsblake20,
carreres23,
genericzdep,
lai22,
ravouxcarreres,
ravouxnoanchor25,
rcrk24,
)
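For orientation (an editor's sketch, not part of the diff): assuming the two __init__.py files above resolve as shown, the analytical models are now reached through the flip.covariance.analytical subpackage, e.g.

from flip.covariance import CovMatrix, fitter, likelihood
from flip.covariance.analytical import carreres23, rcrk24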
@@ -8,20 +8,49 @@


def angle_between(ra_0, ra_1, dec_0, dec_1):
"""Compute cos of the angle between r0 and r1."""
"""Compute cos of the angle between two sky directions.

Args:
ra_0: Right ascension of first object (radians).
ra_1: Right ascension of second object (radians).
dec_0: Declination of first object (radians).
dec_1: Declination of second object (radians).

Returns:
Cosine of the angular separation.
"""
cos_alpha = np.cos(ra_1 - ra_0) * np.cos(dec_0) * np.cos(dec_1) + np.sin(
dec_0
) * np.sin(dec_1)
return cos_alpha


def separation(r_0, r_1, cos_alpha):
"""Compute separation between r_0 and r_1."""
"""Compute comoving separation given distances and angular cosine.

Args:
r_0: Comoving distance of first object.
r_1: Comoving distance of second object.
cos_alpha: Cosine of angular separation.

Returns:
Scalar separation between points.
"""
return np.sqrt(r_0**2 + r_1**2 - 2 * r_0 * r_1 * cos_alpha)
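As a reading aid (not part of the diff): the two helpers above implement the spherical law of cosines for the angular separation and the planar law of cosines for the pair separation,

$$\cos\alpha = \cos(\mathrm{ra}_1 - \mathrm{ra}_0)\cos\delta_0\cos\delta_1 + \sin\delta_0\sin\delta_1, \qquad s = \sqrt{r_0^2 + r_1^2 - 2\, r_0 r_1 \cos\alpha}.$$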


def window_vv(r_0, r_1, cos_alpha, sep, j0kr, j2kr):
"""Note: here, the bisector angle definition is used to compute"""
"""Wide-angle window for vv using bisector definition.

Args:
r_0, r_1: Comoving distances of the two objects.
cos_alpha: Cosine of angle between directions.
sep: Comoving separation.
j0kr, j2kr: Spherical Bessel terms evaluated on ``k*sep``.

Returns:
Window values per k contributing to vv covariance integral.
"""
win = 1 / 3 * (j0kr + j2kr)
alpha = np.arccos(np.clip(cos_alpha, -1.0, 1.0))
phi = cov_utils.compute_phi_bisector_theorem(sep, alpha, r_0, r_1)
@@ -30,19 +59,49 @@ def window_vv(r_0, r_1, cos_alpha, sep, j0kr, j2kr):


def window_vg(r_0, r_1, cos_alpha, sep, j1kr):
"""Note: here, the bisector angle definition is used to compute"""
"""Wide-angle window for gv using bisector definition.

Args:
r_0, r_1: Comoving distances of the two objects.
cos_alpha: Cosine of angle between directions.
sep: Comoving separation.
j1kr: Spherical Bessel term ``j_1(k*sep)``.

Returns:
Window values per k contributing to gv covariance integral.
"""
alpha = np.arccos(np.clip(cos_alpha, -1.0, 1.0))
phi = cov_utils.compute_phi_bisector_theorem(sep, alpha, r_0, r_1)
win = j1kr * np.cos(phi)
return win


def intp(win, k, pk):
"""Integrate window times spectrum over k using trapezoidal rule.

Args:
win: Window array per k.
k: Wavenumber grid.
pk: Spectrum values at k.

Returns:
Scalar integral value.
"""
pint = win.T * pk
return np.trapz(pint, x=k)
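As an illustration (not part of the diff), a self-contained toy of what intp evaluates, namely the trapezoidal approximation of the integral of window times spectrum over k; the arrays below are synthetic placeholders, not a real window or power spectrum:

import numpy as np

k = np.linspace(1e-3, 1.0, 500)        # wavenumber grid
pk = 1.0 / (1.0 + (k / 0.1) ** 2)      # toy spectrum sampled at k
win = np.sin(50.0 * k) / (50.0 * k)    # toy window sampled on the same k grid
value = np.trapz(win.T * pk, x=k)      # same operation as intp(win, k, pk)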


def compute_coef_gg(k, pk, coord):
"""Coefficient contributing to gg covariance for a single pair.

Args:
k: Wavenumber grid.
pk: Matter spectrum values at k.
coord: Tuple/list ``(ra_i, ra_j, dec_i, dec_j, r_i, r_j)``.

Returns:
Scalar integral value for gg.
"""
cos = angle_between(coord[0], coord[1], coord[2], coord[3])
sep = separation(coord[4], coord[5], cos)
ksep = np.outer(k, sep)
@@ -52,6 +111,13 @@ def compute_coef_gg(k, pk, coord):


def compute_coef_gv(k, pk, coord):
"""Coefficient contributing to gv covariance for a single pair.

Args mirror those of ``compute_coef_gg`` but using cross-spectrum and window.

Returns:
Scalar integral value for gv.
"""
cos = angle_between(coord[0], coord[1], coord[2], coord[3])
sep = separation(coord[4], coord[5], cos)
ksep = np.outer(k, sep)
@@ -62,6 +128,13 @@ def compute_coef_gv(k, pk, coord):


def compute_coef_vv(k, pk, coord):
"""Coefficient contributing to vv covariance for a single pair.

Args mirror those of ``compute_coef_gg`` but using velocity spectrum and window.

Returns:
Scalar integral value for vv.
"""
cos = angle_between(coord[0], coord[1], coord[2], coord[3])
sep = separation(coord[4], coord[5], cos)
ksep = np.outer(k, sep)
@@ -81,6 +154,18 @@ def compute_coef_vv(k, pk, coord):
size_batch=100_000,
number_worker=8,
):
"""Compute velocity-velocity covariance (plane-parallel wide-angle bisector).

Args:
ra_v, dec_v, rcomov_v: Velocity tracer coordinates and distances.
wavenumber: Wavenumber grid for velocity spectrum.
power_spectrum: Spectrum values at wavenumber.
size_batch: Number of pairs per batch.
number_worker: Number of parallel workers.

Returns:
Flattened covariance vector with variance at index 0.
"""
N = len(ra_v)
n_task = int((N * (N + 1)) / 2) - N
batches = []
@@ -114,6 +199,10 @@ def covariance_gv(
size_batch=100_000,
number_worker=8,
):
"""Compute density-velocity covariance (plane-parallel wide-angle bisector).

Returns flattened covariance vector with variance at index 0.
"""
number_objects_g = len(ra_g)
number_objects_v = len(ra_v)

@@ -147,6 +236,7 @@ def covariance_gg(
size_batch=100_000,
number_worker=8,
):
"""Compute density-density covariance (plane-parallel)."""
N = len(ra_g)
n_task = int((N * (N + 1)) / 2) - N
batches = []
@@ -176,23 +266,19 @@ def generate_covariance(
coordinates_density=None,
**kwargs,
):
"""
The generate_covariance function generates the covariance matrix for a given model type, power spectrum, and coordinates.
"""Generate covariance blocks using Adams & Blake (2017) plane approximation.

Wide-angle definition uses the bisector. Supports ``gg``, ``vv``, and ``gv``.

Args:
model_kind: Determine which covariance matrices are generated, and the coordinates_density and coordinates_velocity parameters are used to generate the covariance matrices
power_spectrum_dict: Pass the power spectrum to the function
coordinates_velocity: Define the coordinates of the velocity field
coordinates_density: Define the coordinates of the density field
**kwargs: Pass keyword arguments to the function
: Generate the covariance matrix for a given model
The wide angle definition is bisector.
model_kind: One of ``"density"``, ``"velocity"``, ``"full"``, ``"density_velocity"``.
power_spectrum_dict: Dict providing required spectra grids and values.
coordinates_velocity: Tuple ``(ra, dec, rcom)`` for velocity tracers.
coordinates_density: Tuple ``(ra, dec, rcom)`` for density tracers.
**kwargs: Forwarded to low-level covariance functions (batch/workers).

Returns:
A dictionary of covariance matrices

Doc Author:
Trelent
Tuple ``(covariance_dict, number_densities, number_velocities, los_definition)``.
"""
cov_utils.check_generator_need(
model_kind,
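The remainder of generate_covariance is collapsed here. To make the documented interface concrete (an editor's sketch, not part of the diff): the call below uses only the argument names and the return tuple described in the docstring above; the layout of power_spectrum_dict and the toy spectrum are assumptions, since the real structure is not visible in this diff.

import numpy as np

# Toy velocity-tracer coordinates (radians and comoving distances are assumed units).
ra = np.random.uniform(0.0, 2.0 * np.pi, 50)
dec = np.random.uniform(-0.5, 0.5, 50)
rcom = np.random.uniform(50.0, 150.0, 50)

# Assumed power-spectrum container: one velocity-velocity term on a k grid.
k = np.logspace(-3, 0, 300)
pk_vv = 1.0e4 / (1.0 + (k / 0.05) ** 2)
power_spectrum_dict = {"vv": [[k, pk_vv]]}  # assumed layout, not confirmed by this diff

covariance_dict, n_densities, n_velocities, los_definition = generate_covariance(
    model_kind="velocity",
    power_spectrum_dict=power_spectrum_dict,
    coordinates_velocity=(ra, dec, rcom),
    size_batch=10_000,   # forwarded to the low-level covariance functions
    number_worker=2,
)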