# ledger: Epistemic-First Cosmological Inference

`ledger` is the reference implementation of the Parochial by Construction (PbC) research framework. It provides the tooling to audit cosmological datasets for evidence of "Context Coupling": the hypothesis that the history of the universe is not a pre-existing territory but a dependency network fixed by the act of observation.
Standard cosmological inference (Temporal Evolution, or TE) assumes the past is fixed and independent of the observer. Causal Evolution (CE) proposes that the "Past" is a resource consumed to explain the "Present." The specific constraints of the observer (the Context) impose a tax on the history that can be reconstructed.
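The CE "tax" can be made concrete with a toy posterior. The sketch below is a hypothetical illustration, not the `pbc.dual` implementation: the context contributes a quadratic penalty `(lam/2)*(h - c)**2` that pulls the reconstructed history toward what the observer's context can support.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: a scalar "history" h observed through noisy data d_i = h + n_i.
h_true, sigma = 1.0, 0.1
d = h_true + sigma * rng.standard_normal(100)

# TE: the past is a fixed, observer-independent territory -> flat-prior
# maximum likelihood.
h_te = d.mean()

# CE (illustrative): the context c levies a quadratic tax (lam/2)*(h - c)**2
# on the reconstructed history. The posterior maximum becomes a
# precision-weighted compromise between the data and the context.
c, lam = 0.0, 2000.0
n = len(d)
h_ce = (d.sum() / sigma**2 + lam * c) / (n / sigma**2 + lam)

print(f"TE estimate: {h_te:.3f}, CE estimate: {h_ce:.3f}")
```

As `lam -> 0` the CE estimate reduces to the TE one; a heavier tax drags the recoverable history toward the context.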
The Ledger Protocol identifies three specific "Phase-Locking" tracks:
- P1 (Primordial): The coupling of the CMB to the satellite's scan strategy.
- P6 (Early Structure): The coupling of the first galaxies to the deep-field window.
- P5 (Late Structure): The coupling of local clustering to target-selection bookkeeping.
## Current Status

All three tracks currently return null results:

- P1 (Planck): NULL.
  - Initial finding: strong coupling (9.8σ).
  - Audit: vanished upon foreground cleaning (1.3σ).
  - Cause: Solar System foreground contamination (zodiacal light).
- P6 (COSMOS): NULL.
  - Initial finding: strong rotational anisotropy (11.2σ).
  - Audit: vanished upon masking/shuffling.
  - Cause: survey window function (rectangular grid) aliasing.
- P5 (DESI): NULL.
  - Finding: consistent with standard ΛCDM stationarity.
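The audits above share a common pattern: estimate a coupling amplitude between the cosmological record and the context template, then calibrate its significance against a shuffled null distribution, in the spirit of the masking/shuffling checks that demoted the initial findings. A minimal numpy sketch, with flat random arrays standing in for HEALPix maps and the estimator an illustrative normalized cross-correlation (not the actual `pbc.stats` code):

```python
import numpy as np

rng = np.random.default_rng(42)

def s_gamma(record, context):
    """Illustrative coupling amplitude: normalized cross-correlation."""
    r = record - record.mean()
    c = context - context.mean()
    return float(r @ c / np.sqrt((r @ r) * (c @ c)))

npix = 10_000
context = rng.standard_normal(npix)   # stand-in for a scan-strategy template
record = rng.standard_normal(npix)    # stand-in for a cleaned sky map

obs = s_gamma(record, context)

# Null distribution: destroy any pixel-level alignment by shuffling the record.
nulls = np.array([s_gamma(rng.permutation(record), context) for _ in range(500)])
z = (obs - nulls.mean()) / nulls.std()
print(f"S_gamma = {obs:.4f}, significance = {z:.1f} sigma")
```

On independent random maps, as here, the amplitude should sit well inside the null distribution.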
## Repository Layout

```
ledger/
├── data/
│   ├── raw/                   # Planck NPIPE (R4.00), DESI Iron, COSMOS2020 - ignored by git
│   ├── processed/             # P1/P5/P6 audit results (.json) & decoherence plots
│   └── mocks/                 # Low-res synthetic data for CI/testing
├── docs/
│   └── tex/                   # LaTeX sources for research programs (P1, P2)
├── notebooks/                 # Jupyter notebooks for prototyping and visualization
├── scripts/
│   ├── download_planck.sh     # Fetcher for public NPIPE/PR3 data
│   ├── generate_mock_data.py  # Generates synthetic data for testing
│   ├── 01_context_builder.py  # Step 1: raw maps -> context vector (c)
│   ├── 02_planck_p1_audit.py  # Differential (HRD) phase-locking audit
│   ├── 03_desi_p5_audit.py    # Self-consistent clustering commutator
│   ├── 06_jwst_p6_audit.py    # High-z field variance estimator
│   ├── download_*.sh          # Integrated fetchers for IRSA and NERSC mirrors
│   └── run_injection.py       # Validation: sensitivity testing
├── src/
│   └── pbc/                   # Core Python package
│       ├── context.py         # Logic for building T_ctx from exposure maps
│       ├── decoherence.py     # Logic for cross-epoch significance mapping
│       ├── dual.py            # Causal Evolution posterior math (Woodbury identities)
│       ├── stats.py           # Estimators for P1-P6 diagnostics
│       └── utils/             # I/O and HEALPix wrappers
├── tests/
│   └── science/               # Physics validation (null tests, symmetry checks)
└── pyproject.toml             # Package definition and dependencies
```
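`dual.py` is described as using Woodbury identities for the Causal Evolution posterior math. The identity itself, which lets a low-rank context update of a large, cheap-to-invert covariance be inverted without an O(n³) solve, is easy to check numerically (the matrices below are arbitrary illustrative stand-ins):

```python
import numpy as np

rng = np.random.default_rng(1)

# Woodbury: (A + U C V)^-1 = A^-1 - A^-1 U (C^-1 + V A^-1 U)^-1 V A^-1
n, k = 200, 3                            # large dimension, low-rank update
A = np.diag(rng.uniform(1.0, 2.0, n))    # cheap-to-invert base covariance
U = rng.standard_normal((n, k))
C = np.eye(k)
V = U.T

A_inv = np.diag(1.0 / np.diag(A))        # O(n) inverse of the diagonal base
inner = np.linalg.inv(np.linalg.inv(C) + V @ A_inv @ U)  # only a k x k solve
woodbury = A_inv - A_inv @ U @ inner @ V @ A_inv

direct = np.linalg.inv(A + U @ C @ V)    # O(n^3) reference inverse
err = np.max(np.abs(woodbury - direct))
print(err)  # should be at machine-precision level
```

The payoff is that only the k×k inner matrix ever needs a dense solve, which is what makes a low-rank "context update" of a large posterior covariance tractable.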
## Installation

Prerequisites: Python 3.9+

```bash
# 1. Clone the repository
git clone https://github.com/your-org/ledger.git
cd ledger

# 2. Create a virtual environment (recommended)
python -m venv .venv
source .venv/bin/activate

# 3. Install in editable mode (includes numpy, healpy, tqdm)
pip install -e ".[dev]"
```

## Quickstart (Simulation Mode)

If you cannot download the 2 TB Planck archive, use Simulation Mode. It generates synthetic "Standard Model" skies and "Dipole" scan strategies to verify the pipeline logic.
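Conceptually, such a mock pairs a statistically isotropic sky with an exposure pattern modulated along a fixed axis. A hypothetical numpy sketch (flat arrays standing in for HEALPix maps; real code would use `healpy` pixel vectors, and all names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
npix = 3072  # e.g. NSIDE=16 gives 12 * 16**2 = 3072 pixels

# Isotropic "Standard Model" sky: Gaussian noise with no preferred direction.
sky = rng.standard_normal(npix)

# "Dipole" scan strategy: exposure modulated along one axis.
# z is a stand-in for pixel z-coordinates (cf. healpy.pix2vec).
z = np.linspace(-1.0, 1.0, npix)
exposure = 1.0 + 0.5 * z          # more integration time toward +z

# Context template: the mean-subtracted, unit-normalized exposure ("cost vector" c).
c = exposure - exposure.mean()
c /= np.linalg.norm(c)

print(sky.std(), c @ c)
```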
- Generate Mock Data: creates synthetic FITS files in `data/raw/`.

  ```bash
  python scripts/generate_mock_data.py
  ```

- Build the Context Template: extracts the "cost vector" (c) from the mock scan strategy.

  ```bash
  python scripts/01_context_builder.py --nside 512
  ```

- Audit the Ledger: cross-correlates the mock record with the mock context.

  ```bash
  python scripts/02_planck_p1_audit.py --nside 512
  # Expected: S_gamma ~ 0.0 (null result)
  ```

## Full Analysis (Planck Data)

To run the actual scientific analysis on Planck 2018/PR4 data:
- Download Data: fetches the necessary frequency and component maps (requires ~2 GB).

  ```bash
  ./scripts/download_planck.sh
  ```

- Run the High-Res Analysis:
  ```bash
  # Step 1: Build the context (NSIDE 2048)
  python scripts/01_context_builder.py --nside 2048

  # Step 2: Audit the record
  python scripts/02_planck_p1_audit.py --nside 2048
  ```

## Validation

We adhere to a strict "Stop-Loss" protocol: before claiming a result, the pipeline must pass Null (bias) and Injection (sensitivity) tests.
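The logic behind both validation gates can be sketched end to end: inject a known coupling λ into clean mock skies and check that a simple estimator recovers it, and returns ~0 when nothing is injected. This is a hypothetical illustration with an assumed least-squares estimator, not the `run_injection.py` implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

npix, lam_inj, nsims = 5000, 0.05, 100
context = rng.standard_normal(npix)   # stand-in scan-strategy template

def recover(lam):
    """Mean least-squares coupling estimate over nsims mock skies."""
    hats = []
    for _ in range(nsims):
        record = rng.standard_normal(npix) + lam * context  # mock sky + injection
        hats.append((record @ context) / (context @ context))
    return float(np.mean(hats))

lam_null = recover(0.0)      # null (bias) test: should be consistent with 0
lam_rec = recover(lam_inj)   # sensitivity test: should recover ~0.05

print(f"null: {lam_null:+.4f}, recovered: {lam_rec:.4f} (injected {lam_inj})")
```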
- Null Test: verify Sγ is consistent with 0 on random skies.

  ```bash
  pytest tests/science/test_null.py
  ```

- Sensitivity Test: verify the pipeline can recover a hidden injected signal (λ = 0.05).

  ```bash
  python scripts/run_injection.py --lambda-inj 0.05 --nsims 100
  ```

## License

MIT
