Ledger

Epistemic-First Cosmological Inference

ledger is the reference implementation for the Parochial by Construction (PbC) research framework. It provides the tooling to audit cosmological datasets for evidence of "Context Coupling"—the hypothesis that the history of the universe is not a pre-existing territory, but a dependency network fixed by the act of observation.

1. Theory & Intention

Standard cosmological inference (Temporal Evolution, or TE) assumes the past is fixed and independent of the observer. Causal Evolution (CE) proposes that the "Past" is a resource consumed to explain the "Present." The specific constraints of the observer (the Context) impose a tax on the history that can be reconstructed.

The Ledger Protocol identifies three specific "Phase-Locking" tracks:

  1. P1 (Primordial): The coupling of the CMB to the satellite's scan strategy.
  2. P6 (Early Structure): The coupling of the first galaxies to the deep-field window.
  3. P5 (Late Structure): The coupling of local clustering to target-selection bookkeeping.

2. Global Audit Results (Updated 2025-12-29)

  1. P1 (Planck): NULL.
    • Initial Finding: Strong coupling (9.8σ).
    • Audit: Vanished upon foreground cleaning (1.3σ).
    • Cause: Solar System foreground contamination (Zodiacal Light).
  2. P6 (COSMOS): NULL.
    • Initial Finding: Strong rotational anisotropy (11.2σ).
    • Audit: Vanished upon masking/shuffling.
    • Cause: Survey window function (Rectangular Grid) aliasing.
  3. P5 (DESI): NULL.
    • Finding: Consistent with standard ΛCDM stationarity.

3. Repository Structure

ledger/
├── data/
│   ├── raw/                    # Planck NPIPE (R4.00), DESI Iron, COSMOS2020 - Ignored by git
│   ├── processed/              # P1/P5/P6 Audit Results (.json) & Decoherence Plots
│   └── mocks/                  # Low-res synthetic data for CI/Testing
├── docs/
│   └── tex/                    # LaTeX sources for Research Programs (P1, P2)
├── notebooks/                  # Jupyter notebooks for prototyping and visualization
├── scripts/
│   ├── download_planck.sh      # Fetcher for public NPIPE/PR3 data
│   ├── generate_mock_data.py   # Generates synthetic data for testing
│   ├── 01_context_builder.py   # Step 1: Raw Maps -> Context Vector (c)
│   ├── 02_planck_p1_audit.py   # Differential (HRD) Phase-Locking Audit
│   ├── 03_desi_p5_audit.py     # Self-consistent clustering commutator
│   ├── 06_jwst_p6_audit.py     # High-z field variance estimator
│   ├── download_*.sh           # Integrated fetchers for IRSA and NERSC mirrors
│   └── run_injection.py        # Validation: Sensitivity testing
├── src/
│   └── pbc/                    # Core Python Package
│       ├── context.py          # Logic for building T_ctx from exposure maps
│       ├── decoherence.py      # Logic for cross-epoch significance mapping
│       ├── dual.py             # Causal Evolution posterior math (Woodbury identities)
│       ├── stats.py            # Estimators for P1-P6 diagnostics
│       └── utils/              # I/O and HEALPix wrappers
├── tests/
│   └── science/                # Physics validation (Null tests, Symmetry checks)
└── pyproject.toml              # Package definition and dependencies
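
The Woodbury identities mentioned for src/pbc/dual.py are the standard trick for applying a low-rank "context" correction to a covariance inverse without re-inverting the full matrix. The sketch below only illustrates that identity; the function name and array shapes are assumptions for this README, not the pbc.dual API.

import numpy as np

def woodbury_inverse(A_inv, U, C, V):
    # (A + U C V)^{-1} = A^{-1} - A^{-1} U (C^{-1} + V A^{-1} U)^{-1} V A^{-1}
    # A_inv: (n, n), U: (n, k), C: (k, k), V: (k, n); only a (k, k) inverse is needed.
    inner = np.linalg.inv(np.linalg.inv(C) + V @ A_inv @ U)
    return A_inv - A_inv @ U @ inner @ V @ A_inv

# Self-check against a direct inverse on a small random system.
rng = np.random.default_rng(0)
n, k = 50, 3
A, U, C = 2.0 * np.eye(n), rng.normal(size=(n, k)), 0.1 * np.eye(k)
assert np.allclose(np.linalg.inv(A + U @ C @ U.T),
                   woodbury_inverse(np.linalg.inv(A), U, C, U.T))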

4. Installation

Prerequisites: Python 3.9+

# 1. Clone the repository
git clone https://github.com/your-org/ledger.git
cd ledger

# 2. Create a virtual environment (Recommended)
python -m venv .venv
source .venv/bin/activate

# 3. Install in editable mode
# Includes 'numpy', 'healpy', 'tqdm'
pip install -e ".[dev]"

5. Quick Start (Simulation Mode)

If you cannot download the 2TB Planck archive, use the Simulation Mode. This generates synthetic "Standard Model" skies and "Dipole" scan strategies to verify the pipeline logic.

  1. Generate Mock Data: Creates synthetic FITS files in data/raw/.
python scripts/generate_mock_data.py
  2. Build the Context Template: Extracts the "Cost Vector" (c) from the mock scan strategy.
python scripts/01_context_builder.py --nside 512
  3. Audit the Ledger: Cross-correlates the Mock Record with the Mock Context.
python scripts/02_planck_p1_audit.py --nside 512
# Expected: S_gamma ~ 0.0 (Null Result)
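
For orientation, here is a minimal sketch of the kind of record-context cross-correlation the audit step performs. The statistic name S_gamma is taken from the expected output above; the map construction and normalization are illustrative assumptions, not the implementation in 02_planck_p1_audit.py.

import numpy as np
import healpy as hp

nside = 512
npix = hp.nside2npix(nside)
rng = np.random.default_rng(42)

# Mock "record": a Gaussian sky drawn from a flat fiducial spectrum.
record = hp.synfast(np.ones(3 * nside), nside)

# Mock "context": a dipole-modulated hit-count map with small scan noise,
# standing in for the template produced by 01_context_builder.py.
_, _, z = hp.pix2vec(nside, np.arange(npix))
context = 1.0 + 0.5 * z + 0.05 * rng.normal(size=npix)

# Normalized cross-spectrum, averaged over multipoles: ~0 if the record
# and the context are uncoupled, i.e. the null result quoted above.
cl_cross = hp.anafast(record, context)
cl_rec, cl_ctx = hp.anafast(record), hp.anafast(context)
s_gamma = np.mean(cl_cross[2:] / np.sqrt(cl_rec[2:] * cl_ctx[2:]))
print(f"S_gamma ~ {s_gamma:+.3f}")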

6. Real Data Workflow (Discovery Mode)

To run the actual scientific analysis on Planck 2018/PR4 data:

  1. Download Data: Fetches the necessary Frequency and Component maps (requires ~2GB).
./scripts/download_planck.sh
  2. Run High-Res Analysis:
# Step 1: Build the Context (NSIDE 2048)
python scripts/01_context_builder.py --nside 2048

# Step 2: Audit the Record
python scripts/02_planck_p1_audit.py --nside 2048
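
Once the maps are on disk, a quick healpy sanity check that a downloaded product is readable and at the expected resolution might look like the following. The filename is illustrative only; the actual product names depend on what download_planck.sh fetches.

import healpy as hp

# Illustrative path; substitute the product written by download_planck.sh.
m = hp.read_map("data/raw/planck_143GHz_map.fits", field=0)
print(f"NSIDE {hp.get_nside(m)}, {m.size} pixels")

# Degrade (or upgrade) to the analysis resolution before building the context.
m_2048 = hp.ud_grade(m, nside_out=2048)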

7. Validation & Safety

We adhere to a strict "Stop-Loss" protocol. Before claiming a result, the pipeline must pass Null (Bias) and Injection (Sensitivity) tests.

  • Null Test: Verify S_gamma is consistent with 0 on random skies.
pytest tests/science/test_null.py
  • Sensitivity Test: Verify we can recover a hidden signal (λ=0.05).
python scripts/run_injection.py --lambda-inj 0.05 --nsims 100
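
As a toy version of that sensitivity check (a sketch only, not the logic inside run_injection.py): inject a context-coupled component with known amplitude λ into otherwise random skies and confirm a simple estimator recovers it.

import numpy as np
import healpy as hp

nside, nsims, lam = 128, 20, 0.05
npix = hp.nside2npix(nside)
_, _, z = hp.pix2vec(nside, np.arange(npix))
context = 1.0 + 0.5 * z                                    # stand-in scan-strategy template
context = (context - context.mean()) / context.std()

recovered = []
for seed in range(nsims):
    sky = np.random.default_rng(seed).normal(size=npix)    # uncoupled random sky
    sky_inj = sky + lam * context                          # inject the coupling
    recovered.append(np.dot(sky_inj, context) / np.dot(context, context))

print(f"injected {lam}, recovered {np.mean(recovered):.3f} ± {np.std(recovered):.3f}")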

License

MIT
