
WIPAR (Widefield Imaging Pipeline for Analysis and Regression) is a data pipeline for processing and analysing task-specific (widefield) calcium imaging data through neural decoding. Here, calcium activity serves as a proxy for neuronal activation. WIPAR provides stand-alone functionality for visualizing the analysis results and also allows exporting processed data for use in other visualization tools.

Feature Overview

  • Brain Alignment (between different sessions and subjects)
    • Automatic registration with novel MesoNet (under development)
  • Different data-driven & anatomical parcellations
    • Including novel locaNMF to obtain interpretable, data-driven brain sub-regions
  • Different brain connectivity measurements
    • Functional connectivity (statistical relationship; a minimal sketch follows this list)
    • Effective connectivity (est. causal influence)
      • Novel MOU-EC fits a multivariate Ornstein-Uhlenbeck process as a generative network model
      • Can be constrained by structural connectivity (under development)
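
A minimal sketch of the first of these measures, using plain NumPy rather than WIPAR's actual API (the parcel time series are made up): functional connectivity can be estimated as the pairwise Pearson correlation between parcellated activity traces, whereas effective connectivity additionally requires fitting a generative model such as the MOU process to the same data.

```python
import numpy as np

# Hypothetical parcellated activity: n_timepoints x n_parcels
# (in WIPAR this would come out of the parcellation step).
rng = np.random.default_rng(0)
activity = rng.standard_normal((1000, 5))

# Functional connectivity: pairwise Pearson correlation between
# parcel time series, i.e. a purely statistical relationship.
fc = np.corrcoef(activity, rowvar=False)  # shape: (5, 5)
print(np.round(fc, 2))
```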

Installation

  • Required Software
  • Clone this repository
  • Required Files
    • Put the experimental data into repository-path/resources/experiment/"mouse-id"/"experiment-date"/, where "mouse-id" and "experiment-date" are chosen by you (see the sketch after this list)
      • e.g. ../repository/resources/experiment/GN06/2021-01-20_10-15-16
  • Have a look at the Trouble Shooting page if you encounter problems during the setup
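
As a small convenience sketch (plain Python, not part of WIPAR), the expected folder structure can be created before copying a recording in; the mouse id and session date below are the ones from the example above, and repository-path stands for wherever you cloned the repository.

```python
from pathlib import Path

# "repository-path" is a placeholder for your local clone of the repository.
repository = Path("repository-path")

# Values taken from the example above; replace them with your own recording.
mouse_id = "GN06"
experiment_date = "2021-01-20_10-15-16"

# WIPAR expects raw data under resources/experiment/<mouse-id>/<experiment-date>/
session_dir = repository / "resources" / "experiment" / mouse_id / experiment_date
session_dir.mkdir(parents=True, exist_ok=True)
print(f"Copy the recording into: {session_dir}")
```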

Pipeline Usage (test installation)

All commands assume you followed the default installation procedure for Snakemake within a conda virtual environment.

  1. Activate conda virtual environment
    • conda activate snakemake
  2. Customize config file (or use default config to test pipeline installation)
    • Detailed description can be found here
  3. Run pipeline
    • snakemake -j4 --use-conda "rule", with "rule" being replaced by one of the rules below (a scripted way to launch this command is sketched after this section):
      • test_installation covers all processing steps
      • decoding_performance performs neural decoding with full feature space and plots results across all features and parcellations
      • reduce_biomarkers performs recursive feature elimination to select most discriminative features and visualizes them in an interactive glassbrain plot
    • Parameters (such as the number of cores, overwrite flags, ...) are detailed here
  • For usage within a cluster environment (SLURM), refer to this page
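
If you prefer launching runs from a script instead of the shell, the documented command can be wrapped with Python's subprocess module. This is only a sketch around the command shown above; the rule name and core count are the defaults from the list.

```python
import subprocess

# Equivalent to running: snakemake -j4 --use-conda test_installation
# Run from the repository root with the "snakemake" conda environment active.
rule = "test_installation"
subprocess.run(["snakemake", "-j4", "--use-conda", rule], check=True)
```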

Results

  • Visualizations (for direct interpretation)

    • Decoding performance plotted...
      • ...for each feature and decoder individually
      • ...across all features and decoders for each parcellation
      • ...across all parcellations and decoders for each feature
    • Interactive glassbrain plot of biomarkers (Example)
      • Visualizes the neural population of each spatial component
      • Shows discriminative brain connections (biomarkers) annotated with...
        • ...neighbourhoods / potential sub-circuits
        • ...corresponding average weight in decoder (under development)
        • ...corresponding rank from recursive feature elimination (under development)
  • Processed Data (for further processing/visualizing outside of the pipeline; see the sketch after this list)

    • Aligned calcium activity
    • Parcellated calcium activity
    • Calculated features (including brain connectivity measurements)
    • Trained decoders
    • Decoder accuracy on test sets
    • Selected biomarkers
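
The exact locations and file formats of these outputs are described on the respective wiki pages. As a purely hypothetical example (the file name, format and shape below are made up), further analysis outside the pipeline could start from an exported feature matrix like this:

```python
import numpy as np

# Hypothetical path and format -- check the wiki for the actual location
# and format of WIPAR's exported results.
features = np.load("results/example_feature_matrix.npy")  # trials x features

print("trials x features:", features.shape)
print("per-feature mean:", features.mean(axis=0))
```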

Development

  • Architecture

Pipeline Architecture
  • (▶) Running & configuring the pipeline (for Users)
  • (🛠) Extending pipeline functions (for Developers)

    .
    ├── ci_lib (🛠) | custom Python package containing the functions used by the pipeline steps
    ├── workflow (🛠) | Snakemake logic like rules, envs and entry scripts
    ├── config (▶) | config files for pipeline runs
    ├── resources (▶) | experimental data and brain atlases
    ├── results (▶) | processed data and plots
    └── SLURM | batch files to run on computational clusters

    • Unification | Loading various data formats into the standardized DecompData class
    • Registration | Aligning the imaging data of different sessions and subjects
    • Parcellation | Decomposes the pixel-wise activity data into activation levels (Temporal Components) of neural populations (Spatial Components) (for an intuitive explanation and more details, see the corresponding page; a minimal sketch follows this list)
    • Trial & Condition Extraction | Splits the recordings into trials and groups them by experimental condition
    • Feature Calculation | Obtains features like Mean, FC, EC, ... based on the Temporal Components
    • Neural Decoding | Trains decoders to predict the experimental condition from the calculated features
    • Recursive Feature Elimination | Selects the most discriminative features (biomarkers)
    • Visualization | Produces the plots mentioned above
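
As a rough illustration of the decomposition idea behind the Parcellation step (using a plain truncated SVD instead of WIPAR's anatomical or locaNMF parcellations; the recording is made up):

```python
import numpy as np

# Hypothetical widefield recording: n_frames x n_pixels
rng = np.random.default_rng(0)
frames = rng.standard_normal((500, 64 * 64))

# Truncated SVD: frames ≈ temporal @ spatial
#   temporal: activation level of each component over time (n_frames x k)
#   spatial:  contribution of each pixel to a component    (k x n_pixels)
k = 10
u, s, vt = np.linalg.svd(frames, full_matrices=False)
temporal = u[:, :k] * s[:k]
spatial = vt[:k, :]

# Relative error of the low-rank reconstruction
error = np.linalg.norm(frames - temporal @ spatial) / np.linalg.norm(frames)
print(f"relative reconstruction error with {k} components: {error:.2f}")
```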

For Users

For Devs

    • Loading & Preprocessing
    • Parcellation
    • Filtering
    • Condition Extraction
    • Feature Calculation
    • Recursive Feature Elimination
    • Decoding
    • Plotting