Home
WIPAR (Widefield Imaging Pipeline for Analysis and Regression) is a data pipeline for processing and analysing task-specific (widefield) calcium imaging data through neural decoding. Here, calcium activity serves as a proxy for neuronal activation. It provides stand-alone functionality to visualize the analysis results and also allows exporting the processed data for other visualization purposes.
- Brain Alignment (between different sessions and subjects)
  - Automatic registration with the novel MesoNet (under development)
- Different data-driven & anatomical parcellations
  - Including the novel locaNMF to obtain interpretable, data-driven brain sub-regions
- Different brain connectivity measurements
  - Functional connectivity (statistical relationships)
  - Effective connectivity (estimated causal influence)
    - The novel MOU-EC fits a multivariate Ornstein-Uhlenbeck process as a generative network model (see the sketch after this list)
    - Can be constrained by structural connectivity (under development)
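As a rough sketch (illustrative notation, not necessarily WIPAR's exact formulation), a multivariate Ornstein-Uhlenbeck process models the activity $x_i(t)$ of each parcellated region as

$$
\mathrm{d}x_i(t) = \left( -\frac{x_i(t)}{\tau_x} + \sum_{j \neq i} C_{ij}\, x_j(t) \right)\mathrm{d}t + \mathrm{d}B_i(t)
$$

where $\tau_x$ is a relaxation time constant, $C_{ij}$ are the effective-connectivity weights estimated from the data, and $B_i(t)$ is a noise term.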
- Required Software
- Python >= 3.8 (and Python == 3.6 for using locaNMF)
- Anaconda
- Snakemake
- Clone this repository
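A minimal setup sketch, assuming the standard Snakemake installation into a dedicated conda environment (as recommended by the Snakemake documentation); the repository URL is a placeholder:

```bash
# Install Snakemake into its own conda environment (standard procedure via mamba)
conda install -n base -c conda-forge mamba
mamba create -c conda-forge -c bioconda -n snakemake snakemake

# Clone this repository (replace <repository-url> with the actual URL)
git clone <repository-url>
cd <repository-name>
```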
- Required Files
- Put the experimental data into `repository-path/resources/experiment/"mouse-id"/"experiment-date"/`, with "mouse-id" and "experiment-date" being provided by you, e.g. `../repository/resources/experiment/GN06/2021-01-20_10-15-16` (see the sketch at the end of this section)
- Have a look at the Troubleshooting page if you encounter problems during the setup
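For example, for the session shown above, the expected directory can be created and filled like this (a sketch; the source path and file names depend on your acquisition setup):

```bash
# Create the expected per-session directory (mouse-id = GN06, experiment-date = 2021-01-20_10-15-16)
mkdir -p resources/experiment/GN06/2021-01-20_10-15-16

# Copy the raw recordings of that session into it (source path is a placeholder)
cp /path/to/raw/recordings/* resources/experiment/GN06/2021-01-20_10-15-16/
```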
All commands assume you followed the default install procedure for Snakemake within a conda virtual environment
- Activate the conda virtual environment: `conda activate snakemake`
- Customize config file (or use default config to test pipeline installation)
- Detailed description can be found here
- Run the pipeline with `snakemake -j4 --use-conda "rule"`, with `"rule"` being replaced by one of the following (example invocations below):
  - `test_installation` | covers all processing steps
  - `decoding_performance` | performs neural decoding with the full feature space and plots results across all features and parcellations
  - `reduce_biomarkers` | performs recursive feature elimination to select the most discriminative features and visualizes them in an interactive glassbrain plot
- Parameters (like the number of cores, overwrite flags, ...) are detailed here
- For usage within a cluster environment (SLURM) refer to this page
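For instance (a usage sketch; adjust the number of cores passed to `-j` to your machine):

```bash
# Verify the installation by running all processing steps
snakemake -j4 --use-conda test_installation

# Run neural decoding on the full feature space and plot the results
snakemake -j4 --use-conda decoding_performance

# Select and visualize the most discriminative features (biomarkers)
snakemake -j4 --use-conda reduce_biomarkers
```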
- Decoding performance plotted...
  - ...for each feature and decoder individually
  - ...across all features and decoders for each parcellation
  - ...across all parcellations and decoders for each feature
- Interactive glassbrain plot of biomarkers (Example)
  - Visualizes the neural population of each spatial component
  - Shows discriminative brain connections (biomarkers) annotated with...
    - ...neighbourhoods / potential sub-circuits
    - ...corresponding average weight in the decoder (under development)
    - ...corresponding rank from recursive feature elimination (under development)
The pipeline produces and can export the following processed data:
- Aligned calcium activity
- Parcellated calcium activity
- Calculated features (including brain connectivity measurements)
- Trained decoders
- Decoder accuracy on test sets
- Selected biomarkers
(▶) Running & configuring the pipeline (for Users)
(🛠) Extending pipeline functions (for Developers)

├── ci_lib (🛠) | custom Python package containing all custom functions for the pipeline steps
├── workflow (🛠) | Snakemake logic like rules, envs and entry scripts
├── config (▶) | config files for pipeline runs
├── resources (▶) | experimental data and brain atlases
├── results (▶) | processed data and plots
└── SLURM | batch files to run on computational clusters
- Unification | Loading various data formats into the standardized `DecompData` class
- Registration | Aligning the imaging data of different sessions and subjects
- Parcellation | Decomposes the pixel-wise activity data into activation levels (Temporal Components) of neural populations (Spatial Components) (intuitive explanation; see the corresponding page for more details, and the sketch after this list)
- Trial & Condition Extraction |
- Feature Calculation | Obtains features like Mean, FC, EC, ... based on the Temporal Components
- Neural Decoding |
- Recursive Feature Elimination |
- Visualization | Produces the above-mentioned plots
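As a rough sketch of the parcellation and feature steps (illustrative notation; the choice of Pearson correlation for FC is an assumption, not necessarily WIPAR's exact definition): the pixel-wise activity $y(p, t)$ is approximated by a sum of spatial components $a_k(p)$ weighted by temporal components $h_k(t)$, and functional connectivity can then be computed from the temporal components, e.g.

$$
y(p, t) \approx \sum_{k=1}^{K} h_k(t)\, a_k(p),
\qquad
\mathrm{FC}_{ij} = \mathrm{corr}\left(h_i, h_j\right)
$$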
- Loading & Preprocessing
- Parcellation
- Filtering
- Condition Extraction
- Feature Calculation
- Recursive Feature Elimination
- Decoding
- Plotting