Matthias' HPC Pipelines

This repository is likely only useful as a reference for most people. It contains various pipelines for fully automatic data analysis on an HPC cluster running Slurm.

It has pipelines for:

  • Miniscope image analysis using Minian
  • DeepLabCut training
  • DeepLabCut inference
  • Older pipelines for Miniscope analysis using MIN1PIPE and CaImAn

If you are using Syntalos for data acquisition, the tooling in this repository can create the necessary Slurm job configuration fully automatically and execute it on the cluster.

You can use JobTool/create-jobs-for-task.py for that purpose.
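For orientation, each scheduled job ultimately boils down to a Slurm batch script submitted with sbatch. The sketch below is only a hypothetical illustration of what such a generated script might look like; the partition name, resource limits, environment name, and analysis invocation are placeholders, not the job tool's actual output.

    #!/bin/bash
    # Hypothetical sketch of a generated Slurm batch script.
    # Partition, resources, environment name, and paths are placeholders.
    #SBATCH --job-name=minian-analyze
    #SBATCH --partition=single
    #SBATCH --ntasks=1
    #SBATCH --cpus-per-task=16
    #SBATCH --mem=64G
    #SBATCH --time=12:00:00
    #SBATCH --mail-type=END,FAIL
    #SBATCH --mail-user=youremail@institution.com

    # Make conda from the workspace available and activate the analysis environment
    source "$(ws_find conda)/conda/etc/profile.d/conda.sh"
    conda activate minian

    # Run the analysis for a single dataset (placeholder invocation)
    python analyze_dataset.py --input "$HOME/Data/<directory-with-edl-dirs>"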

Setup

This tool wasn't originally intended for use by others, so documentation is a bit sparse. In brief, to get it working on the bwHPC Helix cluster (adjust for your particular HPC installation):

  1. Log into the entry node

  2. Clone this repository into a directory next to your data. For example, if the clone is called HPC, the layout may look like this:

    ├── Data
    ├── DLCProjects
    └── HPC
    

    This directory may lie on the HPC's data storage system.

  3. Set up the workspaces initially by running <makhpcpath>/Setup/setup-all.sh youremail@institution.com, replacing the email address with a valid institutional address so you are notified when the workspace is about to expire (this repository also contains a script to extend it).

  4. Update your .bashrc for convenience: run nano ~/.bashrc, scroll to the bottom of the file, add the line source $( ws_find conda )/conda/etc/profile.d/conda.sh, and save the file. Then log out of the cluster and log back in.

  5. Run ./<makhpcpath>/Tools/clone-all.sh

  6. Run:

    cd ~
    ln -s <makhpcpath>/JobTool/create-jobs-for-task.py create-jobs-for-task

This way, immediately after login, you can run ./create-jobs-for-task to schedule tasks. A quick sanity check of the setup is sketched below.
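The following commands are a minimal sanity check, assuming the workspace created by setup-all.sh is named conda; the --help flag is an assumption about the job tool and may not exist in this exact form.

    ws_find conda                   # should print the path of the conda workspace
    conda --version                 # conda is on PATH once ~/.bashrc sources its profile script
    ./create-jobs-for-task --help   # assumed: the job tool prints usage information for --help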

Run analysis

To run a Minian analysis task for EDL directories generated by Syntalos, you can just run: ./create-jobs-for-task minian-analyze -d Data/<directory-with-edl-dirs>

The data path after -d is relative to the directory that contains the utilities directory. Data will not be analyzed twice: once a dataset has been analyzed, no second job will be scheduled for it. You can limit the animals/experiments by passing pattern flags to the job tool command.
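As an illustration (the directory name 2023-06_cohort1 is hypothetical), scheduling jobs for everything under a single data directory from your home directory would look like this:

    cd ~
    ./create-jobs-for-task minian-analyze -d Data/2023-06_cohort1   # hypothetical directory name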

You can view running jobs with squeue -l.
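A few generic Slurm commands (not specific to this repository) are useful for keeping an eye on scheduled work:

    squeue -l -u "$USER"        # show only your own jobs, in long format
    scontrol show job <jobid>   # inspect the full configuration of one job
    scancel <jobid>             # cancel a job that should not run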
