
Commit

Merge pull request #517 from SPPearce/crukmi
Update CRUKMI config
SPPearce committed Jun 29, 2023
2 parents 02a93a4 + f817111 commit 3248fa4
Showing 4 changed files with 11 additions and 30 deletions.
1 change: 0 additions & 1 deletion README.md
```diff
@@ -245,7 +245,6 @@ Currently documentation is available for the following pipelines within specific
 - rnavar
   - [MUNIN](docs/pipeline/rnavar/munin.md)
 - sarek
-  - [Cancer Research UK Manchester Institute](docs/pipeline/sarek/crukmi.md)
   - [EVA](docs/pipeline/sarek/eva.md)
   - [MUNIN](docs/pipeline/sarek/munin.md)
   - [SBC_SHARC](docs/pipeline/sarek/sbc_sharc.md)
```
14 changes: 7 additions & 7 deletions conf/crukmi.config
```diff
@@ -12,8 +12,8 @@ singularity {
 }
 
 process {
-    beforeScript = 'module load apps/singularity/3.8.0'
-    executor = 'pbs'
+    beforeScript = 'module load apps/apptainer/1.0.0'
+    executor = 'slurm'
 
     errorStrategy = {task.exitStatus in [143,137,104,134,139,140] ? 'retry' : 'finish'}
     maxErrors = '-1'
@@ -35,20 +35,20 @@
     }
 
     withLabel:process_high {
-        cpus = { check_max( 16 * task.attempt, 'cpus' ) }
-        memory = { check_max( 80.GB * task.attempt, 'memory' ) }
+        cpus = { check_max( 48 * task.attempt, 'cpus' ) }
+        memory = { check_max( 256.GB * task.attempt, 'memory' ) }
     }
 
 }
 
 executor {
-    name = 'pbs'
+    name = 'slurm'
     queueSize = 1000
     pollInterval = '10 sec'
 }
 
 params {
-    max_memory = 2000.GB
-    max_cpus = 32
+    max_memory = 4000.GB
+    max_cpus = 96
     max_time = 72.h
 }
```
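
The `max_cpus`, `max_memory` and `max_time` values in the `params` block are the site-wide caps that `check_max` applies to every per-process resource request; individual runs can lower them on the command line. A minimal sketch, assuming an nf-core pipeline such as sarek (the pipeline name and the override values below are illustrative, not part of this commit):

```bash
# Illustrative only: lower the per-process caps below the profile defaults
# (max_cpus = 96, max_memory = 4000.GB, max_time = 72.h) for a smaller run.
nextflow run nf-core/sarek \
    -profile crukmi \
    --max_cpus 16 \
    --max_memory '64.GB' \
    --max_time '24.h'
```
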
18 changes: 0 additions & 18 deletions conf/pipeline/sarek/crukmi.config

This file was deleted.

8 changes: 4 additions & 4 deletions docs/crukmi.md
````diff
@@ -1,15 +1,15 @@
 # nf-core/configs: Cancer Research UK Manchester Institute Configuration
 
-All nf-core pipelines have been successfully configured for the use on the HPC (phoenix) at Cancer Research UK Manchester Institute.
+All nf-core pipelines have been successfully configured for the use on the HPC (griffin) at Cancer Research UK Manchester Institute.
 
-To use, run the pipeline with `-profile crukmi`. This will download and launch the [`crukmi.config`](../conf/crukmi.config) which has been pre-configured with a setup suitable for the phoenix HPC. Using this profile, singularity images will be downloaded to run on the cluster.
+To use, run the pipeline with `-profile crukmi`. This will download and launch the [`crukmi.config`](../conf/crukmi.config) which has been pre-configured with a setup suitable for the griffin HPC. Using this profile, singularity images will be downloaded to run on the cluster and stored in a centralised location.
 
 Before running the pipeline you will need to load Nextflow using the environment module system, for example via:
 
 ```bash
-## Load Nextflow and Singularity environment modules
+## Load Nextflow environment modules
 module purge
 module load apps/nextflow/22.04.5
 ```
 
-The pipeline should always be executed inside a workspace on the `/scratch/` system. All of the intermediate files required to run the pipeline will be stored in the `work/` directory. It is recommended to delete this directory after the pipeline has finished successfully because it can get quite large, and all of the main output files will be saved in the `results/` directory.
+The pipeline should always be executed inside a workspace on the `/scratch/` system. All of the intermediate files required to run the pipeline will be stored in the `work/` directory. It is recommended to delete this directory after the pipeline has finished successfully because it can get quite large.
````
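
Taken together, a full session on the cluster might look like the sketch below; the workspace path, pipeline and parameters are placeholders rather than anything mandated by this profile:

```bash
# Hedged end-to-end sketch: the workspace path, pipeline and parameters are
# placeholders, not part of the CRUK MI profile itself.
module purge
module load apps/nextflow/22.04.5

# Run inside a workspace on /scratch so the work/ directory lands there.
WORKSPACE=/scratch/my_workspace/my_project   # placeholder path
mkdir -p "$WORKSPACE" && cd "$WORKSPACE"

nextflow run nf-core/sarek \
    -profile crukmi \
    --input samplesheet.csv \
    --outdir results

# After a successful run, delete the intermediates in work/ to free space;
# the pipeline outputs are written to the --outdir location (results/).
rm -rf work/
```
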
