
Segmentation fault error #9

@blasarre

Description

Hello, and thanks for developing this tool. I installed LACA on my university HPC following the "Installation from GitHub repository" instructions. Unfortunately, I keep hitting a segmentation fault, both when analyzing my own data and when running the test data you provide in raw.fastq.gz. I've included the log from the run with your test data below for reference. Do you have any insight into the cause of this error and suggestions for how to remedy it?

Thanks for your help,
Breah

2025-02-25 12:59:49,102 - root - INFO - LACA version: 0.3+2.g5fcefb2.dirty (laca.py:61)
2025-02-25 12:59:49,390 - root - DEBUG - Executing: snakemake all --directory '/lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/laca_test_2' --snakefile '/lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/envs/laca/laca/workflow/Snakefile' --configfile '/lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/laca_test_2/config.yaml' --use-conda --conda-prefix '/lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/laca_test_2/Database/conda_envs' --use-singularity --singularity-prefix '/lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/laca_test_2/Database/singularity_envs' --singularity-args '--bind /lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/envs/laca/laca/workflow/resources/guppy_barcoding/:/opt/ont/guppy/data/barcoding/,/lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/laca_test_2/Reads'  --rerun-triggers mtime --rerun-incomplete --scheduler greedy --jobs 128 --nolock   --resources mem=717 mem_mb=734877 java_mem=610       (laca.py:105)
Config file config.yaml is extended by additional config specified via the command line.
Building DAG of jobs...
Pulling singularity image docker://genomicpariscentre/guppy:3.3.3.
Creating conda environment ../envs/laca/laca/workflow/envs/cutadapt.yaml...
Downloading and installing remote packages.
Environment for /lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/envs/laca/laca/workflow/rules/../envs/cutadapt.yaml created (location: Database/conda_envs/e4b3be83aa2be3b1302cbdf4cab39043_)
Creating conda environment ../envs/laca/laca/workflow/envs/isONcorCon.yaml...
Downloading and installing remote packages.
Environment for /lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/envs/laca/laca/workflow/rules/../envs/isONcorCon.yaml created (location: Database/conda_envs/aefa100ed5426368adafb85ee09974d6_)
Creating conda environment ../envs/laca/laca/workflow/envs/mmseqs2.yaml...
Downloading and installing remote packages.
Environment for /lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/envs/laca/laca/workflow/rules/../envs/mmseqs2.yaml created (location: Database/conda_envs/d0b0384d02d03d94aee67487f393ebd1_)
Creating conda environment ../envs/laca/laca/workflow/envs/q2plugs.yaml...
Downloading and installing remote packages.
Environment for /lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/envs/laca/laca/workflow/rules/../envs/q2plugs.yaml created (location: Database/conda_envs/6eae29c2b8398a61e8ca33b90dd729dc_)
Using shell: /usr/bin/bash
Provided cores: 128
Rules claiming more threads will be scaled down.
Provided resources: mem=717, mem_mb=734877, java_mem=610
Job stats:
job                      count    min threads    max threads
---------------------  -------  -------------  -------------
all                          1              1              1
check_primers_repseqs        1              2              2
cls_isONclust                1              1              1
cls_kmerCon                  1              1              1
cls_meshclust                1              1              1
col_q2blast_batch            1              1              1
collect_consensus            1              1              1
combine_cls                  1              1              1
combine_fastq                1              1              1
count_matrix                 1              1              1
demux_check                  1              1              1
drep_consensus               1              6              6
exclude_empty_fqs            1              1              1
get_taxonomy                 1              1              1
get_tree                     1              1              1
guppy                        1              6              6
isONclust                    1              6              6
matrix_seqid                 1              1              1
q2_fasttree                  1              6              6
q2_repseqs                   1              1              1
q2export_tree                1              1              1
rename_drep_seqs             1              1              1
repseqs_split                1              1              1
total                       23              1              6

Select jobs to execute...

[Tue Feb 25 13:04:24 2025]
localrule guppy:
    output: demux_guppy
    log: logs/demultiplex/guppy.log
    jobid: 9
    benchmark: benchmarks/demultiplex/guppy.txt
    reason: Missing output files: demux_guppy
    threads: 6
    resources: tmpdir=/scratch/blasarre/6062851, mem=50

Activating singularity image /lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/laca_test_2/Database/singularity_envs/be79a9f6f5e87678ce46ad686c92cb19.simg
INFO:    Converting SIF file to temporary sandbox...
ONT Guppy barcoding software version 3.3.3+fa743a6
input path:         /lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/laca_test_2/Reads
save path:          demux_guppy
arrangement files:  barcode_arrs_16S-GXO192.cfg 
min. score front:   60
min. score rear:    60


Found 1 fastq files.

0%   10   20   30   40   50   60   70   80   90   100%
|----|----|----|----|----|----|----|----|----|----|
***************************************************
Done in 182432 ms.
INFO:    Cleaning up image...
[Tue Feb 25 13:07:28 2025]
Finished job 9.
1 of 23 steps (4%) done
Select jobs to execute...

[Tue Feb 25 13:07:28 2025]
localcheckpoint demux_check:
    input: demux_guppy
    output: demultiplexed
    log: logs/demultiplex/check.log
    jobid: 8
    benchmark: benchmarks/demultiplex/check.txt
    reason: Missing output files: demultiplexed; Input files updated by another job: demux_guppy
    resources: tmpdir=/scratch/blasarre/6062851
DAG of jobs will be updated after completion.

[Tue Feb 25 13:07:29 2025]
Finished job 8.
2 of 23 steps (9%) done
Creating conda environment ../envs/laca/laca/workflow/envs/yacrd.yaml...
Downloading and installing remote packages.
Environment for /lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/envs/laca/laca/workflow/rules/../envs/yacrd.yaml created (location: Database/conda_envs/0f356a0f79553ccf51ae2f747bef105c_)
Select jobs to execute...

[Tue Feb 25 13:07:31 2025]
localrule collect_fastq:
    input: demultiplexed/BRK13
    output: qc/BRK13.fastq
    log: logs/demultiplex/collect_fastq/BRK13.log
    jobid: 40
    benchmark: benchmarks/demultiplex/collect_fastq/BRK13.txt
    reason: Missing output files: qc/BRK13.fastq; Updated input files: demultiplexed/BRK13
    wildcards: barcode=BRK13
    resources: tmpdir=/scratch/blasarre/6062851

[Tue Feb 25 13:07:31 2025]
Finished job 40.
3 of 30 steps (10%) done
Select jobs to execute...

[Tue Feb 25 13:07:31 2025]
rule check_primers:
    input: qc/BRK13.fastq
    output: qc/primers_passed/BRK13F.fastq, qc/primers_unpassed/BRK13F.fastq
    log: logs/qc/check_primersF/BRK13.log
    jobid: 39
    benchmark: benchmarks/qc/check_primersF/BRK13.txt
    reason: Missing output files: qc/primers_passed/BRK13F.fastq, qc/primers_unpassed/BRK13F.fastq; Input files updated by another job: qc/BRK13.fastq
    wildcards: barcode=BRK13
    threads: 6
    resources: tmpdir=/scratch/blasarre/6062851, mem=10, time=1

Activating conda environment: Database/conda_envs/e4b3be83aa2be3b1302cbdf4cab39043_
[Tue Feb 25 13:07:34 2025]
Finished job 39.
4 of 30 steps (13%) done
Removing temporary output qc/BRK13.fastq.
Select jobs to execute...

[Tue Feb 25 13:07:34 2025]
rule check_primersR:
    input: qc/primers_unpassed/BRK13F.fastq
    output: qc/primers_passed/BRK13R.fastq, qc/primers_unpassed/BRK13.fastq
    log: logs/qc/check_primersR/BRK13.log
    jobid: 41
    benchmark: benchmarks/qc/check_primersR/BRK13.txt
    reason: Missing output files: qc/primers_passed/BRK13R.fastq; Input files updated by another job: qc/primers_unpassed/BRK13F.fastq
    wildcards: barcode=BRK13
    threads: 6
    resources: tmpdir=/scratch/blasarre/6062851, mem=10, time=1

Activating conda environment: Database/conda_envs/e4b3be83aa2be3b1302cbdf4cab39043_
[Tue Feb 25 13:07:35 2025]
Finished job 41.
5 of 30 steps (17%) done
Removing temporary output qc/primers_unpassed/BRK13F.fastq.
Removing temporary output qc/primers_unpassed/BRK13.fastq.
Select jobs to execute...

[Tue Feb 25 13:07:35 2025]
rule revcomp_fq_combine:
    input: qc/primers_passed/BRK13F.fastq, qc/primers_passed/BRK13R.fastq
    output: qc/primers_passed/BRK13R_revcomp.fastq, qc/primers_passed/BRK13.fastq
    jobid: 38
    reason: Missing output files: qc/primers_passed/BRK13.fastq; Input files updated by another job: qc/primers_passed/BRK13F.fastq, qc/primers_passed/BRK13R.fastq
    wildcards: barcode=BRK13
    threads: 2
    resources: tmpdir=/scratch/blasarre/6062851, mem=10, time=1

[Tue Feb 25 13:07:35 2025]
Finished job 38.
6 of 30 steps (20%) done
Removing temporary output qc/primers_passed/BRK13F.fastq.
Removing temporary output qc/primers_passed/BRK13R.fastq.
Removing temporary output qc/primers_passed/BRK13R_revcomp.fastq.
Select jobs to execute...

[Tue Feb 25 13:07:35 2025]
rule minimap2ava_yacrd:
    input: qc/primers_passed/BRK13.fastq
    output: qc/yacrd/BRK13.paf
    log: logs/qc/yacrd/BRK13_ava.log
    jobid: 42
    benchmark: benchmarks/qc/yacrd/BRK13_ava.txt
    reason: Missing output files: qc/yacrd/BRK13.paf; Input files updated by another job: qc/primers_passed/BRK13.fastq
    wildcards: barcode=BRK13
    threads: 6
    resources: tmpdir=/scratch/blasarre/6062851, mem=50, time=1

Activating conda environment: Database/conda_envs/0f356a0f79553ccf51ae2f747bef105c_
/usr/bin/bash: line 1: 580155 Segmentation fault      (core dumped) minimap2 -x ava-ont -g 500 -f 10000 -t 6 qc/primers_passed/BRK13.fastq qc/primers_passed/BRK13.fastq > qc/yacrd/BRK13.paf 2> logs/qc/yacrd/BRK13_ava.log
[Tue Feb 25 13:07:35 2025]
Error in rule minimap2ava_yacrd:
    jobid: 42
    input: qc/primers_passed/BRK13.fastq
    output: qc/yacrd/BRK13.paf
    log: logs/qc/yacrd/BRK13_ava.log (check log file(s) for error details)
    conda-env: /lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/laca_test_2/Database/conda_envs/0f356a0f79553ccf51ae2f747bef105c_
    shell:
        minimap2 -x ava-ont -g 500 -f 10000 -t 6 qc/primers_passed/BRK13.fastq qc/primers_passed/BRK13.fastq > qc/yacrd/BRK13.paf 2> logs/qc/yacrd/BRK13_ava.log
        (one of the commands exited with non-zero exit code; note that snakemake uses bash strict mode!)

Removing output files of failed job minimap2ava_yacrd since they might be corrupted:
qc/yacrd/BRK13.paf
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: .snakemake/log/2025-02-25T125949.996297.snakemake.log
2025-02-25 13:07:35,704 - root - CRITICAL - Command 'snakemake all --directory '/lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/laca_test_2' --snakefile '/lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/envs/laca/laca/workflow/Snakefile' --configfile '/lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/laca_test_2/config.yaml' --use-conda --conda-prefix '/lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/laca_test_2/Database/conda_envs' --use-singularity --singularity-prefix '/lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/laca_test_2/Database/singularity_envs' --singularity-args '--bind /lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/envs/laca/laca/workflow/resources/guppy_barcoding/:/opt/ont/guppy/data/barcoding/,/lustre/hdd/LAS/gbeattie-lab/blasarre/Microbiome_ONT/laca_test_2/Reads'  --rerun-triggers mtime --rerun-incomplete --scheduler greedy --jobs 128 --nolock   --resources mem=717 mem_mb=734877 java_mem=610      ' returned non-zero exit status 1. (laca.py:113)
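In case it helps with triage: the crash reduces to the single `minimap2 -x ava-ont -g 500 -f 10000 -t 6 ...` command shown in the shell block above, which I can re-run by hand inside the same conda environment to reproduce it outside Snakemake. One guess on my side (not a confirmed diagnosis) is a CPU/binary mismatch, since prebuilt minimap2 binaries use SIMD instructions that some older HPC compute nodes lack. Here is a small sketch I can run on the compute node to inspect its feature flags; `parse_cpu_flags` is a helper I wrote for this issue, not part of LACA or minimap2:

```python
# Sketch: check the compute node's CPU feature flags, since minimap2's
# SIMD code can segfault on CPUs lacking extensions a prebuilt binary
# assumes (my guess, not a confirmed diagnosis).

def parse_cpu_flags(cpuinfo_text):
    """Return the set of feature flags from /proc/cpuinfo-style text."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            return set(line.split(":", 1)[1].split())
    return set()

# On the node I would use: parse_cpu_flags(open("/proc/cpuinfo").read())
sample = "processor : 0\nflags : fpu sse sse2 ssse3 sse4_1\n"
flags = parse_cpu_flags(sample)
print("sse4_1" in flags)  # → True for this sample text
```

If flags like `sse4_1` turn out to be missing on the node where the job ran, that would point at the minimap2 build rather than at LACA itself.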
