I am trying to apply the IdentityByState pipeline to my variant data, but it reliably (n=4) fails with a write error.

Error message:

Command:

I'm trying right now with the "--hasNonVariantSegments" flag removed, but this is data generated from gVCF files, after processing with the non-variant-segment-transformer, so it should have non-variant segments.

I'm not really sure what this means or how I can go about debugging it. Any ideas are greatly appreciated!

I recommend clicking through to the detailed logs and seeing if there is any more info there. The most likely issue is an OutOfMemory exception somewhere, because the merge operation needs to co-locate all data for a contiguous genomic region on a single machine to perform the merge. If that is the issue, I recommend trying highmem machines. If you still see an OOM, then try smaller genomic regions by decreasing the value of --binSize.
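To make that advice concrete, here is a rough sketch of what a rerun along those lines could look like. Only --binSize and --hasNonVariantSegments appear in this thread; the main class name, the jar name, the --workerMachineType Dataflow worker option, and all placeholder values are assumptions for illustration, not a verified command:

    # Hypothetical rerun with highmem workers and a smaller merge bin.
    # Only --binSize and --hasNonVariantSegments come from this thread;
    # the class name, jar name, and other flags/values are assumed.
    java -cp target/google-genomics-dataflow-*-runnable.jar \
      com.google.cloud.genomics.dataflow.pipelines.IdentityByState \
      --project=<your-project> \
      --stagingLocation=gs://<your-bucket>/staging \
      --output=gs://<your-bucket>/ibs/output.tsv \
      --workerMachineType=n1-highmem-8 \
      --hasNonVariantSegments \
      --binSize=<smaller-than-your-failing-run>

The two knobs the reply points at are the worker machine type (more memory per worker) and --binSize (smaller contiguous regions, so less data has to be co-located on any one machine for the merge).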