Memory usage exceeds 100G for WGS gVCF (pgscatalog-match) #368
-
Hi Team, I am using a whole-genome GRCh38 gVCF as input. I configured a memory limit of 110G and 32 CPUs. The pgscatalog match step consumes the full 110G of memory and is then killed automatically, or crashes with the error below. The unzipped gVCF file is around 45G.
My question: is there a way to chunk the gVCF into smaller pieces and merge the results back afterwards? Can anyone help with this?
-
I would suggest splitting the data by chromosome if you haven't already (some relevant docs: https://pgsc-calc.readthedocs.io/en/latest/how-to/prepare.html), for instance by running a similar command per chromosome:
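The original command was not captured here, but a minimal sketch along those lines with bcftools might look like the following. The filenames are placeholders, and the gVCF must be bgzipped and indexed for region queries to work:

```bash
# Sketch: split a bgzipped, indexed gVCF into per-chromosome files.
# input.g.vcf.gz is a placeholder for your gVCF; index it first if needed:
#   bcftools index input.g.vcf.gz
for chr in chr{1..22} chrX; do
    # Extract one chromosome and write it as a bgzipped VCF
    bcftools view -r "$chr" input.g.vcf.gz -Oz -o "${chr}.g.vcf.gz"
    # Index the per-chromosome file so downstream tools can query it
    bcftools index "${chr}.g.vcf.gz"
done
```

Each per-chromosome file can then get its own row in the pgsc-calc samplesheet, and the pipeline combines results across chromosomes; the prepare guide linked above describes the expected samplesheet layout.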
-
Thanks for the details, that was useful.