Mapping rate no longer reported by any pipeline #85
I now usually use the MultiQC stats as my mapping rate, running MultiQC via:
Which mapper were you using? I suspect the outputs of our logs do not match the required input for some of our mappers in MultiQC. I know this is the case for Salmon in transdiffexpres, and I think also for STAR in mapping. I think it is due to the way we redirect the outputs to logs.
We mostly use STAR, Salmon and BWA. BWA isn't even supported by MultiQC, mostly because I don't think it outputs a log file of any sort.
For STAR: the output of our STAR mapping produces this. When I run the pipeline for STAR, it generates the appropriate output for both Bowtie and STAR (I'm using our pipeline test data), but obviously not BWA. The reason they don't support BWA is that its logs don't produce anything worth parsing, so their idea was to rely on downstream tools. See: MultiQC/MultiQC#162. Did your mapper fail, or is there something else that prevented logs being output from STAR?
The particular example here is BWA (which is probably the mapper we use the most - we do most RNA-seq with Salmon these days). We used to actually calculate the mapping rate rather than rely on logs.
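Since BWA writes no parseable log, the mapping rate has to come from the alignments themselves. A minimal sketch of one way to do that: parse the text emitted by `samtools flagstat` and recompute the percentage from the raw counts (the function name `mapping_rate` and the decision to count QC-passed plus QC-failed reads together are my assumptions, not anything from the pipeline code).

```python
import re

def mapping_rate(flagstat_text):
    """Compute % mapped reads from `samtools flagstat` text output.

    Recomputes the percentage from the 'in total' and 'mapped' count
    lines rather than trusting the percentage samtools prints, so it
    also works when that field is 'N/A'. Counting scheme (passed +
    failed reads together) is an assumption for this sketch.
    """
    total = mapped = None
    for line in flagstat_text.splitlines():
        m = re.match(r"(\d+) \+ (\d+) in total", line)
        if m:
            total = int(m.group(1)) + int(m.group(2))
        m = re.match(r"(\d+) \+ (\d+) mapped", line)
        if m and mapped is None:   # take the first 'mapped' line only
            mapped = int(m.group(1)) + int(m.group(2))
    if not total:
        raise ValueError("no 'in total' line found in flagstat output")
    return 100.0 * mapped / total
```

This keeps the statistic mapper-independent: any tool that produces a BAM gets the same number, regardless of what (if anything) it logs.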
Since the mapping pipeline was split into mapping and bamstats, as far as I can tell no pipeline now reports very basic statistics about mapped files, such as % mapping rate, % spliced reads etc.
By preference I think that the mapping pipeline should report this for two reasons:
I will try to have a look at this and the mapping tuples/compression option thing #80 this week if I find time.
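For the % spliced reads statistic mentioned above, one self-contained way to compute it is from the CIGAR strings of the alignments: a record counts as spliced when its CIGAR contains an `N` (skipped region, i.e. an intron in RNA-seq). A hedged sketch over SAM text lines (the function name `splice_stats` and the SAM-text input, rather than pysam over a BAM, are assumptions made to keep the example dependency-free):

```python
def splice_stats(sam_lines):
    """Count mapped and spliced alignments from SAM body lines.

    A record is spliced when its CIGAR string contains an 'N'
    operation (skipped region / intron). Header lines and unmapped
    records (flag 0x4 set or CIGAR '*') are ignored. 'N' can only
    appear as an operation letter, so a plain substring test is safe.
    """
    mapped = spliced = 0
    for line in sam_lines:
        if line.startswith("@"):        # header line
            continue
        fields = line.rstrip("\n").split("\t")
        flag, cigar = int(fields[1]), fields[5]
        if flag & 0x4 or cigar == "*":  # unmapped record
            continue
        mapped += 1
        if "N" in cigar:
            spliced += 1
    return mapped, spliced
```

In a real pipeline the same loop would run over `pysam.AlignmentFile` records, but the logic (flag test, then CIGAR `N` test) is identical.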