Severe read loss after filterAndTrim in DADA2

Hi,
I have a problem filtering my data in DADA2. First, I removed the primers with cutadapt.
Then, in the DADA2 filterAndTrim step with truncLen = c(245, 240), I lose almost all of my reads and I can't figure out why.

Reads: V3-V4 region, amplified with the 341F/802R primer pair and sequenced with 2 x 250 bp paired-end reads. So the expected overlap is 500 - (802 - 341) = 39 bp.
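Spelled out in R with the numbers above (and with the truncLen values from the command below applied on top):

# Expected overlap, using the primer positions and read length above
amplicon_span <- 802 - 341        # ~461 bp between the 341F and 802R primer positions
read_length   <- 250              # per read, 2 x 250 bp paired-end run
2 * read_length - amplicon_span   # 500 - 461 = 39 bp overlap with full-length reads

# With truncLen = c(245, 240) the two reads only cover 245 + 240 = 485 bp,
# so the remaining overlap is 485 - 461 = 24 bp
(245 + 240) - amplicon_span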

My command:

out <- filterAndTrim(fnFs, filtFs, fnRs, filtRs, truncLen=c(245,240),
                     maxN=0, maxEE=c(1,1), truncQ=2, rm.phix=TRUE,
                     compress=TRUE, multithread=TRUE)
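For reference, this is how I would double-check those truncation lengths against the data, assuming fnFs and fnRs point to the primer-trimmed (cutadapt output) fastq files; plotQualityProfile is from dada2 and readFastq/width are from the ShortRead package:

library(dada2)
library(ShortRead)

# Per-cycle quality profiles of the first two samples, forward and reverse
plotQualityProfile(fnFs[1:2])
plotQualityProfile(fnRs[1:2])

# Read-length distribution after primer removal; filterAndTrim discards
# any read shorter than its truncLen value (245 forward, 240 reverse here)
summary(width(readFastq(fnFs[1])))
summary(width(readFastq(fnRs[1])))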

This is the result after running the command:
                        reads.in reads.out
SRR5234517_1.fastq.gz     218448      1785
SRR5234518_1.fastq.gz     160615      1365
SRR5234519_1.fastq.gz     196344      1962
SRR5234520_1.fastq.gz     230331      2008
SRR5234521_1.fastq.gz     233277      1289
SRR5234522_1.fastq.gz     226978      1474
SRR5234523_1.fastq.gz     236371      1738
SRR5234524_1.fastq.gz     210106      1164
SRR5234525_1.fastq.gz     210090      1160
SRR5234526_1.fastq.gz     147470      1460
SRR5234527_1.fastq.gz      94833       416
SRR5234528_1.fastq.gz     111036       680
SRR5234529_1.fastq.gz      79993      1111
SRR5234530_1.fastq.gz     210188      1955
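One diagnostic I could run to narrow this down is the same filter with truncLen left at its default of 0 (no truncation), so that reads shorter than 245/240 bp are no longer thrown away. This is only a sketch, and in practice the output should go to a separate directory so the filtered files above are not overwritten:

# Same filter without truncation, to see how much of the loss is
# explained by reads being shorter than truncLen after primer removal
out_check <- filterAndTrim(fnFs, filtFs, fnRs, filtRs,
                           maxN=0, maxEE=c(1,1), truncQ=2, rm.phix=TRUE,
                           compress=TRUE, multithread=TRUE)
head(out_check)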

Hi @SaharB! I moved this to the “Other Bioinformatics Tools” section of the forum, since this question is not related to QIIME 2. I suggest you get in touch with the DADA2 team via their official channels. Thanks!

Thanks a lot @thermokarst
