DADA2 Error: No reads passed filter

I keep receiving an error from the DADA2 plugin suggesting that no reads are able to pass through the filter, but when looking at the quality plots, everything appears fairly normal. There are around 90 samples with an average of 128,489 reads per sample.

I have read the two other posts related to this error, where it was recommended to try changing the truncation length, but this did not work for me. The forward and reverse reads are about 300 bp long, and I tried truncating them at 250, 190, and 100 to no avail. The reads appear to be fairly good quality, so I'm not sure why this is happening. I've run this command successfully before on a few different data sets with similar-looking quality plots.

It was also suggested to try using denoise-single instead of denoise-paired. This didn't work for me either, and I received the same error again. (The error posted below is actually what was returned when I used denoise-single.)

I'm running this through qiime2-2018.11 on the CyVerse Discovery Environment in a Jupyter Notebook, and all commands leading up to the DADA2 denoise step have worked.
CyVerse suggested using Deblur instead of DADA2; this also didn't work and another error was raised. Since I would really like to use both my forward and reverse reads, I would much prefer to use dada2 denoise-paired if at all possible.

Commands run

qiime tools import --type MultiplexedPairedEndBarcodeInSequence --input-path sequences --output-path paired-multiplexed.qza --input-format MultiplexedPairedEndBarcodeInSequenceDirFmt

qiime cutadapt demux-paired --i-seqs paired-multiplexed.qza --m-forward-barcodes-file barcodemap.txt --m-forward-barcodes-column BarcodeSequence --o-per-sample-sequences demultiplexed-seqs.qza --o-untrimmed-sequences untrimmed.qza

qiime cutadapt trim-paired --i-demultiplexed-sequences demultiplexed-seqs.qza --p-front-f GTAAAACGACGGCCAG --o-trimmed-sequences demux.qza --verbose

qiime demux summarize --i-data demux.qza --o-visualization demux.qzv

qiime dada2 denoise-paired --i-demultiplexed-seqs demux.qza --p-trim-left-f 13 --p-trim-left-r 13 --p-trunc-len-f 190 --p-trunc-len-r 190 --o-table table.qza --o-representative-sequences rep-seqs.qza --o-denoising-stats denoising-stats.qza --verbose

qiime dada2 denoise-single --i-demultiplexed-seqs demux.qza --p-trunc-len 190 --p-n-threads 0 --o-table table.qza --o-representative-sequences rep-seqs.qza --o-denoising-stats denoising-stats.qza --verbose

Error message

~/vice$ qiime dada2 denoise-single --i-demultiplexed-seqs demux.qza --p-trunc-len 190 --p-n-threads 0 --o-table table.qza --o-representative-sequences rep-seqs.qza --o-denoising-stats denoising-stats.qza --verbose
Running external command line application(s). This may print messages to stdout and/or stderr.
The command(s) being run are below. These commands cannot be manually re-run as they will depend on temporary files that no longer exist.

Command: run_dada_single.R /var/tmp/q2-SingleLanePerSampleSingleEndFastqDirFmt-c9jltyfr /var/tmp/tmpp1yud6sq/output.tsv.biom /var/tmp/tmpp1yud6sq/track.tsv /var/tmp/tmpp1yud6sq 190 0 2.0 2 Inf consensus 1.0 0 1000000 NULL 16

Fatal error: cannot create 'R_TempDir'
Traceback (most recent call last):
File "/opt/conda/envs/qiime2-2018.11/lib/python3.5/site-packages/q2_dada2/_denoise.py", line 152, in _denoise_single
run_commands([cmd])
File "/opt/conda/envs/qiime2-2018.11/lib/python3.5/site-packages/q2_dada2/_denoise.py", line 36, in run_commands
subprocess.run(cmd, check=True)
File "/opt/conda/envs/qiime2-2018.11/lib/python3.5/subprocess.py", line 398, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['run_dada_single.R', '/var/tmp/q2-SingleLanePerSampleSingleEndFastqDirFmt-c9jltyfr', '/var/tmp/tmpp1yud6sq/output.tsv.biom', '/var/tmp/tmpp1yud6sq/track.tsv', '/var/tmp/tmpp1yud6sq', '190', '0', '2.0', '2', 'Inf', 'consensus', '1.0', '0', '1000000', 'NULL', '16']' returned non-zero exit status 2

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/opt/conda/envs/qiime2-2018.11/lib/python3.5/site-packages/q2cli/commands.py", line 274, in call
results = action(**arguments)
File "", line 2, in denoise_single
File "/opt/conda/envs/qiime2-2018.11/lib/python3.5/site-packages/qiime2/sdk/action.py", line 231, in bound_callable
output_types, provenance)
File "/opt/conda/envs/qiime2-2018.11/lib/python3.5/site-packages/qiime2/sdk/action.py", line 362, in callable_executor
output_views = self._callable(**view_args)
File "/opt/conda/envs/qiime2-2018.11/lib/python3.5/site-packages/q2_dada2/_denoise.py", line 187, in denoise_single
band_size='16')
File "/opt/conda/envs/qiime2-2018.11/lib/python3.5/site-packages/q2_dada2/_denoise.py", line 159, in _denoise_single
" filter." % trunc_len)
ValueError: No reads passed the filter. trunc_len (190) may be longer than read lengths, or other arguments (such as max_ee or trunc_q) may be preventing reads from passing the filter.

Plugin error from dada2:

No reads passed the filter. trunc_len (190) may be longer than read lengths, or other arguments (such as max_ee or trunc_q) may be preventing reads from passing the filter.

See above for debug info.

Hi!
Can you repeat your qiime dada2 denoise-paired command, adding and playing with
--p-max-ee (a value less than 2.0)
or
--p-trunc-q (an integer greater than 2)?
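
For example, starting from your original denoise-paired command and just adding those two parameters (the values below are only starting points to experiment with, not recommendations):

qiime dada2 denoise-paired --i-demultiplexed-seqs demux.qza --p-trim-left-f 13 --p-trim-left-r 13 --p-trunc-len-f 190 --p-trunc-len-r 190 --p-max-ee 1.0 --p-trunc-q 5 --o-table table.qza --o-representative-sequences rep-seqs.qza --o-denoising-stats denoising-stats.qza --verbose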


@mjoi15,
What is the amplicon length you expect? What primers/gene target are you using?

Note that you do not need to truncate forward and reverse at the same positions — this may give you better overlap if you have long amplicons.

Is the plot you shared from this visualization? Could you please share the QZV file here?


This is 16S Illumina MiSeq data. The environmental samples were amplified using 341F/785R primers, producing 450 bp amplicons.

demux.qzv (301.7 KB)

Hello,
Thanks for the suggestion! I tried adding both parameters individually, setting --p-max-ee to 1.0 and to 0.5, and --p-trunc-q to 10, 20, and 30.

Unfortunately the same error was produced every time. :confused:

dada2 requires a minimum 20 nt overlap to successfully merge, so 190 + 190 will fail. 250 + 250 will also fail, because the reverse reads are lower quality and you would be allowing too much low-quality sequence by truncating at 250. Try 270 forward + 210 reverse and see how that performs.
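
Back-of-the-envelope: with a ~450 bp amplicon, truncating at 270 + 210 leaves roughly 270 + 210 - 450 = 30 nt of overlap (give or take the trimmed primer bases), which clears the 20 nt minimum. As a sketch, keeping the rest of your original command unchanged:

qiime dada2 denoise-paired --i-demultiplexed-seqs demux.qza --p-trim-left-f 13 --p-trim-left-r 13 --p-trunc-len-f 270 --p-trunc-len-r 210 --o-table table.qza --o-representative-sequences rep-seqs.qza --o-denoising-stats denoising-stats.qza --verbose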

HOWEVER, merging reads is clearly not the problem here, since denoise-single is also failing! Something else is wrong.

One possibility: I notice that your forward reads are a range of different lengths. This is surely because you are using cutadapt trim-paired to trim adapter sequences. That should not cause a problem (I do this all the time in my own work), but I am suspicious since denoise-single is failing. What happens if you run denoise-single on the untrimmed sequences? You can use the --p-trim-left parameter to trim off any adapter/primer that you expect on the 5' end of the forward reads. You should not expect to find adapter sequences anywhere else in the reads anyway, since this is 16S data without massive length variation, so read-through (your reads running right through the reverse primer/adapter) should not be an issue. Right?
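
A minimal sketch of that, assuming your pre-trim demultiplexed-seqs.qza stands in for the untrimmed reads and that the 16 nt adapter you removed with cutadapt (GTAAAACGACGGCCAG) is what sits at the 5' end of the forward reads (the truncation length and output names here are just examples):

qiime dada2 denoise-single --i-demultiplexed-seqs demultiplexed-seqs.qza --p-trim-left 16 --p-trunc-len 190 --p-n-threads 0 --o-table table-untrimmed.qza --o-representative-sequences rep-seqs-untrimmed.qza --o-denoising-stats denoising-stats-untrimmed.qza --verbose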


@Nicholas_Bokulich
I tried the suggested 270 forward + 210 reverse truncation, as well as a couple of 10-increment changes at both positions, but unfortunately none of them worked.

I then tried running denoise-single on the untrimmed sequences. My untrimmed.qza is not the right semantic type, but I tried this on my demultiplexed-seqs.qza and again no reads passed.

Commands Used
qiime dada2 denoise-single --i-demultiplexed-seqs demultiplexed-seqs.qza --p-trunc-len 230 --p-trim-left 15 --p-n-threads 0 --o-table table-u.qza --o-representative-sequences rep-seqs-u.qza --o-denoising-stats denoising-stats-u.qza --verbose

Okay, this is very unusual, since your reads are very good quality and plenty long!

I think I have found the issue. Your log contains this line:

Fatal error: cannot create 'R_TempDir'

It looks like this is a permissions issue: you cannot write files to the default temp directory that dada2 is attempting to access. Chances are that if you ran this on your own computer everything would work, but since you are using a remote environment (CyVerse) this issue is occurring.

One way to get around this may be to define a different temporary directory to which you do have write permission.
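
As a rough sketch, assuming you have (or can create) a writable directory in your home area (the ~/q2-tmp name below is just a placeholder), you can point the TMPDIR environment variable at it before running the command; both Python and R take their temporary-file location from it:

mkdir -p ~/q2-tmp
export TMPDIR=~/q2-tmp
qiime dada2 denoise-paired --i-demultiplexed-seqs demux.qza --p-trim-left-f 13 --p-trim-left-r 13 --p-trunc-len-f 270 --p-trunc-len-r 210 --o-table table.qza --o-representative-sequences rep-seqs.qza --o-denoising-stats denoising-stats.qza --verbose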

You should probably get in contact with the system admin to discuss permissions and what temp directory to use.

