error in qiime dada2 denoise-paired command

Hello Qiime2 team,
I am using qiime2-2019.10 and got the following error when running the qiime dada2 denoise-paired command:

Below is the complete error output:
Command:
qiime dada2 denoise-paired --i-demultiplexed-seqs demux-paired-end.qza --p-trim-left-f 0 --p-trim-left-r 0 --p-trunc-len-f 245 --p-trunc-len-r 245 --o-table table.qza --o-representative-sequences rep-seqs.qza --o-denoising-stats denoising-stats.qza --p-n-threads 0 --verbose

Running external command line application(s). This may print messages to stdout and/or stderr.
The command(s) being run are below. These commands cannot be manually re-run as they will depend on temporary files that no longer exist.

Command: run_dada_paired.R /tmp/tmpcqewjq9m/forward /tmp/tmpcqewjq9m/reverse /tmp/tmpcqewjq9m/output.tsv.biom /tmp/tmpcqewjq9m/track.tsv /tmp/tmpcqewjq9m/filt_f /tmp/tmpcqewjq9m/filt_r 248 248 0 0 2.0 2.0 2 consensus 1.0 0 1000000

R version 3.5.1 (2018-07-02)
Loading required package: Rcpp
DADA2: 1.10.0 / Rcpp: 1.0.2 / RcppParallel: 4.4.4

1) Filtering
Error in sendMaster(try(lapply(X = S, FUN = FUN, …), silent = TRUE)) :
  write error, closing pipe to the master
Error in names(answer) <- names1 :
  'names' attribute [104] must be the same length as the vector [102]
Execution halted
Traceback (most recent call last):
  File "/home/scebmeta/bin/miniconda3/envs/qiime2-2019.10/lib/python3.6/site-packages/q2_dada2/_denoise.py", line 257, in denoise_paired
    run_commands([cmd])
  File "/home/scebmeta/bin/miniconda3/envs/qiime2-2019.10/lib/python3.6/site-packages/q2_dada2/_denoise.py", line 36, in run_commands
    subprocess.run(cmd, check=True)
  File "/home/scebmeta/bin/miniconda3/envs/qiime2-2019.10/lib/python3.6/subprocess.py", line 418, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['run_dada_paired.R', '/tmp/tmpcqewjq9m/forward', '/tmp/tmpcqewjq9m/reverse', '/tmp/tmpcqewjq9m/output.tsv.biom', '/tmp/tmpcqewjq9m/track.tsv', '/tmp/tmpcqewjq9m/filt_f', '/tmp/tmpcqewjq9m/filt_r', '248', '248', '0', '0', '2.0', '2.0', '2', 'consensus', '1.0', '0', '1000000']' returned non-zero exit status 1.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/scebmeta/bin/miniconda3/envs/qiime2-2019.10/lib/python3.6/site-packages/q2cli/commands.py", line 328, in call
    results = action(**arguments)
  File "</home/scebmeta/bin/miniconda3/envs/qiime2-2019.10/lib/python3.6/site-packages/decorator.py:decorator-gen-463>", line 2, in denoise_paired
  File "/home/scebmeta/bin/miniconda3/envs/qiime2-2019.10/lib/python3.6/site-packages/qiime2/sdk/action.py", line 240, in bound_callable
    output_types, provenance)
  File "/home/scebmeta/bin/miniconda3/envs/qiime2-2019.10/lib/python3.6/site-packages/qiime2/sdk/action.py", line 383, in callable_executor
    output_views = self._callable(**view_args)
  File "/home/scebmeta/bin/miniconda3/envs/qiime2-2019.10/lib/python3.6/site-packages/q2_dada2/_denoise.py", line 272, in denoise_paired
    " and stderr to learn more." % e.returncode)
Exception: An error was encountered while running DADA2 in R (return code 1), please inspect stdout and stderr to learn more.

Plugin error from dada2:

An error was encountered while running DADA2 in R (return code 1), please inspect stdout and stderr to learn more.

See above for debug info.


Please help me solve this issue.
Thanks in advance,
Khemlal

Hi @khemlalnirmalkar,

I would suggest switching to a more recent version of QIIME 2, if possible.
Still, I would not expect that alone to fix your error, which is most likely a memory issue, as discussed in: Denoising error, names attribute must be the same length as vector

That is, reducing the number of threads you are using should fix the problem!
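For example, you could rerun the original command with an explicit, modest thread count instead of `--p-n-threads 0` (which tells q2-dada2 to use all available cores). The value 4 below is an arbitrary illustration, not a recommendation; pick something your machine's memory can support:

```shell
# Same command as above, but with a fixed small thread count.
# --p-n-threads 0 means "use all cores", which can exhaust memory on
# hosts with many cores; a small explicit value avoids that.
qiime dada2 denoise-paired \
  --i-demultiplexed-seqs demux-paired-end.qza \
  --p-trim-left-f 0 \
  --p-trim-left-r 0 \
  --p-trunc-len-f 245 \
  --p-trunc-len-r 245 \
  --p-n-threads 4 \
  --o-table table.qza \
  --o-representative-sequences rep-seqs.qza \
  --o-denoising-stats denoising-stats.qza \
  --verbose
```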
Cheers


This is the bulk of the error message. It is a bit cryptic, but it indicates that you have over-subscribed your host machine: you need to specify fewer threads when running q2-dada2.


Yes, thanks @thermokarst and @llenzi!
With fewer threads, it worked.