An error was encountered while running DADA2 in R (return code -9), please inspect stdout and stderr to learn more.

Dear everyone,
I am trying to use QIIME 2 to process my microbiome sequences, but I got stuck at the DADA2 step for merging and denoising. Here are my screenshot and debug log file. I really need help, crying…


debug-return_error_code_9.txt (2.6 KB)

Welcome to the forum, @1111!
I can try to help, but will need some more details. Would you mind sharing the logfile mentioned in your error message? Is this a native QIIME 2 installation, or is there anything special about your computing environment (e.g. are you using a virtual machine, a shared high-performance system, or anything out of the ordinary)?

The only other thing that jumps out at me (and this is a longshot) - it looks like there’s a funny line break after --i-demul. The error you’re getting doesn’t look like a normal QIIME 2 syntax error, but is there any chance that line break is causing you problems?
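For anyone hitting the same issue: long QIIME 2 commands are usually split across lines with a trailing backslash so the shell treats them as one command. A sketch of what that looks like (all file names and truncation values here are placeholders, not taken from the screenshot):

```shell
# Illustrative only: a trailing backslash continues the command on the
# next line. A line break WITHOUT the backslash splits the command in two.
qiime dada2 denoise-paired \
  --i-demultiplexed-seqs demux.qza \
  --p-trunc-len-f 240 \
  --p-trunc-len-r 200 \
  --o-representative-sequences rep-seqs.qza \
  --o-table table.qza \
  --o-denoising-stats stats.qza
```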

Best,
Chris :honeybee:

@ChrisKeefe thanks very much! I attached the debug logfile under my picture, named debug-return_error_code_9.
This is a native installation on Ubuntu 16.04. Hmm, as for the computing environment, I don't notice anything special; maybe the Python version is 3.6.7? I only upgraded my Python version…
The line break in --i-demultiplexed may have been caused by the screenshot; I checked my command and the spelling is correct.
Is there a possibility that my small amount of RAM led to this problem?

Sorry I missed that logfile in the first pass, @1111! Your instinct is probably right - there’s a very good chance that this is a memory issue. We see sigkill signals like this often when the system’s RAM is overloaded, and the OS sends a sigkill in an effort to fix the situation. :expressionless:

As a general note - don’t update python within QIIME 2 conda environments - sometimes a new version introduces breaking changes. Assuming you’re running this in a conda environment, though, updating your global Python version shouldn’t affect anything, and python 3.6 works great with QIIME 2 anyway.

How much RAM does your system have available? Do you have access to an institutional compute cluster, or a machine with more memory you could try running this on? Unless your data set is very large, DADA2 should only take a couple of hours. Some other processes downstream may be similarly memory-intensive, but many of them come with parameters you can use to “chunk” the data to reduce memory usage in exchange for increased run time.
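If you're not sure how much memory is actually free while the job runs, the standard Linux tools will tell you (assuming a typical Ubuntu install):

```shell
# Show total/used/available RAM and swap in human-readable units
free -h

# Show free disk space per filesystem (a full disk can also kill jobs)
df -h
```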

Best,
Chris :sheep:


Thanks a lot, it is nice to hear your reply! My computer has just 4 GB of RAM, and I checked my target data: the .fq files are about 1.3 GB… so it may be unfeasible to process on my own computer?
Unfortunately, our laboratory doesn't have access to a bigger compute cluster. Would you mind telling me the minimum RAM needed to run this data on a personal computer? Thanks again! :smiley: :smiley:


Unfortunately, @1111, I can't. :crying_cat_face: Estimating required RAM depends on:

  • data size
  • data complexity
  • other processes you have running
  • probably other stuff I’m not thinking of.

Your data set isn’t huge, so trial and error will probably get you there. Before you go buying more RAM, you should know that, though out-of-memory errors are common, sigkill errors could happen for other reasons too: maybe the system went to sleep, rebooted, ran out of storage space, or something.

Some things to try:

  • make sure your machine doesn’t suspend/sleep/hibernate while you’re running the command. This could cause a sigkill
  • shut down any extra processes when you run denoise-paired. Firefox is great, but will take up memory you may need.
  • try adjusting --p-n-reads-learn to a smaller number. Maybe 100,000? This parameter constrains the size of the data set used for training the error model, and running at a lower number should reduce memory required in exchange for some loss in model quality.
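Concretely, that last suggestion would look something like this; everything except --p-n-reads-learn is a placeholder you'd keep from your original command:

```shell
# Sketch: lower --p-n-reads-learn so the error-model training step
# uses less memory, at some cost in model quality. Other values are
# placeholders, not recommendations.
qiime dada2 denoise-paired \
  --i-demultiplexed-seqs demux.qza \
  --p-trunc-len-f 240 \
  --p-trunc-len-r 200 \
  --p-n-reads-learn 100000 \
  --o-representative-sequences rep-seqs.qza \
  --o-table table.qza \
  --o-denoising-stats stats.qza
```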

In a pinch, you could also rent a computer on the interwebs and run your analysis there. Setup would be a little hassle, but they can be quite cheap, and the time savings/convenience of freeing up your computer could make it worthwhile.


Thanks for your kind advice! I will try it again later.
I am sorry to bother you, but I have another question about this step… When DADA2 failed, I switched to Deblur for denoising and ASVs. It's weird that Deblur takes much less time to finish, just about half an hour. Hmm, unbelievable :sleeping:


Yeah, they’re very different methods, so that doesn’t surprise me a whole lot. Until early 2019, it was not uncommon for DADA2 runs to take multiple days. The error correction and quality filtering DADA2 provides still made it worth the time for the analyses I was working on, though. The DADA2 developers have done a great job increasing efficiency, and some 2-day jobs are now running for me in an hour and change. Good luck!


Huh, I finally finished this long step. After browsing related topics, I found that one solution is to keep the imported demuxed data next to the original fastq files; once the demux.qza file sits in the same directory as the fastq files, it works!