Q2 picrust2 error with the custom-tree-pipeline

Hello everyone,

I am trying to use the q2-picrust2 plugin with my data, following the steps in the q2-picrust2 tutorial. However, I am having some difficulty running the custom-tree-pipeline step.

This is what I have done so far:
I used the PICRUSt2 reference files to create a tree with the q2-fragment-insertion pipeline:
qiime fragment-insertion sepp \
  --i-representative-sequences rep_seqs.qza \
  --p-threads 1 \
  --i-reference-alignment reference.fna.qza \
  --i-reference-phylogeny reference.tre.qza \
  --output-dir bugz_placed_out
I assume this worked fine, since I ended up with the tree.qza and placements.qza artifacts in the bugz_placed_out folder.

I then proceeded to the qiime picrust2 custom-tree-pipeline:
qiime picrust2 custom-tree-pipeline \
  --i-table table.qza \
  --i-tree bugz_placed_out/tree.qza \
  --output-dir q2-picrust2-output \
  --p-threads 1 \
  --p-hsp-method mp \
  --p-max-nsti 2

However, after about 50 minutes I got this error:
Error running this command:
hsp.py -i KO -t /tmp/tmp96mo78e3/placed_seqs.tre -p 1 -o /tmp/tmp96mo78e3/picrust2_out/KO_predicted -m mp

I would greatly appreciate the time taken to help me.
Thanks in advance

Can you re-run this command with the --verbose flag added? Then please copy and paste the complete output here. Thanks!

Thank you @thermokarst for the prompt response!
Here is a screenshot of the output

It seems to be a memory issue. I wonder how much memory is needed.


Me too — I have no idea. cc @gmdouglas

Hi @Andrew_Bugz,

How many ASVs are you making predictions for? You shouldn't need more than 8 GB of RAM for typical datasets (e.g. of ~1,000 ASVs). For huge datasets you may need 16 GB of RAM or more, though (for instance, generating predictions for 40,000 ASVs usually takes around 9 GB of RAM).
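As a quick sanity check before launching the pipeline, you can see how much memory is actually available on a Linux machine (a minimal sketch, not from the tutorial; it assumes /proc/meminfo is present, i.e. Linux only):

```shell
# Report available memory in GB (values in /proc/meminfo are in kB)
awk '/MemAvailable/ {printf "%.1f GB available\n", $2/1024/1024}' /proc/meminfo

# To measure the peak memory of an actual run afterwards (GNU time, not the
# shell builtin), something like:
# /usr/bin/time -v qiime picrust2 custom-tree-pipeline ... 2>&1 | grep 'Maximum resident'
```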



Thank you @gmdouglas and @thermokarst
I have about 1,848 features with a total frequency of about 2,400,000. The problem is, I have not been able to increase my RAM beyond 5 GB.
I think I might have to split my dataset in some way. Do you think that might work, @gmdouglas?
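For what it's worth, one way such a split could be sketched (file names, batch size, and the placeholder IDs below are all assumptions, not from the tutorial): write each batch of feature IDs into a QIIME 2 metadata file, then filter the table per batch with qiime feature-table filter-features before running the pipeline on each piece.

```shell
# Demo only: stand-in feature IDs; in practice ids.txt would hold the real
# 1,848 feature IDs exported from the table.
printf 'ASV_%04d\n' $(seq 1 1848) > ids.txt

# Split into two batches of 924 IDs and prepend the metadata header QIIME
# expects ("feature-id").
split -l 924 ids.txt batch_
for f in batch_aa batch_ab; do
  { echo 'feature-id'; cat "$f"; } > "$f.tsv"
  rm "$f"
done

# Then, per batch (hypothetical file names; not run here):
# qiime feature-table filter-features \
#   --i-table table.qza \
#   --m-metadata-file batch_aa.tsv \
#   --o-filtered-table table_batch_aa.qza
# qiime picrust2 custom-tree-pipeline \
#   --i-table table_batch_aa.qza \
#   --i-tree bugz_placed_out/tree.qza \
#   --output-dir picrust2_batch_aa \
#   --p-threads 1 --p-hsp-method mp --p-max-nsti 2
```

Whether per-batch predictions are statistically sensible to merge afterwards is a separate question worth checking.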

Thank you very much @gmdouglas and @thermokarst for taking the time to help.

I managed to increase the RAM to 7 GB, and it worked after about 4 hours.

Thank you once again and have a pleasant day.