Welcome to the forum, @Carlmed00!
Carlmed00:
However, when I tried it using SILVA instead, it runs for quite some time until I get an error (picture attached in this thread). I suspect this is a memory issue; however, it seems to occur only when I use SILVA. I am unsure what I might have done wrong, since it works with Greengenes and the only part of the script I alter is the classifier file.
You are correct, that's a memory error. SILVA is ~4X as large as Greengenes, so it takes much more memory and time to use. See this post for some options to reduce the memory load:
This is a memory error; essentially you do not have enough memory to open the SILVA classifier on your computer.
This is low for the SILVA classifier; it will often use 32 GB or more if left to its own devices!
There are a couple of things you might be able to do.
First, you can use the --p-reads-per-batch parameter (e.g., set to 1000 or 2000) to reduce the number of reads classified at a time. However, it looks like you do not have enough memory to load the SILVA classifier itself, not the reads, s…
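For reference, here is a minimal sketch of that first option. The classifier, reads, and output file names are placeholders for your own artifacts; `--p-n-jobs` is an additional classify-sklearn parameter not mentioned above, and keeping it at 1 avoids holding multiple copies of the classifier in memory at once.

```bash
# Minimal sketch: classify in smaller batches to reduce peak memory.
# File names are placeholders for your own artifacts.
qiime feature-classifier classify-sklearn \
  --i-classifier silva-classifier.qza \
  --i-reads rep-seqs.qza \
  --p-reads-per-batch 1000 \
  --p-n-jobs 1 \
  --o-classification taxonomy.qza
```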
Carlmed00:
While running the taxonomic classification script using SILVA, I notice that it eats up a large chunk of disk space. Since it generates an error, there is no output file, and I am unsure how to clear that space up. It seems it is not stored in my cache. I am just worried that in my future analyses it will eat up space that I won't know where to locate and clear.
Any temp files should automatically be cleared when the job completes, but let us know if you're still experiencing a problem.
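If you want those temp files somewhere you can always find them, one option is to point the standard TMPDIR environment variable at a directory of your choosing before running the command; QIIME 2 generally writes its temp files via Python's tempfile module, which honors TMPDIR. A minimal sketch, assuming a bash shell (all paths are placeholders):

```bash
# Minimal sketch: keep QIIME 2's temp files in a known location.
# Paths are placeholders; pick a disk with plenty of free space.
mkdir -p /big/disk/qiime2-tmp
export TMPDIR=/big/disk/qiime2-tmp

qiime feature-classifier classify-sklearn \
  --i-classifier silva-classifier.qza \
  --i-reads rep-seqs.qza \
  --o-classification taxonomy.qza

# If a run fails partway through, any leftovers are easy to locate and remove:
rm -rf /big/disk/qiime2-tmp/*
```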
I hope that helps!