Using a GPU with QIIME 2 on Linux - CUDA/Python?

I have been advised to consider using GPU computing. We are ready to purchase the relevant hardware, but I would like to better understand whether any special compilation is needed to take advantage of a GPU when running QIIME 2 on a Linux machine. We are not experts, and would appreciate any advice and explanations on the subject. Thanks!

Hi @shira!

We aren't GPU experts, either! Since QIIME 2 is primarily a framework that wraps existing tools (not exclusively, but generally), not much "in" the QIIME 2 framework or codebase has any need for GPU acceleration. As for the third-party tools exposed via QIIME 2 plugins, I personally can't think of any that can leverage a GPU.

TL;DR: QIIME 2 generally won't be able to take advantage of GPUs (again, generally speaking, not as a hard rule).

Thanks for your answer @thermokarst. Anyone using QIIME 2 locally rather than on a server is aware of a few fairly compute-heavy steps, such as denoising and classification. Depending on the platform and the power of the computer, these steps can take many hours, up to a few days, or in some cases never complete at all. Is it your understanding that a GPU accelerator would not make a difference when running plugins such as dada2 or classify-sklearn? Alternatively, would you argue that it could be rather complicated to find and install the appropriate compiler, and not worth the trouble? If at all possible, I would be happy to give it a try and document the difference for the benefit of the QIIME 2 community.

Correct: my understanding is that a GPU accelerator would not make a difference for plugins like dada2 or classify-sklearn.

Yes, it would likely be more trouble than it's worth. Even if you do get a compiler up and running, it seems unlikely to me that the dozens (and dozens) of QIIME 2 dependencies would be compatible with (or even just play nicely with) the new GPU-enabled runtime.

Personally, I would focus on investing in a system with plenty of individual CPU cores and lots of memory, and not worry about anything "novel". While GPU computing is certainly gaining traction, it is still pretty niche, IMO; I suspect you would be hard-pressed to find enough tools in the arena of microbiome bioinformatics to make that kind of investment worthwhile.
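
To make the most of those CPU cores, it's worth using the parallelization options the plugins already expose. Here's a minimal sketch for the two steps you mentioned (these flags exist in recent QIIME 2 releases, but double-check `--help` for your version; all of the `.qza` file names below are placeholders):

```bash
# DADA2 denoising: --p-n-threads 0 means "use all available cores"
qiime dada2 denoise-paired \
  --i-demultiplexed-seqs demux.qza \
  --p-trunc-len-f 250 \
  --p-trunc-len-r 200 \
  --p-n-threads 0 \
  --o-table table.qza \
  --o-representative-sequences rep-seqs.qza \
  --o-denoising-stats denoising-stats.qza

# Taxonomy classification: --p-n-jobs -1 means "use all available cores"
qiime feature-classifier classify-sklearn \
  --i-classifier classifier.qza \
  --i-reads rep-seqs.qza \
  --p-n-jobs -1 \
  --o-classification taxonomy.qza
```

One caveat: in my experience, classify-sklearn's memory use grows with the number of parallel jobs, so "lots of memory" matters just as much as core count for that step.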


Thanks a lot @thermokarst - I appreciate your answer. I will give up on the GPU dream for now :slight_smile:

