Looking for Pre-trained Silva Classifier (V1-V3)

Hello everybody,

Does anyone know where I could get the SILVA V1-V3 pre-trained classifier? I have not found this classifier for V1-V3 regions (27F and 534R primers).

I tried to train one myself, but unfortunately QIIME 2 came back with a `Killed: 9` error, which is due to lack of memory. I also tried using the SILVA 138 full-length classifier, but it returned the same `Killed: 9` error.

Thanks :slight_smile:


Hi Aline,

So, I trained a classifier for this region just a few weeks ago from the SILVA 138-99 full-length sequences (2020.8), using the primers 27FYM and 519R. I do want to let you know, though, that I had some issues with classifications for this region, which we ultimately think stemmed from subpar DNA extraction/amplification methods.

You can find the post here if you would like more info. But you're welcome to try this classifier if it could work for your purposes while waiting for someone else to respond.

Here are the SILVA links used to train the classifier:
wget https://data.qiime2.org/2020.8/common/silva-138-99-seqs.qza
wget https://data.qiime2.org/2020.8/common/silva-138-99-tax.qza
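For anyone who does want to retrain for V1-V3 themselves, a rough sketch of the process is to extract the target region from those two files and then fit a naive Bayes classifier. The primer sequences below are the commonly cited 27F/534R sequences, and the output filenames are just examples; verify both against your own protocol before using them.

```shell
# Commonly cited primer sequences for 27F / 534R -- verify against your protocol
F_PRIMER="AGAGTTTGATCMTGGCTCAG"   # 27F
R_PRIMER="ATTACCGCGGCTGCTGG"      # 534R

# Guarded so this is a no-op on machines without QIIME 2 on the PATH
if command -v qiime >/dev/null 2>&1; then
  # Trim the full-length reference sequences down to the V1-V3 amplicon
  qiime feature-classifier extract-reads \
    --i-sequences silva-138-99-seqs.qza \
    --p-f-primer "$F_PRIMER" \
    --p-r-primer "$R_PRIMER" \
    --o-reads silva-138-99-seqs-v1v3.qza

  # Fit the naive Bayes classifier on the trimmed reads (the memory-hungry step)
  qiime feature-classifier fit-classifier-naive-bayes \
    --i-reference-reads silva-138-99-seqs-v1v3.qza \
    --i-reference-taxonomy silva-138-99-tax.qza \
    --o-classifier silva-138-99-nb-v1v3-classifier.qza
fi
```

Trimming the reference to the amplicon region first shrinks it substantially, so training this way usually needs less memory than training on the full-length sequences.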

Also, the .qza file is too large to upload here, but if it would work for you, I can send you a Google Drive link to download it, if you feel comfortable sending me a direct message.

Wishing you all the best.


Hey el502,

Thank you so much for your kind help. It will definitely be very useful to run my analysis with your pre-trained classifier. I'll send you a direct message right now!

Thanks again for your help :slight_smile: :clap:

Greetings!!


@vetalinesantana

It looks like you will want to make sure to use QIIME 2 2020.8; otherwise, the scikit-learn versions won't match and you will get a version-mismatch error.
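A quick way to check what your active environment is running before using a pre-trained classifier is the sketch below (guarded so it is a no-op if the QIIME 2 conda environment is not active on this machine):

```shell
# Print the active QIIME 2 release and the scikit-learn version it ships with;
# a classifier must have been trained under the same scikit-learn version
if command -v qiime >/dev/null 2>&1; then
  qiime info
  python -c "import sklearn; print(sklearn.__version__)"
fi
```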


Hey Keegan,

Thank you for your help. I just updated to 2021.8, but the error persists, apparently because I don't have sufficient space on my computer :frowning:

I tried this command:

(qiime2-2021.8) Aline-MBP:Analysis_Silva_full_length vetalinesantana$ qiime feature-classifier classify-sklearn --i-classifier silva-138-99-nb-classifier.qza --i-reads rep-seqs.qza --o-classification taxonomy.qza

I got this error:

 (1/2) Invalid value for '--i-classifier': There was not enough space left on

'/tmp' to extract the artifact 'silva-138-99-nb-classifier.qza'. (Try
setting $TMPDIR to a directory with more space, or increasing the size of
'/tmp')
(2/2) Invalid value for '--i-reads': There was not enough space left on
'/tmp' to extract the artifact 'rep-seqs.qza'. (Try setting $TMPDIR to a
directory with more space, or increasing the size of '/tmp')

I tried to create a TMPDIR (following No space left on device classifier TMPDIR - #3 by thermokarst) and got this error:

(qiime2-2021.8) Aline-MBP:Analise_Silva_full_length vetalinesantana$ mkdir /data

mkdir: /data: Read-only file system

(qiime2-2021.8) Aline-MBP:Analise_Silva_full_length vetalinesantana$ export TMPDIR='/data'

(qiime2-2021.8) Aline-MBP:Analise_Silva_full_length vetalinesantana$ echo $TMPDIR

/data

(1/2) Invalid value for '--i-classifier': There was not enough space left on
'/tmp' to extract the artifact 'silva-138-99-nb-classifier.qza'. (Try
setting $TMPDIR to a directory with more space, or increasing the size of
'/tmp')
(2/2) Invalid value for '--i-reads': There was not enough space left on
'/tmp' to extract the artifact 'rep-seqs.qza'. (Try setting $TMPDIR to a
directory with more space, or increasing the size of '/tmp')

Do you know the minimum space to be able to perform my analysis with SILVA full length classifier? These are my computer settings:

(qiime2-2021.8) Aline-MBP:Analise_Silva_full_length vetalinesantana$ df -h
Filesystem                                                      Size   Used  Avail Capacity iused      ifree %iused  Mounted on
/dev/disk1s5s1                                                 113Gi   14Gi   12Gi    56%  553788 1181264652    0%   /
devfs                                                          189Ki  189Ki    0Bi   100%     654          0  100%   /dev
/dev/disk1s4                                                   113Gi  3.0Gi   12Gi    21%       3 1181818437    0%   /System/Volumes/VM
/dev/disk1s2                                                   113Gi  520Mi   12Gi     5%    2534 1181815906    0%   /System/Volumes/Preboot
/dev/disk1s6                                                   113Gi  5.3Mi   12Gi     1%      18 1181818422    0%   /System/Volumes/Update
/dev/disk1s1                                                   113Gi   83Gi   12Gi    88%  691200 1181127240    0%   /System/Volumes/Data
map auto_home                                                    0Bi    0Bi    0Bi   100%       0          0  100%   /System/Volumes/Data/home
/Users/vetalinesantana/Downloads/FannyWidget-v2.3.0/Fanny.app  113Gi   63Gi   30Gi    68%  718654 1181099786    0%   /private/var/folders/rc/h6g74q8j7b52b1d7drc57lhw0000gn/T/AppTranslocation/A60CA238-FB61-4B8E-9BCD-5495387BAEE5

Thanks!!!

@vetalinesantana

It looks like you know this, but I want to make sure that it is really clear for future readers. You are encountering two completely different errors.

  1. Not having enough memory to train the classifier (hopefully resolved by @el502 providing a pre-trained classifier for you).

  2. Not having enough temp directory space to perform the analysis.

Let's work on getting the second issue worked out!

It looks like you are on the right track to solving this by trying to create a new temp directory to use for the analysis. However, it looks like you do not have write permission for the location where you are trying to create it.

You simply need to create and export a directory in a location that you do have read/write permissions for.

mkdir /Users/yourusername/qiime_tmp_dir
export TMPDIR='/Users/yourusername/qiime_tmp_dir'

Here you can find lots of other examples of people having this same issue. And of course, if you're still having trouble with it, hop back on here and we'll get it sorted :slightly_smiling_face:

At minimum, make sure that the disk/partition your temp directory is on has more free space than the size of your input files.
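Putting that together, a minimal sketch (assuming a Bash shell and a home directory you can write to; the directory name `qiime_tmp` is just an example) looks like:

```shell
# Create a temp directory somewhere you have write permission
TMP_BASE="$HOME/qiime_tmp"
mkdir -p "$TMP_BASE"

# Point QIIME 2 (and anything else that honors $TMPDIR) at it
export TMPDIR="$TMP_BASE"

# Confirm the setting and check how much space is free on that disk
echo "$TMPDIR"
df -h "$TMPDIR"
```

Note that `export` only affects the current shell session, so this needs to be rerun (or added to your shell profile) before each analysis.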


Thank you again for your clarification!!