Classifying using Silva, memory error

Dear all,

First of all, I want to mention that my TMPDIR has 113 GB available and is set to /media/nec3_HDD. I have tried running the command with anywhere from 2 to 13 cores, and I am working on a machine with 65 GB of RAM.

This is my disk usage:

Filesystem Size Used Avail Use% Mounted on
udev 32G 0 32G 0% /dev
tmpfs 6.3G 116M 6.2G 2% /run
/dev/vda1 9.9G 2.5G 7.0G 26% /
tmpfs 32G 15G 18G 46% /dev/shm
tmpfs 5.0M 0 5.0M 0% /run/lock
tmpfs 32G 0 32G 0% /sys/fs/cgroup
/dev/vdc 493G 355G 113G 76% /media/nec3_HDD
/dev/vdb 473G 70M 449G 1% /mnt
tmpfs 6.3G 0 6.3G 0% /run/user/1000
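(The listing above is standard df -h output. For anyone checking the same things, these generic shell commands, not part of the original post, confirm where TMPDIR points and how much disk space and RAM are free:)

echo "$TMPDIR"     # should print /media/nec3_HDD in this setup
df -h "$TMPDIR"    # free space on the temporary directory
free -h            # total and available RAM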

I am running the following command:

qiime feature-classifier classify-sklearn \
  --i-classifier …/…/…/…/…/reference_datasets/16S/silva-132-99-nb-classifier.qza \
  --i-reads representative_sequences.qza \
  --output-dir taxonomy_silva \
  --p-n-jobs 2

This is part of the error:


/media/nec3_HDD/miniconda3/envs/qiime2-2018.8/lib/python3.5/site-packages/sklearn/pipeline.py in predict_proba(self=Pipeline(memory=None,
steps=[(‘feat_ext’, H…class_prior=None,
fit_prior=False)]]), X=(b’ATGTGCAAGTCTCAGGATTAGATACCCGTGTAGTCGCTGCGATGTG…TCAGCAGGAATGCCGAGACCGATCTCGTATGCCGTCTTCTGCTTGAAA’, b’AAGGAGTATACGTGACCTATGAACTCAGGAGTCCCTACGGGCGCTG…CGCTGCGATGTGCAAGTCTCAGTCCTAGTACGAGATCGGAAGAGCGGT’, b’TGGGGAATCTTGGACAATGGGGGCAACCCTGATCCAGCGACGCCGC…CACCGGGGAGGATGATGACGTTACCCGGAGAAGAAGCACCGGCTAACT’, b’ACTGCATAGTGACCTATGAACTCAGGAGTCCCTACGGGCGCTGCGA…CGGTTCAGCAGGAATGCCGAGACCGATCTCGTATGCCGTCTTCTGCTT’, b’TCGAGAATCTTCCGCAATGGACGAAAGTCTGACGGAGCGACGCCGC…CGTAGGGGAGGAAATTTTGACCGATCCTAGGAGGAAGCGCAGGCTAAG’, b’CTAAGCCTGTGACCTATGAACTCAGGAGTCCCTACGGGTAGTCGCT…TCGCTGCGATGTGCAAGTCTCAGAGCGTAGCAGATCGGAAGAGCGGTT’, b’TGGGGAATATTGCGCAATGGACGAAAGTCTGACGCAGCGACGCCGC…CAAGAGGGAAGAAACCTATCATGAATAATACTCATGGTAATTGACGGT’, b’TCGAGAATAATTCACAATGGGCGAAAGCCTGATGGTGCAACGCCGC…CATCAGGGAGTAAGACCTGGGTGTTAATAGCACACAGGGTTGATAGTA’, b’CTAAGCCTGTGACCTATGAACTCAGGAGTCAGGATTAGATACCCTG…GGTCCCTTGAGGACTTAGTGACGCAGCTAACGCAATAAGTAGACCGCC’, b’TCGAGAATCTTCGGCAATGGACGAAAGTCTGACCGAGCAATGCCGC…AGTGAGGAGGAAGGGCCCGTGCAGAGCGGGTCTTGACCGATCCACAGT’, b’TGAGGAATTTTGCGCAATGGGGGAAACCCTGACGCAGCAACGCCGC…CAACTGGGAAGAAATTACCATTATTTAACAGATGGTGGTATTGACGGT’, b’TCGGGAATTTTGGGCAATGGGCGAAAGCCTGACCCAGCAACGCCGC…AGAATAGGAAGAATAAATGACGGTACTATTTATAAGGTCCGGCTAACT’, b’TCGAGAATCTTCGGCAATGCGCGAAAGCGTGACCGAGCGATGCCGC…AGTTGGGAGGAAGGACCTGTGAAGAGCAGGTTTTGACCGATCTTCAGT’, b’CCGAAGTATGTGACCTATGAACTCAGGAGTCCCTACGGGTAGTCGC…GAAGAGCGGTTCAGCAGGAATGCCGAGACCGATCTCGTATGCCGTCTT’, b’TCGAGAATCTTCCGCAATGGACGCAAGTCTGACGGAGCGACGCCGC…CACGAGTTAAGAAAGGTGCAGCGTGAATAGCGTTGTATTTGACGTAAG’, b’ACTGCATAGTGACCTATGAACTCAGGAGTCCCTACGGGTAGTCGCT…TCGCTGCGATGTGCAAGTCTCAGGCTCAGGAAGATCGGAAGAGCGGTT’, b’TAACGAATATTCCGCAATGCGCGAAAGCGTGACGGAGCAATGCCGC…CAGGGTTTAGGAATCCATGACCAGACCCAAAGGAAGGACCGGCTAACT’, b’GAAGGAATATTGGGCAATGGGCGAAAGCCTGACCCAGCGACGCCGT…CACACGTTAGGAAAGTTGTATGGTTAATACCCATGCGAATTGACAAAG’, b’TCGAGAATCTTCGGCAATGGGCGCAAGCCTGACCGAGCGACGCCGC…CAGAGGGGAGGAAATGCCTGGTAACCCCAGGTTTGACCTATCCTCAGA’, b’GCGCGGAACCTTTACAATGCACGCAAGTGTGATAAGGGGATCCCAA…GTATCTTGGCGAATAAGTGGTGGGTAAGACATGTGCCAGCCGCCGCGG’, …))
352 """
353 Xt = X
354 for name, transform in self.steps[:-1]:
355 if transform is not None:
356 Xt = transform.transform(Xt)
--> 357 return self.steps[-1][-1].predict_proba(Xt)
self.steps.predict_proba = undefined
Xt = <2421x8192 sparse matrix of type '<class 'numpy… stored elements in Compressed Sparse Row format>
358
359 @if_delegate_has_method(delegate='_final_estimator')
360 def decision_function(self, X):
361 """Apply transforms, and decision_function of the final estimator


/media/nec3_HDD/miniconda3/envs/qiime2-2018.8/lib/python3.5/site-packages/sklearn/naive_bayes.py in predict_proba(self=LowMemoryMultinomialNB(alpha=0.001, chunk_size=20000, class_prior=None,
fit_prior=False), X=<2421x8192 sparse matrix of type '<class 'numpy… stored elements in Compressed Sparse Row format>)
99 C : array-like, shape = [n_samples, n_classes]
100 Returns the probability of the samples for each class in
101 the model. The columns correspond to the classes in sorted
102 order, as they appear in the attribute classes_.
103 """
--> 104 return np.exp(self.predict_log_proba(X))
self.predict_log_proba = <bound method BaseNB.predict_log_proba of LowMem…, class_prior=None,
fit_prior=False)>
X = <2421x8192 sparse matrix of type '<class 'numpy… stored elements in Compressed Sparse Row format>
105
106
107 class GaussianNB(BaseNB):
108 """


/media/nec3_HDD/miniconda3/envs/qiime2-2018.8/lib/python3.5/site-packages/sklearn/naive_bayes.py in predict_log_proba(self=LowMemoryMultinomialNB(alpha=0.001, chunk_size=20000, class_prior=None,
fit_prior=False), X=<2421x8192 sparse matrix of type '<class 'numpy… stored elements in Compressed Sparse Row format>)
79 C : array-like, shape = [n_samples, n_classes]
80 Returns the log-probability of the samples for each class in
81 the model. The columns correspond to the classes in sorted
82 order, as they appear in the attribute classes_.
83 """
--> 84 jll = self._joint_log_likelihood(X)
jll = undefined
self._joint_log_likelihood = <bound method MultinomialNB._joint_log_likelihoo…, class_prior=None,
fit_prior=False)>
X = <2421x8192 sparse matrix of type '<class 'numpy… stored elements in Compressed Sparse Row format>
85 # normalize by P(x) = P(f_1, …, f_n)
86 log_prob_x = logsumexp(jll, axis=1)
87 return jll - np.atleast_2d(log_prob_x).T
88


/media/nec3_HDD/miniconda3/envs/qiime2-2018.8/lib/python3.5/site-packages/sklearn/naive_bayes.py in _joint_log_likelihood(self=LowMemoryMultinomialNB(alpha=0.001, chunk_size=20000, class_prior=None,
fit_prior=False), X=<2421x8192 sparse matrix of type '<class 'numpy… stored elements in Compressed Sparse Row format>)
720 def _joint_log_likelihood(self, X):
721 """Calculate the posterior log probability of the samples X"""
722 check_is_fitted(self, "classes_")
723
724 X = check_array(X, accept_sparse='csr')
--> 725 return (safe_sparse_dot(X, self.feature_log_prob_.T) +
X = <2421x8192 sparse matrix of type '<class 'numpy… stored elements in Compressed Sparse Row format>
self.feature_log_prob_.T = memmap([[-10.6385582 , -10.63464125, -10.6255616…2.28372402,
-11.2530688 , -11.27937308]])
self.class_log_prior_ = array([-11.60209867, -11.60209867, -11.60209867,…-11.60209867,
-11.60209867, -11.60209867])
726 self.class_log_prior_)
727
728
729 class BernoulliNB(BaseDiscreteNB):


/media/nec3_HDD/miniconda3/envs/qiime2-2018.8/lib/python3.5/site-packages/sklearn/utils/extmath.py in safe_sparse_dot(a=<2421x8192 sparse matrix of type '<class 'numpy… stored elements in Compressed Sparse Row format>, b=memmap([[-10.6385582 , -10.63464125, -10.6255616…2.28372402,
-11.2530688 , -11.27937308]]), dense_output=False)
130 -------
131 dot_product : array or sparse matrix
132 sparse if a or b is sparse and dense_output=False.
133 """
134 if issparse(a) or issparse(b):
--> 135 ret = a * b
ret = undefined
a = <2421x8192 sparse matrix of type '<class 'numpy… stored elements in Compressed Sparse Row format>
b = memmap([[-10.6385582 , -10.63464125, -10.6255616…2.28372402,
-11.2530688 , -11.27937308]])
136 if dense_output and hasattr(ret, "toarray"):
137 ret = ret.toarray()
138 return ret
139 else:


/media/nec3_HDD/miniconda3/envs/qiime2-2018.8/lib/python3.5/site-packages/scipy/sparse/base.py in __mul__(self=<2421x8192 sparse matrix of type '<class 'numpy… stored elements in Compressed Sparse Row format>, other=memmap([[-10.6385582 , -10.63464125, -10.6255616…2.28372402,
-11.2530688 , -11.27937308]]))
402 # dense 2D array or matrix ("multivector")
403
404 if other.shape[0] != self.shape[1]:
405 raise ValueError('dimension mismatch')
406
--> 407 result = self._mul_multivector(np.asarray(other))
result = undefined
self._mul_multivector = <bound method _cs_matrix._mul_multivector of <24…stored elements in Compressed Sparse Row format>>
other = memmap([[-10.6385582 , -10.63464125, -10.6255616…2.28372402,
-11.2530688 , -11.27937308]])
408
409 if isinstance(other, np.matrix):
410 result = np.asmatrix(result)
411


/media/nec3_HDD/miniconda3/envs/qiime2-2018.8/lib/python3.5/site-packages/scipy/sparse/compressed.py in _mul_multivector(self=<2421x8192 sparse matrix of type '<class 'numpy… stored elements in Compressed Sparse Row format>, other=array([[-10.6385582 , -10.63464125, -10.62556166…2.28372402,
-11.2530688 , -11.27937308]]))
506 result = np.zeros((M,n_vecs), dtype=upcast_char(self.dtype.char,
507 other.dtype.char))
508
509 # csr_matvecs or csc_matvecs
510 fn = getattr(_sparsetools,self.format + '_matvecs')
--> 511 fn(M, N, n_vecs, self.indptr, self.indices, self.data, other.ravel(), result.ravel())
fn =
M = 2421
N = 8192
n_vecs = 109327
self.indptr = array([ 0, 115, 224, …, 296760, 296886, 297009], dtype=int32)
self.indices = array([ 13, 140, 172, …, 8052, 8057, 8166], dtype=int32)
self.data = array([ 0.08219949, 0.08219949, 0.08219949, …, 0.08703883,
0.08703883, 0.08703883])
other.ravel =
result.ravel =
512
513 return result
514
515 def _mul_sparse_matrix(self, other):

MemoryError:

If the full error is necessary, I can send the debug info file.

Hi @Fra,
Try setting --p-reads-per-batch to a lower value, maybe 2000. If that does not work, lower it further and/or set --p-n-jobs to 1.

The SILVA classifier takes a lot of memory, so you need to reduce the number of reads per batch so that you do not overload your RAM. Adding more parallel jobs (without reducing reads per batch) will only exacerbate the problem.
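For example, something like this (an untested sketch: the classifier path is copied from your command above, and the output directory name is just a placeholder, so point it at any fresh directory):

qiime feature-classifier classify-sklearn \
  --i-classifier …/…/…/…/…/reference_datasets/16S/silva-132-99-nb-classifier.qza \
  --i-reads representative_sequences.qza \
  --output-dir taxonomy_silva_batched \
  --p-reads-per-batch 2000 \
  --p-n-jobs 1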

Good luck!

Many thanks Nicholas, it worked.

