Error in Taxonomy Analysis

I got this error and have not been able to solve it:

qiime feature-classifier classify-sklearn \
  --i-classifier gg-13-8-99-515-806-nb-classifier.qza \
  --i-reads rep-seqs.qza \
  --o-classification taxonomy.qza

(I used the SILVA classifier instead of the gg one, and my rep-seqs came from the dada2 plugin.)

What is the problem?

:pray:

Hey there @Mehrdad, you have several typos in your command. What you wrote is this:

qiime feature-classifier classify-sklearn \
> >  --i-classifier silva-132-99-nb-classifier.qza \
> > --i-reads RepresenDenoisedLibA.qza \
> > --o-classification taxonomylibA.qza

Note the 3 extra > characters at the beginning of lines 2, 3, and 4. You should not include those:

qiime feature-classifier classify-sklearn \
 --i-classifier silva-132-99-nb-classifier.qza \
 --i-reads RepresenDenoisedLibA.qza \
 --o-classification taxonomylibA.qza
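
As a side note, the single > you normally see at the start of a continuation line is printed by the shell itself (its continuation prompt) whenever the previous line ends in a backslash; it is not something you type. A rough sketch of what the terminal looks like while you enter the command (only the text after each > is typed by you):

qiime feature-classifier classify-sklearn \
>   --i-classifier silva-132-99-nb-classifier.qza \
>   --i-reads RepresenDenoisedLibA.qza \
>   --o-classification taxonomylibA.qza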

Hi Matthew,
The terminal always shows that (>) sign on continuation lines, but you are right, I had three extra ones. First of all, thank you.
I put in the correct form and then got an error.

I have finished the dada2 step. Based on the suggestions I am following the next steps, and I started with the taxonomy analysis.


I downloaded the Greengenes (full length) database from this page:

https://docs.qiime2.org/2019.1/data-resources/

It worked; however, when I immediately ran the same command with the SILVA file, it did not work. I am interested in working with the SILVA database because it is more up to date.

What is your idea?
How can I solve the error?
What is the origin of the error?

thanks

Hi @Mehrdad,
In order to troubleshoot any error message it is almost always necessary to see the full error message. Could you rerun your command with the --verbose flag and copy-paste the full report, please?


Hi Mehrbod,
Thanks for the reply!

This is the whole error. There you go!

Copy-paste of the whole report:

qiime feature-classifier classify-sklearn --i-classifier silva-132-99-nb-classifier.qza --i-reads RepresenDenoisedLibA.qza --o-classification RepresenDenoisedLibA.qza --verbose
Traceback (most recent call last):
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/q2cli/commands.py", line 274, in __call__
    results = action(**arguments)
  File "</home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/decorator.py:decorator-gen-338>", line 2, in classify_sklearn
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/qiime2/sdk/action.py", line 231, in bound_callable
    output_types, provenance)
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/qiime2/sdk/action.py", line 365, in _callable_executor_
    output_views = self._callable(**view_args)
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/q2_feature_classifier/classifier.py", line 212, in classify_sklearn
    reads, classifier, read_orientation=read_orientation)
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/q2_feature_classifier/classifier.py", line 169, in _autodetect_orientation
    result = list(zip(*predict(first_n_reads, classifier, confidence=0.)))
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/q2_feature_classifier/_skl.py", line 45, in predict
    for chunk in _chunks(reads, chunk_size)) for m in c)
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/sklearn/externals/joblib/parallel.py", line 917, in __call__
    if self.dispatch_one_batch(iterator):
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/sklearn/externals/joblib/parallel.py", line 759, in dispatch_one_batch
    self._dispatch(tasks)
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/sklearn/externals/joblib/parallel.py", line 716, in _dispatch
    job = self._backend.apply_async(batch, callback=cb)
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/sklearn/externals/joblib/_parallel_backends.py", line 182, in apply_async
    result = ImmediateResult(func)
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/sklearn/externals/joblib/_parallel_backends.py", line 549, in __init__
    self.results = batch()
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/sklearn/externals/joblib/parallel.py", line 225, in __call__
    for func, args, kwargs in self.items]
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/sklearn/externals/joblib/parallel.py", line 225, in <listcomp>
    for func, args, kwargs in self.items]
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/q2_feature_classifier/_skl.py", line 52, in _predict_chunk
    return _predict_chunk_with_conf(pipeline, separator, confidence, chunk)
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/q2_feature_classifier/_skl.py", line 66, in _predict_chunk_with_conf
    prob_pos = pipeline.predict_proba(X)
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/sklearn/utils/metaestimators.py", line 118, in <lambda>
    out = lambda *args, **kwargs: self.fn(obj, *args, **kwargs)
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/sklearn/pipeline.py", line 382, in predict_proba
    return self.steps[-1][-1].predict_proba(Xt)
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/sklearn/naive_bayes.py", line 104, in predict_proba
    return np.exp(self.predict_log_proba(X))
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/sklearn/naive_bayes.py", line 84, in predict_log_proba
    jll = self._joint_log_likelihood(X)
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/sklearn/naive_bayes.py", line 731, in _joint_log_likelihood
    return (safe_sparse_dot(X, self.feature_log_prob_.T) +
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/sklearn/utils/extmath.py", line 168, in safe_sparse_dot
    ret = a * b
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/scipy/sparse/base.py", line 473, in __mul__
    return self._mul_multivector(other)
  File "/home/mpi/miniconda3/envs/qiime2-2019.1/lib/python3.6/site-packages/scipy/sparse/compressed.py", line 482, in _mul_multivector
    other.ravel(), result.ravel())
MemoryError

Plugin error from feature-classifier:

See above for debug info.

Hi @Mehrdad,
Thanks! If I'm not mistaken, your MemoryError indicates that you have insufficient RAM for this task. This is pretty common with the SILVA database due to its size. You can browse around the forum to see what others have done, for example here. Ultimately you just need access to more RAM. If that is not an option for you, then you could try the Greengenes database, which is a lot less memory-intensive.
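
If you want to double-check how much RAM is actually free before re-running (assuming you are on a Linux machine, which your file paths suggest), a quick check is:

free -h    # prints total, used, and available memory in human-readable units

The "available" column is roughly the ceiling the classifier has to work with before a MemoryError is raised.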


Dear @Mehrbod_Estaki,

I read some of the discussions regarding MemoryError and modified the command, but I still get the error.

I also tested the --p-n-jobs parameter; it did not work and the computer froze completely.
What should I do to solve the memory error?
*My memory is 16 GiB.

Thanks

https://docs.qiime2.org/2019.1/plugins/available/feature-classifier/classify-sklearn/

By the way, these parameters are not present on the page above, although you have mentioned them in some discussions:

--p-chunk-size
--p-classify-chunk

Those are old names for the --p-reads-per-batch parameter, used in previous releases. Try setting it to 1000 and it may fix your problem. If it does not, and you have followed all the other tips given above, then you need a more powerful computer to run that command, and we cannot help you with that.
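
In other words, something along these lines, reusing the file names from your earlier posts (a sketch only; lower the batch size further if it still runs out of memory):

qiime feature-classifier classify-sklearn \
  --i-classifier silva-132-99-nb-classifier.qza \
  --i-reads RepresenDenoisedLibA.qza \
  --o-classification taxonomylibA.qza \
  --p-reads-per-batch 1000 \
  --verbose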


This topic was automatically closed 31 days after the last reply. New replies are no longer allowed.