Hello, @thermokarst.
The error message in this case is so long that it doesn't fit in my terminal's scrollback, even when I scroll all the way up. It's mostly one huge string of what appear to be feature IDs.
Anyway, I checked the error logs and now I think I know what the problem is. I had fallen back on the taxonomy artifact generated in the Moving Pictures (MP) tutorial, because running the following command:
qiime feature-classifier classify-sklearn \
  --i-classifier gg-13-8-99-515-806-nb-classifier.qza \
  --i-reads rep-seqs.qza \
  --o-classification taxonomy.qza
...against my own rep-seqs.qza artifact resulted in the following error:
Traceback (most recent call last):
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/q2cli/commands.py", line 224, in __call__
    results = action(**arguments)
  File "<decorator-gen-272>", line 2, in classify_sklearn
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/qiime2/sdk/action.py", line 228, in bound_callable
    output_types, provenance)
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/qiime2/sdk/action.py", line 363, in _callable_executor_
    output_views = self._callable(**view_args)
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/q2_feature_classifier/classifier.py", line 214, in classify_sklearn
    confidence=confidence)
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/q2_feature_classifier/_skl.py", line 45, in predict
    for chunk in _chunks(reads, chunk_size)) for m in c)
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/sklearn/externals/joblib/parallel.py", line 779, in __call__
    while self.dispatch_one_batch(iterator):
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/sklearn/externals/joblib/parallel.py", line 625, in dispatch_one_batch
    self._dispatch(tasks)
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/sklearn/externals/joblib/parallel.py", line 588, in _dispatch
    job = self._backend.apply_async(batch, callback=cb)
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/sklearn/externals/joblib/_parallel_backends.py", line 111, in apply_async
    result = ImmediateResult(func)
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/sklearn/externals/joblib/_parallel_backends.py", line 332, in __init__
    self.results = batch()
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/sklearn/externals/joblib/parallel.py", line 131, in __call__
    return [func(*args, **kwargs) for func, args, kwargs in self.items]
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/sklearn/externals/joblib/parallel.py", line 131, in <listcomp>
    return [func(*args, **kwargs) for func, args, kwargs in self.items]
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/q2_feature_classifier/_skl.py", line 52, in _predict_chunk
    return _predict_chunk_with_conf(pipeline, separator, confidence, chunk)
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/q2_feature_classifier/_skl.py", line 66, in _predict_chunk_with_conf
    prob_pos = pipeline.predict_proba(X)
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/sklearn/utils/metaestimators.py", line 115, in <lambda>
    out = lambda *args, **kwargs: self.fn(obj, *args, **kwargs)
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/sklearn/pipeline.py", line 357, in predict_proba
    return self.steps[-1][-1].predict_proba(Xt)
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/sklearn/naive_bayes.py", line 104, in predict_proba
    return np.exp(self.predict_log_proba(X))
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/sklearn/naive_bayes.py", line 84, in predict_log_proba
    jll = self._joint_log_likelihood(X)
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/sklearn/naive_bayes.py", line 725, in _joint_log_likelihood
    return (safe_sparse_dot(X, self.feature_log_prob_.T) +
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/sklearn/utils/extmath.py", line 135, in safe_sparse_dot
    ret = a * b
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/scipy/sparse/base.py", line 360, in __mul__
    return self._mul_multivector(other)
  File "/home/qiime2/miniconda/envs/qiime2-2017.12/lib/python3.5/site-packages/scipy/sparse/compressed.py", line 507, in _mul_multivector
    other.dtype.char))
MemoryError
So... it all seems to boil down to insufficient memory allocated to my VM. And no wonder I was getting nowhere filtering my table: the taxonomy file I was using doesn't actually match my feature table.
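For anyone hitting the same MemoryError: the classifier already tries to bound memory by scoring reads in chunks (that's the `_chunks(reads, chunk_size)` call in the traceback), but the dense probability matrix for even one chunk can exhaust a small VM. Besides allocating more RAM, it may help to request smaller chunks if your QIIME 2 version exposes a chunk/batch-size option (check `qiime feature-classifier classify-sklearn --help`). Just to illustrate the idea, here is a minimal sketch of the chunking trick, not the actual q2-feature-classifier code; `chunked_scores` and its arguments are hypothetical names:

```python
import numpy as np
from scipy import sparse

def chunked_scores(X, class_log_prob_T, chunk_size):
    """Compute X @ class_log_prob_T in row chunks.

    X is a sparse (n_reads x n_features) matrix, and the product with the
    dense (n_features x n_classes) matrix is dense. Working chunk by chunk
    keeps at most `chunk_size` dense result rows alive at once, instead of
    materializing all n_reads rows in a single allocation.
    """
    parts = []
    for start in range(0, X.shape[0], chunk_size):
        # sparse-slice @ dense -> small dense block
        parts.append(X[start:start + chunk_size] @ class_log_prob_T)
    return np.vstack(parts)

# Tiny demo: chunked result equals the all-at-once product.
X = sparse.random(10, 6, density=0.5, format="csr", random_state=0)
W = np.ones((6, 3))
assert np.allclose(X @ W, chunked_scores(X, W, chunk_size=4))
```

Smaller chunks trade a little speed for a lower peak memory footprint; in my case, though, the real fix is simply giving the VM more RAM.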
I don't know what I was thinking, silly me.
I'll fix the memory issue and then get back to it. Thanks a lot for the help and the patience!