Hello everyone,
I ran the QIIME 2 pipeline to get the rep-seqs using the DADA2 denoise step. Then I exported the rep-seqs as a .fasta file and used the assignTaxonomy function from the dada2 R package.
Now I want to import the taxonomy file back into QIIME 2 as a .biom or .qza file, but I am not finding much information.
rdp_taxa.txt (1.2 KB)
When I import the taxonomy file into QIIME 2 and calculate the abundance and relative abundance, the number of rows/taxonomy lineages in the abundance file is greatly reduced compared to the taxonomy file. What could be the actual reason behind this?
I have 6250 ASVs in the taxonomy file. When I imported it into the QIIME 2 pipeline and mapped it against the table.qza file to calculate the abundance, I got only around 194 rows/lineages in the abundance file.
What could be the reason? How is this calculated?
Below are the steps:
qiime tools import --type 'FeatureData[Taxonomy]' --input-path raw_taxonomy_imprt.txt --output-path raw_taxonomy_import.qza # .txt file has around 6252 feature-id/ASVs
qiime taxa collapse --i-table T2D_raw_table.qza --i-taxonomy raw_taxonomy_import.qza --p-level 7 --o-collapsed-table taxa_table-l7.qza
qiime tools extract --input-path taxa_table-l7.qza # mapped with table.qza to calculate abundance
qiime tools export --input-path taxa_table-l7.qza --output-path raw_taxa_abundance
biom convert -i feature-table.biom -o feature-table.tsv --to-tsv # converting the abundance file to TSV; only getting 194 taxonomy lineages
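A quick sanity check you can run outside QIIME 2 is to count how many *distinct* lineages your taxonomy file contains, since `qiime taxa collapse` produces one row per distinct lineage, not one per ASV. A minimal sketch with a made-up two-column file (`demo_taxa.tsv` and its contents are hypothetical, standing in for your taxonomy export):

```shell
# Hypothetical taxonomy file: feature-id <tab> lineage (header omitted for simplicity)
printf 'ASV1\tk__Bacteria;p__Firmicutes\nASV2\tk__Bacteria;p__Firmicutes\nASV3\tk__Bacteria;p__Bacteroidota\n' > demo_taxa.tsv

# Rows = ASVs
wc -l < demo_taxa.tsv                      # 3 ASVs

# Distinct lineages = rows you would see after collapsing
cut -f2 demo_taxa.tsv | sort -u | wc -l    # 2 lineages
```

If the second number for your real file is close to 194, the collapse is behaving as expected.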
Hi! You are comparing two different sets of data.
The first one is an ASV table.
The second one is a table collapsed to the taxonomy.
Each ASV in your first table is assigned to only one taxonomy, but each taxonomy can be associated with more than one ASV.
For example, you may have thousands of ASVs assigned to mitochondria. If you filter mitochondria from your tables, in the taxonomy-collapsed (second) file you will remove just one row, but in the ASV table (first) you will filter out thousands of ASVs.
Hope I understood your question and answered it!
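To make the many-to-one collapse concrete, here is a toy sketch of what `qiime taxa collapse` does numerically (the file names, ASV IDs, and taxa are all made up): counts of every ASV sharing a lineage are summed into a single row.

```shell
# Toy ASV table: feature-id <tab> count
printf 'ASV1\t10\nASV2\t5\nASV3\t7\n' > demo_table.tsv

# Toy taxonomy: ASV1 and ASV2 map to the same lineage
printf 'ASV1\tTaxonA\nASV2\tTaxonA\nASV3\tTaxonB\n' > demo_tax.tsv

# Join on feature-id, then sum counts per lineage:
# 3 ASV rows collapse to 2 lineage rows (TaxonA 15, TaxonB 7)
join demo_table.tsv demo_tax.tsv \
  | awk '{sum[$3]+=$2} END {for (t in sum) print t, sum[t]}' \
  | sort
```

So 6250 ASVs collapsing to ~194 lineages simply means that, on average, many ASVs share the same level-7 taxonomy string.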