Fantastic! Thank you. I will check out qiime2R; it seems super handy.
I ran the collapse function command like you suggested and ended up with a .biom file, which I converted to a .txt with QIIME 1's biom convert. I got this, which looks identical to the .csv file from the taxa function:
Constructed from biom file
#OTU ID T1A T1B T1C
D_0__Archaea;D_1__Asgardaeota;D_2__Heimdallarchaeia;D_3__uncultured archaeon;D_4__uncultured archaeon;D_5__uncultured archaeon;D_6__uncultured archaeon 0 1 0
D_0__Archaea;D_1__Asgardaeota;D_2__Lokiarchaeia;D_3__uncultured archaeon;D_4__uncultured archaeon;D_5__uncultured archaeon;D_6__uncultured archaeon 6 1 2
D_0__Archaea;D_1__Asgardaeota;D_2__Lokiarchaeia;D_3__uncultured crenarchaeote;D_4__uncultured crenarchaeote;D_5__uncultured crenarchaeote;D_6__uncultured crenarchaeote 1 0 0
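In case it helps anyone reading this later, here is a minimal sketch of how I read the biom-converted, tab-separated table back into Python (the in-memory table and function name are just for illustration; a real file handle works the same way):

```python
import csv
import io

# Hypothetical in-memory copy of the biom-converted TSV shown above
# (taxonomy strings shortened for readability).
TABLE_TXT = (
    "# Constructed from biom file\n"
    "#OTU ID\tT1A\tT1B\tT1C\n"
    "D_0__Archaea;D_1__Asgardaeota;D_2__Lokiarchaeia\t6\t1\t2\n"
    "D_0__Archaea;D_1__Asgardaeota;D_2__Heimdallarchaeia\t0\t1\t0\n"
)

def read_counts(handle):
    """Parse a biom-convert TSV into {taxon: {sample: count}}."""
    reader = csv.reader(handle, delimiter="\t")
    header = None
    counts = {}
    for row in reader:
        if not row:
            continue
        if row[0].startswith("#OTU ID"):
            header = row[1:]      # sample IDs (T1A, T1B, T1C)
        elif row[0].startswith("#"):
            continue              # 'Constructed from biom file' comment
        else:
            counts[row[0]] = dict(zip(header, map(int, row[1:])))
    return counts

counts = read_counts(io.StringIO(TABLE_TXT))
```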
Is this then the raw counts table to use as input?
And according to this link: Statistical methods using ANCOM
ANCOM isn’t being supported anymore in favor of gneiss.
Also, I have been using gneiss in QIIME 2 with my data, but it keeps giving me a "Detected zero variance balances - double check your table for unobserved features" error.
According to this link: Gneiss zero balance error
It's probably because I have a lot of singletons and doubletons in my data, so using QIIME 1, I ran the following command:
filter_otus_from_otu_table.py -i feature-table.biom -o feature-table.N3.biom -n 3 (removing features that are observed less than 3 times; so singletons and doubletons)
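As I understand it, the -n 3 criterion keeps a feature only if its total observation count across all samples is at least 3. A quick sketch of that logic (the example data are made up):

```python
# Sketch of the -n filtering criterion as I understand it: retain a
# feature only if its total count across samples is >= min_count.
def filter_features(counts, min_count):
    """counts: {feature: [per-sample counts]} -> filtered dict."""
    return {feat: vals for feat, vals in counts.items()
            if sum(vals) >= min_count}

# Made-up example: min_count=3 drops singletons and doubletons.
demo = {
    "lokiarchaeia":  [6, 1, 2],  # total 9 -> kept
    "heimdallarch":  [0, 1, 0],  # total 1 -> dropped (singleton)
    "crenarchaeote": [1, 1, 0],  # total 2 -> dropped (doubleton)
}
kept = filter_features(demo, 3)
```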
Then I imported this filtered table back into QIIME 2 and re-ran the gneiss commands, only to get the same error. So I kept filtering with -n 5, then -n 10, with each step getting rid of an exorbitant number of features. By -n 10, I only had 109 features left out of 23,369 and was still getting the zero variance balances error; so I think DESeq2 normalization would be the better choice.
I just wanted to make sure I’m using the correct input.
Many thanks for your reply!! It was very helpful.