Filter Feature Table

Hi

I wanted to ask for your opinion about my approach for filtering out low-abundance ASVs before continuing with downstream analysis:

  1. I rarefied my feature table at a sequencing depth of 50,000 reads

  2. I applied a threshold of 10^-4 relative abundance to remove low-abundance ASVs. At a depth of 50,000 reads this corresponds to 5 reads, so I used “qiime feature-table filter-features” to discard ASVs with fewer than 5 reads.

  3. The filtered feature table now again has uneven sequencing depths across samples, since it appears that some samples contained more “noise”.
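The arithmetic behind the 5-read cutoff in step 2 can be sketched in a couple of lines (assuming the 50,000-read depth from step 1):

```python
# Convert a relative-abundance threshold into an absolute read-count
# cutoff at a fixed (rarefied) sequencing depth.
sampling_depth = 50_000          # reads per sample after rarefying (step 1)
rel_abundance_threshold = 1e-4   # relative-abundance cutoff (step 2)

# Round to the nearest whole read count; this is the value passed to
# filter-features as the minimum frequency.
min_frequency = round(sampling_depth * rel_abundance_threshold)
print(min_frequency)  # 5
```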

–> Should I rarefy again to the sequencing depth of the sample with the lowest depth after noise removal? Is this a valid approach in your opinion?

–> Some people remove low-abundance ASVs based on their relative abundance before rarefaction. This plugin is not available in QIIME 2 (as far as I know). I am also not sure whether removing low-abundance reads before rarefaction changes the proportions of the remaining ASVs more than doing it afterwards.

Thank you


Hi there @cla!

It depends on what you are doing downstream. If you are running a particular metric or method that is sensitive to sampling depth, then yes, you will want to ensure the table is rarefied.

Correct - the filtering methods we currently support all operate on FeatureTable[Frequency], not FeatureTable[RelativeFrequency].
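If you are ever unsure which semantic type a table artifact has, you can check it (the artifact name here is a placeholder):

```shell
# Show the UUID, semantic type, and format of an artifact;
# filtering actions expect "FeatureTable[Frequency]" here.
qiime tools peek table.qza
```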

Hope that helps! :t_rex: :qiime2:


Thanks for your reply.

Is this the standard filtration approach or do you suggest different steps?


Sounds like you are following a QIIME 1-style OTU filtering protocol.

I have some changes to recommend.

If you used DADA2 or Deblur to denoise your sequences, then abundance-based filtering should not be necessary (though it does not hurt, either, if you want to remove low-abundance features that slow down downstream analysis steps).

Do not rarefy your table unless you have good reason to do so. In QIIME 2, the necessary normalization steps are largely built in to the relevant actions; e.g., rarefying is built in to the diversity core-metrics pipeline prior to alpha and beta diversity analyses. Other downstream steps like gneiss and ANCOM have their own normalization built in, and rarefying should NOT be performed prior to any type of differential abundance analysis.
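As a sketch of what that built-in rarefaction looks like in practice (artifact and metadata file names are placeholders; pick the sampling depth from your own table summary):

```shell
# core-metrics-phylogenetic rarefies the input table internally to
# --p-sampling-depth before computing alpha and beta diversity,
# so no separately rarefied table needs to be prepared.
qiime diversity core-metrics-phylogenetic \
  --i-phylogeny rooted-tree.qza \
  --i-table table.qza \
  --p-sampling-depth 50000 \
  --m-metadata-file sample-metadata.tsv \
  --output-dir core-metrics-results
```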

The point of rarefying — when it is performed — is to control for uneven sampling depth. So if you are using rarefying to normalize prior to, e.g., alpha diversity, then yes you should rarefy again! Do not rarefy, then filter, then rarefy again. Just filter however you need, then rarefy immediately prior to whatever analysis needs it. Don’t use that rarefied table for other analyses.
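That order of operations might look like this on the command line (artifact names and the sampling depth are placeholders; choose the depth from the filtered table's summary):

```shell
# Filter first, on the un-rarefied table...
qiime feature-table filter-features \
  --i-table table.qza \
  --p-min-frequency 5 \
  --o-filtered-table filtered-table.qza

# ...then rarefy immediately before the specific analysis that needs
# even sampling depth. Use this rarefied table only for that analysis.
qiime feature-table rarefy \
  --i-table filtered-table.qza \
  --p-sampling-depth 40000 \
  --o-rarefied-table rarefied-table.qza
```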

I hope that helps!


An off-topic reply has been split into a new topic: ANCOM, normalization, and multiple correction

Please keep replies on-topic in the future.
