Hi @SaraGajas,
Memory use depends entirely on the size of the reference database and on several parameters (n_jobs, reads_per_batch; search the forum for more details on how these impact memory use), so it can be difficult to predict.
SILVA full-length with 1 job running should be possible with ~10 GB of RAM, but your mileage may vary (other users have reported needing up to 32 GB RAM with similar classifiers, though again this depends on the other parameters as well). Search the forum for tips on reducing memory use, e.g., with the reads-per-batch parameter.
The number of samples does not really matter at all, because that is exactly what the reads-per-batch parameter handles: it feeds in a subset of the query sequences at a time to reduce memory load, so most of the memory demand comes from holding the classifier itself in memory. You could have 500 trillion samples or sequences and they would simply be fed in batches... in other words, your sample count will not impact memory use if you set reads-per-batch appropriately, though more sequences will of course mean longer runtimes.
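For reference, here is a minimal sketch of how those parameters might be set on the command line (the input/output file names are just placeholders, and the reads-per-batch value is illustrative, not a recommendation):

```
qiime feature-classifier classify-sklearn \
  --i-classifier silva-classifier.qza \      # pre-trained classifier (placeholder name)
  --i-reads rep-seqs.qza \                   # query sequences (placeholder name)
  --p-n-jobs 1 \                             # single job keeps one copy of the classifier in memory
  --p-reads-per-batch 1000 \                 # smaller batches of query sequences = lower peak memory
  --o-classification taxonomy.qza
```

Keeping n-jobs at 1 avoids multiplying the classifier's memory footprint across parallel workers, and lowering reads-per-batch trades a bit of runtime for a smaller memory peak.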
Good luck!