Thanks for your advice!
But I've run into another problem.
I'm running the Naive Bayes SILVA 132 pre-trained classifier on my data, but a few minutes in it stops with
"Segmentation fault (core dumped)"
I read on the forum that this can be caused by running out of disk space, but I have 600 GB free. While it's running, the process grows to around 30 GB of memory and then it crashes. Is that normal?
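For context, I'm invoking the classifier roughly like this (the file names are placeholders, not my actual paths; the batch and jobs values are just what I'm experimenting with):

```shell
# Placeholder artifact names; substitute your own files.
qiime feature-classifier classify-sklearn \
  --i-classifier silva-132-99-nb-classifier.qza \
  --i-reads rep-seqs.qza \
  --o-classification taxonomy.qza \
  --p-n-jobs 1 \
  --p-reads-per-batch 1000
```

As far as I understand, lowering --p-reads-per-batch and keeping --p-n-jobs at 1 is supposed to reduce peak memory use, which might matter under WSL.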
EDIT: Based on another post I found, I think this is a problem related to WSL.
EDIT 2: It seems to be a problem with SILVA specifically; the Greengenes classifier finished successfully and quickly. Maybe it's the classifier's size?