I'm having the same issue. When I run feature-classify, it says it is unable to allocate 3.5 GiB, but my VirtualBox machine should have 5 GB+ of free memory. I have attached screenshots below. Can you help me figure out what the problem is? Thank you so much!
Hi! I am guessing that the machine has already used the available memory and is now asking for 3.5 GB of additional memory, not 3.5 GB in total. So you still need to increase the memory for your VM.
Hi, thanks for the quick reply! I ran the same analysis on two computers (the other one has about half the memory) and both said they needed 3.5 GB. Could it be that the VM is not using the memory I assigned to it?
Hi, last time I assigned 8556 MB and it asked for 3.5 GB. I ran it again with 10985 MB assigned to the VM. It still returned an error, but with a different message. Please see below:
Ok, I’m reasonably convinced (particularly from that free -m you ran earlier) that you have successfully allocated the amount of memory you think you have to your VM. I only asked for clarification because we get a lot of out of memory issues with VMs where people think the VM has X amount of memory available (usually the amount of RAM they have) when really they’ve only given it access to say 2gigs.
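For reference, the free -m check mentioned above looks like this on a Linux guest (the numbers shown are whatever your VM reports, not specific values from this thread):

```shell
# Show memory usage in megabytes. The "available" column is roughly
# how much new processes can claim before the system starts swapping,
# which is the number that matters for classify-sklearn.
free -m
```

If the "total" column in the Mem row is far below what you assigned in VirtualBox, the allocation didn't take effect.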
That being said, you are definitely running out of memory available to your VM. The amount of memory it takes to run a classifier can vary significantly, but three samples with around 35 total sequences doesn’t sound like a large amount, so I’m not sure why it would be taking up so much memory.
Would you be able to PM me your data so I can attempt to replicate the command on my machine? Your data is probably just hogging a very large amount of RAM in which case you’ll need to allocate even more to your VM (if you have it available), but it can’t hurt to check and make sure nothing strange is happening.
Additionally, where did you get your classifier from? I can see it’s a SILVA classifier, but which one in particular? That may also have something to do with your high memory usage.
In the meantime, make sure your VM isn’t running anything in the background that is using a large amount of memory.
Try running htop in one terminal on your VM. You should get something that looks like this:
It will tell you what is using memory and how much. Maybe try running classify-sklearn in a separate terminal while you have htop open and watch how much memory is used.
EDIT: You'll probably want to sort processes by memory usage in htop. You can do this by pressing F6, then using the arrow keys to select "PERCENT_MEM", then pressing Enter.
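If htop isn't installed in the VM, a rough one-shot equivalent (assuming a standard Linux guest with the usual procps tools) is to list the top memory consumers with ps:

```shell
# List the 10 processes using the most memory, highest first.
# %MEM is each process's share of physical RAM; head -n 11 keeps
# the header row plus the top 10 entries.
ps aux --sort=-%mem | head -n 11
```

Run this while classify-sklearn is going and you'll see how much of the VM's RAM it is actually taking.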
Ok, I was able to successfully run your command and it peaked at ~10.6 gigs of memory used. It seems like you just need to make more memory available to your VM. How much RAM does the machine you’re hosting the VM on have?
EDIT: I was able to successfully run your command on a VM with 12288 MB (12 gigs) of RAM allocated and no other windows open in the VM.
My computer also has 16 GB, and I was able to assign 12 no problem. Give it a shot and let me know if it works out. Keep other open windows on your computer to a minimum, though.
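If you prefer the command line over the VirtualBox GUI, the allocation can also be changed with VBoxManage while the VM is powered off (the VM name below is a placeholder; substitute your own):

```shell
# List the VMs registered on the host to find the exact name.
VBoxManage list vms

# Set the VM's base memory to 12 GB (12288 MB).
# "My QIIME VM" is a placeholder name, not from this thread.
VBoxManage modifyvm "My QIIME VM" --memory 12288
```

This is a configuration change on the host machine, so it takes effect the next time the VM boots.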