Feature table Killed: 9 error

Trying to summarize my feature table, and it returned an error saying "Killed: 9". What does this mean, and how do I proceed from here?

Thanks!

Ari

Hey @Ariangela_Davis,

Killed: 9 (or any variation involving a 9, really) means the process was sent a SIGKILL signal (the operating system sees it as the number 9), which tells the operating system to stop anything to do with the process and destroy it immediately.
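If you want to see what that looks like from a shell, here is a small, generic illustration (nothing QIIME 2 specific; sleep is just a stand-in for a long-running process):

sleep 600 &        # start a throwaway long-running process in the background
kill -9 $!         # send it signal 9 (SIGKILL)
wait $!            # collect its exit status
echo $?            # prints 137, i.e. 128 + 9: "killed by signal 9"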

This usually happens when you are running in an HPC environment and you've hit the maximum allocated walltime (usually configurable; talk to your sysadmin for details). Alternatively, if you closed the terminal or did anything else that might have cancelled the process, it'll receive a SIGKILL as well.

Out-of-memory errors can also cause this (when you are lucky), but given the particular step you were running, that seems unlikely: Python is usually pretty good about letting you know when that's the case, and this visualizer shouldn't consume much memory either.


Hi Evan,

Thanks for your reply. I was running this locally. I have had no issues with this script until now; I have run several datasets before this without any problems. I tried it on two different Macs, a laptop and a desktop. I restarted the computer and even freed up some space, with no success. I tried versions 2018.6 and 2018.8 and got the same thing. What should I do next to try and fix this?

Thanks!

That is strange!

How long does it take to fail? Something you could try is running it while watching your computer's resources with a program like htop, to see whether you run out of memory as it runs.

Are you providing metadata to the visualizer? If so, does it still fail when you omit that parameter?
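For example (file names here are placeholders; swap in your own), you could run the summarize step without the metadata parameter and let macOS's BSD time report peak memory use:

/usr/bin/time -l qiime feature-table summarize \
  --i-table table.qza \
  --o-visualization table.qzv
# "maximum resident set size" in the output is the peak memory used (bytes on macOS);
# on Linux, /usr/bin/time -v reports the equivalent figure in kilobytes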

It usually fails within ten minutes. I had been providing metadata (albeit a huge file because the dataset is large), but I just tried to run it again without the metadata. It has been longer than ten minutes and it is still running, so we shall see what happens with that.

I wonder if it has something to do with the metadata, then; it's possible we are running out of memory if the file is large enough.

What is the size of your metadata? Can you run qiime metadata tabulate on it without issue?
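Something along these lines, substituting your own file name (metadata.tsv is just a placeholder):

qiime metadata tabulate \
  --m-input-file metadata.tsv \
  --o-visualization metadata-check.qzv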

The command completed normally after I trimmed down the metadata file. It was originally 6MB. I removed a substantial amount of data from the file and it worked. What is the size limit for metadata so that I can keep this from happening in the future?

Thanks!

Hey @Ariangela_Davis,

I don't really have an answer for that; I was hoping we could work that out!

In principle, 6 MB shouldn't be even close to a show-stopper. Would it be possible for you to provide the header lines of your metadata file (i.e., no rows with sample information)?
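For example, assuming a tab-separated file named metadata.tsv (adjust the name to match yours), the first couple of lines are all we'd need:

head -n 2 metadata.tsv   # column headers, plus the #q2:types directive row if your file has one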

Also, does the following work:

python -c \
  "import sys; import qiime2; qiime2.Metadata.load(sys.argv[1])" \
  <YOUR METADATA FILE HERE>

This will load only the metadata object; if we're lucky it crashes right away, otherwise something more complicated is happening.


Alternatively, if you are able to share your data via DM or similar, I could take a look and try to profile the method to see what's going on.

Thanks!
