Artifact API Requirements


I was looking to extend the Artifact and Plugin APIs for a project I am working on, so I grabbed the qiime/core module from your GitHub page. However, when I go to run a very basic “Hello World” script, I am bombarded with ModuleNotFoundErrors. I realized that the GitHub page does not list the required dependencies for the core module, and I will probably have to scrape through a previous CLI install to find most of them. Could you please list the requirements for the core module somewhere on the GitHub page or website?


Hey there @Matt_Kizaric1! What are you working on - it sounds like maybe you are working on using the Artifact API for analysis, or maybe writing a new plugin? The easiest way to get a development environment bootstrapped is to follow our quickstart guide.

Each individual code repo on our GitHub org has a meta.yaml file in it, used for building conda packages - this file lists the dependencies (e.g., the QIIME 2 framework).
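For reference, the requirements section of a meta.yaml looks roughly like this - the package names and versions below are purely illustrative placeholders, not the actual dependency list of any QIIME 2 repo (always check the real file in the repo you care about):

```yaml
package:
  name: q2-example-plugin   # placeholder name
  version: 2018.6.0

requirements:
  run:
    - python >=3.5
    - qiime2 >=2018.6       # the QIIME 2 framework itself
    - pandas                # illustrative extra dependency
```

Installing the conda package (rather than cloning the repo directly) pulls these dependencies in automatically, which is why the quickstart route avoids the ModuleNotFoundErrors you hit.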

Hope that helps! :qiime2: :t_rex:

I was working on developing a Docker wrapper for the CLI, since the environment does not persist when running in Docker. However, I decided to take a slightly different approach and build a REST API to send commands. I was wondering if the Docker image was intended not to persist and to be controlled with a single shell script. If so, I think it might be a good idea to have a distribution made for servers that allows real-time control and execution of commands from outside the environment. Just some thoughts, but thank you for the info!

Hey hey @Matt_Kizaric1!

This is basically how Docker itself is designed to operate - disposable containers don’t persist data unless you ask them to! It sounds like maybe you aren’t mounting a volume while running the Docker container - that will certainly make things difficult in terms of persistence. You can also run things in interactive and TTY mode - at that point it’s basically like you are running the command natively (check out our simple example in the docs):

$ ls 
demux.qza # This is local, on my host laptop
# Volume mounting with -v, run QIIME 2 in a docker container
$ docker run -t -i -v $(pwd):/data qiime2/core:2018.6 qiime demux summarize --i-data demux.qza --o-visualization demux.qzv
$ ls 
demux.qza demux.qzv # still local!
# TTY/interactive mode with -t -i
$ docker run -t -i -v $(pwd):/data qiime2/core:2018.6 bash
$ # you are now inside the container

All of this and much more can be found in the docker run docs, in case you want to see more!

Now, with that said, maybe I misunderstood your question, or you are just generally curious - the Dockerfile for this image can be found here. One of the great things about Docker is that you can layer your own functionality onto our image using your own Dockerfile - no need to let us get in your way! Anyway, have fun and keep us posted! :qiime2:
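As a sketch of what layering on top of the image could look like - the package installed here is just a placeholder for whatever tooling you want to add, and the WORKDIR line is an assumption based on the /data mount convention used above:

```dockerfile
# Hypothetical downstream Dockerfile building on the official image
FROM qiime2/core:2018.6

# Add your own tooling on top (placeholder package, swap in whatever you need)
RUN pip install flask

# Match the /data mount-point convention from the volume-mounting example
WORKDIR /data
```

Then `docker build -t my-qiime2 .` gives you an image with both QIIME 2 and your extras baked in.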

Sorry, I should have been more careful with my wording - I meant that the container stops and closes the environment without running any commands, unless you give it a shell script to execute.

I guess the -i option in docker does help with this, but if you want other services to run on top of it (database managers, job status updates, automatic file transfer, etc.) there isn't a great way to hook into the Docker image unless you are constantly starting and stopping the container. This might be a bit outside of QIIME 2's scope, but I think making something with a simple non-CLI interface to the Docker image (HTTP, process interrupt, event, etc.) could really bring QIIME to another level.

It sounds like the docker run behavior you are looking for is called “detached mode” (specified with the -d flag):

$ docker run -dti qiime2/core bash 
$ docker ps
CONTAINER ID        IMAGE               COMMAND                  CREATED             STATUS              PORTS               NAMES
58919f3f68c4        qiime2/core         "/usr/bin/tini -- ba…"   2 minutes ago       Up 2 minutes                            frosty_bell
$ docker exec -ti frosty_bell qiime info
System versions
Python version: 3.5.5
QIIME 2 release: 2018.6
QIIME 2 version: 2018.6.0
q2cli version: 2018.6.0

Installed plugins
alignment: 2018.6.0
composition: 2018.6.0
cutadapt: 2018.6.0
dada2: 2018.6.0
deblur: 2018.6.0
demux: 2018.6.0
diversity: 2018.6.0
emperor: 2018.6.0
feature-classifier: 2018.6.0
feature-table: 2018.6.0
gneiss: 2018.6.0
longitudinal: 2018.6.0
metadata: 2018.6.0
phylogeny: 2018.6.0
quality-control: 2018.6.1
quality-filter: 2018.6.0
sample-classifier: 2018.6.0
taxa: 2018.6.0
types: 2018.6.0
vsearch: 2018.6.0

Application config directory

Getting help
To get help with QIIME 2, visit
$ docker exec -ti frosty_bell qiime demux summarize --help
Usage: qiime demux summarize [OPTIONS]

  Summarize counts per sample for all samples,
  and generate interactive positional quality
  plots based on `n` randomly selected

  --i-data ARTIFACT PATH SampleData[JoinedSequencesWithQuality | PairedEndSequencesWithQuality | SequencesWithQuality]

Hope that helps! :qiime2:

Oh, that does make sense. It’s not super applicable to my project, however this is going to save me a lot of time debugging. Thank you for your help! :slight_smile:

Depending on what you’re after here, REST might be an awkward model for this — RPC might be a better fit (or even just a generic HTTP API). We have a bit of a prototype HTTP API backing q2studio — you might want to take a quick spin through there to see if there is anything you can use.
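If it helps to see the shape of the idea, here is a minimal, purely illustrative sketch of that kind of generic HTTP wrapper around a CLI tool. To be clear, nothing in it comes from QIIME 2 or q2studio: the /jobs route, the JSON payload shape, and the command whitelist are all assumptions made up for this example.

```python
# Hypothetical sketch: a thin HTTP wrapper that runs whitelisted CLI commands.
# The route, payload shape, and whitelist are illustrative assumptions only.
import json
import shlex
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

# Whitelist of executables the wrapper is allowed to run (placeholder set)
ALLOWED = {"echo", "qiime"}

class JobHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/jobs":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        argv = shlex.split(payload.get("command", ""))
        if not argv or argv[0] not in ALLOWED:
            self.send_error(400, "command not allowed")
            return
        # Run the command synchronously; a real service would queue the job
        # and expose its status via a separate GET endpoint.
        result = subprocess.run(argv, capture_output=True, text=True)
        body = json.dumps({"returncode": result.returncode,
                           "stdout": result.stdout}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Keep the demo quiet instead of logging every request to stderr
        pass

# To serve (blocks forever):
#   HTTPServer(("127.0.0.1", 8000), JobHandler).serve_forever()
```

A wrapper like this could run as a long-lived service inside (or alongside) the container, so other services can submit jobs without repeatedly starting and stopping containers - but again, this is just a sketch of the pattern, not a supported interface.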

Keep us in the loop, we would love to know more about your project! :t_rex: :qiime2:

Yeah, I’ll check through the code for Studio - that might actually help a bit with my project. I’d have to do more research into how RPC interacts with Docker containers, since I plan to run a few other services for data management that I don’t want to be QIIME-specific. When I say REST I really mean a simple, standardized HTTP API - I don’t plan for it to be too extensive, but I’d like a structured way to query the various QIIME jobs I’m running, as well as PUT and POST the commands/info for data analysis. Thanks again for the help - unfortunately this is for a private company, so I cannot give too many details on the project, but I might have a demo post in the future when I’m done. It’s in its early stages, so don’t expect anything too soon.
