DNAnexus Documentation
Copyright 2025 DNAnexus
Developer Quickstart

Learn to build an app that you can run on the Platform.

Last updated 25 days ago


This tutorial provides a quick intro to the DNAnexus developer experience, and progresses to building a fully functional, useful app on the Platform. For a more in-depth discussion of the Platform, take a look at Intro to Building Apps.

The steps below require the DNAnexus SDK. You must download and install it if you have not done so already.

In addition to this Quickstart, there are Developer Tutorials located in the sidebar that go over helpful tips for new users as well. A few of them include:

  • Distributed by Chr (sh)
  • Parallel by Chr (py)
  • R Shiny Example Web App

Step 1. Build a Simple App

Every DNAnexus app starts with 2 files:

  • dxapp.json: a file containing the app's metadata: its inputs and outputs, how the app will be run, etc.

  • a script that will be executed in the cloud when the app is run

Let's start by creating a file called dxapp.json with the following text:

{ "name": "coolapp",
  "runSpec": {
    "distribution": "Ubuntu",
    "release": "24.04",
    "version": "0",
    "interpreter": "python3",
    "file": "code.py"
  }
}

Above, we've specified the name for our app (coolapp), the type of interpreter (python3) to run our script with, and a path (code.py) to the script that we will create next. "version": "0" refers to the version of the Ubuntu 24.04 application execution environment that supports the python3 interpreter.
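Before running dx build, it can save a failed build to confirm that the metadata file is valid JSON. This is a minimal local sanity check using only Python's standard library; it is just a sketch, not a dx tool:

```python
import json

# The dxapp.json from above, pasted in as a string for checking
DXAPP = """
{ "name": "coolapp",
  "runSpec": {
    "distribution": "Ubuntu",
    "release": "24.04",
    "version": "0",
    "interpreter": "python3",
    "file": "code.py"
  }
}
"""

meta = json.loads(DXAPP)  # raises ValueError on malformed JSON
# The interpreter and script path must agree with the code file you create next
assert meta["name"] == "coolapp"
assert meta["runSpec"]["interpreter"] == "python3"
assert meta["runSpec"]["file"] == "code.py"
```

A stray comma or unquoted key is by far the most common first-build error, and json.loads will point at the exact offset.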

Next, we create our script in a file called code.py with the following text:

import dxpy

@dxpy.entry_point('main')
def main(**kwargs):
    print("Hello, DNAnexus!")
    return {}

That's all we need. To build the app, first log in to DNAnexus and start a project with dx login. In the directory with the two files above, run:

$ dx login
 ...
$ dx build -a

Now, run the app and watch the output:

$ dx run coolapp --watch

The app is now available in the DNAnexus web interface, as part of the project that you started. It can be configured and run in the Workflow Builder, or shared with other users by sharing the project.

That's it! You have just made and run your first DNAnexus applet. Applets are lightweight apps that live in your project, and are not visible in the App Library. When you typed dx run, the app ran on its own Linux instance in the cloud. You have exclusive, secure access to the CPU, storage, and memory on the instance. The DNAnexus API lets your app read and write data on the Platform, as well as launch other apps.

Step 2. Run BLAST

Next, we'll make our app do something a bit more interesting: take in two files with FASTA-formatted DNA, run the BLAST tool to compare them, and output the result.

In the cloud, your app will run on Ubuntu Linux 24.04, where BLAST is available as an APT package, ncbi-blast+. You can request that the DNAnexus execution environment install it before your script is run by listing ncbi-blast+ in the execDepends field of your dxapp.json like this:

{ "name": "coolapp",
  "runSpec": {
    "distribution": "Ubuntu",
    "release": "24.04",
    "version": "0",
    "interpreter": "python3",
    "file": "code.py",
    "execDepends": [ {"name": "ncbi-blast+"} ]
  }
}

Next, let's update code.py to run BLAST:

import dxpy, subprocess

@dxpy.entry_point('main')
def main(seq1, seq2):
    dxpy.download_dxfile(seq1, "seq1.fasta")
    dxpy.download_dxfile(seq2, "seq2.fasta")

    subprocess.call("blastn -query seq1.fasta -subject seq2.fasta > report.txt", shell=True)

    report = dxpy.upload_local_file("report.txt")
    return {"blast_result": report}

We're now ready to rebuild the app and test it on some real data. You can use the demo inputs available in the Demo Data project, or you can upload your own data with dx upload or via the website. If you use the Demo Data inputs, make sure the project you are running your app in is in the same region as the Demo Data project.

Rebuild the app with dx build -a, and run it like this:

$ dx run coolapp -i seq1="Demo Data:/Developer Quickstart/NC_000868.fasta" -i seq2="Demo Data:/Developer Quickstart/NC_001422.fasta" --watch

Once the job is done, you can examine the output with dx head report.txt, download it with dx download, or view it on the website.

Step 3. Provide an Input/Output Spec

Workflows are a powerful way to visually connect, configure, and run multiple apps in pipelines. To add our app to a workflow and be able to connect its inputs and/or outputs to other apps, our app will need both input and output specifications. Let's update our dxapp.json as follows:

{
  "name": "coolapp",
  "runSpec": {
    "distribution": "Ubuntu",
    "release": "24.04",
    "version": "0",
    "interpreter": "python3",
    "file": "code.py",
    "execDepends": [ {"name": "ncbi-blast+"} ]
  },
  "inputSpec": [
    {"name": "seq1", "class": "file"},
    {"name": "seq2", "class": "file"}
  ],
  "outputSpec": [
    {"name": "blast_result", "class": "file"}
  ]
}

Rebuild the app with dx build -a. You can run it in the same way as before, but now we can add the applet to a workflow. Click "New Workflow" while looking at your project on the website, and click on coolapp once to add it to the workflow. You'll see inputs and outputs appear on the workflow stage which can be connected to other stages in the workflow.

Also, if you now go back to the command line and run dx run coolapp with no input arguments, it will prompt you for the input values for seq1 and seq2.
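Each name in inputSpec becomes a keyword argument to the main entry point, which is why dx run knows to prompt for seq1 and seq2. A small local sketch of that naming contract (illustrative only; the real inputs arrive from the Platform at job time):

```python
import inspect

# inputSpec entries as they appear in dxapp.json
INPUT_SPEC = [
    {"name": "seq1", "class": "file"},
    {"name": "seq2", "class": "file"},
]

def main(seq1, seq2):
    """Stand-in for the entry point; the real code downloads files and runs BLAST."""
    return {}

# Every spec name must match a parameter of main, and vice versa
spec_names = {field["name"] for field in INPUT_SPEC}
param_names = set(inspect.signature(main).parameters)
assert spec_names == param_names
```

Renaming an input in one place but not the other is a common source of "unexpected keyword argument" job failures.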

Step 4. Configure App Settings

In addition to specifying input files, the I/O specification can also be used to configure settings that we want the app to use. For example, we can configure the E-value setting and other BLAST settings with this code and dxapp.json:

code.py

import dxpy, subprocess

@dxpy.entry_point('main')
def main(seq1, seq2, evalue, blast_args):
    dxpy.download_dxfile(seq1, "seq1.fasta")
    dxpy.download_dxfile(seq2, "seq2.fasta")

    command = "blastn -query seq1.fasta -subject seq2.fasta -evalue {e} {args} > report.txt".format(e=evalue, args=blast_args)
    subprocess.call(command, shell=True)

    report = dxpy.upload_local_file("report.txt")
    return {"blast_result": report}

dxapp.json

{
  "name": "coolapp",
  "runSpec": {
    "distribution": "Ubuntu",
    "release": "24.04",
    "version": "0",
    "interpreter": "python3",
    "file": "code.py",
    "execDepends": [ {"name": "ncbi-blast+"} ]
  },
  "inputSpec": [
    {"name": "seq1", "class": "file"},
    {"name": "seq2", "class": "file"},
    {"name": "evalue", "class": "float", "default": 0.01},
    {"name": "blast_args", "class": "string", "default": ""}
  ],
  "outputSpec": [
    {"name": "blast_result", "class": "file"}
  ]
}
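Because code.py assembles the blastn command as a single shell string, anything in blast_args reaches the shell verbatim. If you would rather avoid shell=True, you can build an argument list instead; this is a hedged sketch (build_blast_command is a hypothetical helper, not part of the tutorial):

```python
import shlex

def build_blast_command(evalue, blast_args):
    """Build an argv list so subprocess.run can be used without shell=True."""
    extra = shlex.split(blast_args)  # split user-supplied flags on shell rules
    return ["blastn", "-query", "seq1.fasta", "-subject", "seq2.fasta",
            "-evalue", str(evalue)] + extra

# Example use (stdout replaces the `> report.txt` redirection):
# with open("report.txt", "w") as out:
#     subprocess.run(build_blast_command(0.01, "-word_size 11"),
#                    stdout=out, check=True)
```

With an argv list, a stray semicolon or quote in blast_args is passed to blastn as an ordinary argument rather than interpreted by the shell.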

Rebuild the app again and add it in the workflow builder. You should now see the evalue and blast_args settings available when you click the gear button on the stage. After building and configuring a workflow, you can run the workflow itself with dx run workflowname.

Step 5. Use SDK Tools

One of the utilities provided in the SDK is dx-app-wizard. This tool asks you a series of questions, then creates the basic files needed for a new app. It also gives you the option of writing your app as a bash shell script instead of Python. Just run dx-app-wizard to try it out.

Learn More

For additional information and examples of how to run jobs using the CLI, Chapter 5 of this reference guide may be useful. Note that this material is not a part of the official DNAnexus documentation and is for reference only.

