Bash Apps

Learn to write a basic Bash app.


If you have not already, you should install the DNAnexus SDK and walk through the Intro to Building Apps tutorial.

Bash apps are the simplest apps you can create on DNAnexus. Each consists of a shell script, written by the app developer, that runs on a virtual machine in the cloud and is responsible for downloading inputs, processing them, and uploading outputs.

Downloading and Using File Inputs

There are two ways to download inputs: one by one, or all at once.

To download all inputs at once (recommended), use the dx-download-all-inputs utility. Add "--parallel" to allow multiple downloads in parallel:

dx-download-all-inputs --parallel

Inputs are downloaded to a folder called "in" under the home folder. Each input is placed under its own subfolder "~/in/name_of_input_field/", named after the input field. Files retain their original filenames as supplied by the user who launched the applet.

For example, if your applet defines a file input called "mappings" and a user runs it with a file called "SRR001.bam", it will be downloaded into ~/in/mappings/SRR001.bam.

The system defines the following helper bash variables, which you can use in your applet:

Bash variable        Content
$mappings_path       ~/in/mappings/SRR001.bam
$mappings_name       SRR001.bam
$mappings_prefix     SRR001
$mappings            {"$dnanexus_link": "file-F77Bp7002302Zb343BF1FpG0"}

The system computes $mappings_prefix automatically by taking the filename and removing any suffix that matches the patterns specified for this input field in dxapp.json.
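For example, a hypothetical fragment of a script.sh that uses the "mappings" input might look like this (the samtools calls are illustrative and assume samtools is installed as an applet dependency):

# Download every input into ~/in/<field_name>/
dx-download-all-inputs --parallel

# Sort and index the BAM using the helper variables described above
samtools sort "$mappings_path" -o "${mappings_prefix}.sorted.bam"
samtools index "${mappings_prefix}.sorted.bam"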

To download inputs one by one, use the following syntax:

dx download "$name_of_input_field"

This will download the file to the current working directory, retaining the original filename as supplied by the user who launched the applet. You can use the "$xxxxxx_name" variable (as shown in the table above) to refer to that filename.

To name the local file using a different name than the original filename, use the following syntax:

dx download "$name_of_input_field" -o local_filename

To stream the local file and pipe it to another command, use the following syntax:

dx cat "$name_of_input_field" | command
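For instance, a BAM input could be streamed straight into samtools without being written to local disk first (a hypothetical command, assuming samtools is available):

dx cat "$mappings" | samtools view -c - > read_count.txt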

You can combine these strategies, handling some inputs one by one and downloading the rest all at once:

dx-download-all-inputs --except name_of_input_field1 --except name_of_input_field2

dx download "$name_of_input_field1"

dx cat "$name_of_input_field2" | command
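As a hypothetical illustration with two input fields named mappings and reference_gz, you could download everything except those two fields, fetch one normally, and stream the other:

dx-download-all-inputs --except mappings --except reference_gz

dx download "$mappings"

dx cat "$reference_gz" | gunzip > reference.fa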

Downloading Inputs Using dx-mount-all-inputs

An alternative to dx-download-all-inputs is the dx-mount-all-inputs command-line utility. You can use it by adding this line to your script.sh:

dx-mount-all-inputs

dx-mount-all-inputs uses the same directory structure as dx-download-all-inputs, except that files are mounted at their respective locations rather than downloaded. When using dx-mount-all-inputs, input files do not take up local storage: they are mounted using Linux FUSE technology and streamed transparently behind the scenes when accessed.

Note that dxfuse is required for dx-mount-all-inputs to work. You can download the dxfuse binary from https://github.com/dnanexus/dxfuse/releases and add it to ./resources/usr/bin/dxfuse.
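For instance, a hypothetical script could mount all inputs and read the "mappings" BAM in place (samtools is assumed to be available in the execution environment):

dx-mount-all-inputs

# The file at $mappings_path is streamed on demand rather than stored locally
samtools view -c "$mappings_path"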

Uploading Outputs

There are two ways to upload outputs: one by one, or all at once.

To upload outputs one by one, use the following syntax:

id=$(dx upload /path/to/local/file --brief)

dx-jobutil-add-output name_of_output_field "$id"

If you would like the uploaded file to have a different name than the local file, add the following to the dx upload command:

--path remote_filename

If you would like the uploaded file to appear under a subfolder in the applet outputs, add the following to the dx upload command:

--path /subfolder/remote_filename --parents

If you would like the uploaded file to carry metadata, for example a property (key/value pair), add the following to the dx upload command:

--property key=value
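For example, a hypothetical command combining these options uploads a local file stats.txt into a /qc subfolder under a new name, attaches a sample property, and registers it as the value of an output field named report:

id=$(dx upload stats.txt --path /qc/SRR001_stats.txt --parents --property sample=SRR001 --brief)

dx-jobutil-add-output report "$id"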

To upload all outputs at once, create a folder "out" under the home folder, and create a subfolder named after each output field. Place a file under each subfolder, and call the "dx-upload-all-outputs" utility:

mkdir -p ~/out/name_of_output_field1/ ~/out/name_of_output_field2/

mv file1 ~/out/name_of_output_field1/

mv file2 ~/out/name_of_output_field2/

dx-upload-all-outputs

If you would like an uploaded file to have a different name than the local file, rename it as you move it:

mv file1 ~/out/name_of_output_field1/renamed_file1

If you would like any uploaded file to appear under a subfolder in the applet outputs, create the subfolder like this:

mkdir -p ~/out/name_of_output_field1/subfolder/

mv file1 ~/out/name_of_output_field1/subfolder/

If you would like the uploaded files to carry metadata, for example properties (key/value pairs), make the following changes:

  1. Install the "attr" Linux executable by specifying the "attr" Ubuntu package in your dxapp.json.

  2. For each file, and for each key/value pair you would like to attach, set the respective Linux extended file system attribute like this:

    attr -s key -V value ~/out/name_of_output_field1/file1

  3. Add "--xattr-properties" to the dx-upload-all-outputs invocation:

    dx-upload-all-outputs --xattr-properties
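Putting these pieces together, a minimal script.sh might look like the following sketch. The samtools command and the field names (a file input called mappings and a file output called counts) are illustrative, and samtools would need to be declared as a dependency in dxapp.json.

main() {
    # Download all file inputs into ~/in/<field_name>/
    dx-download-all-inputs --parallel

    # Illustrative processing step: count the reads in the input BAM
    samtools view -c "$mappings_path" > "${mappings_prefix}_count.txt"

    # Stage the result under ~/out/<output_field>/ and upload everything
    mkdir -p ~/out/counts/
    mv "${mappings_prefix}_count.txt" ~/out/counts/
    dx-upload-all-outputs
}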
