Folders and Deletion


Objects inside a data container are organized into folders. Any data object or folder can be placed inside another folder; cycles are prohibited, and an object can be in at most one folder. By convention, the root folder of a container has the path "/" and cannot be moved, cloned, or removed; it is the default destination folder for newly created objects in the data container. Folders can be created inside any existing folder and must have a name that is a UTF-8 string containing at least one character, is not equal to "." or "..", and contains no characters matching the regular expression [\x00-\x1F]. In full folder paths, consecutive "/" characters are interpreted as a single "/", and trailing "/" characters are ignored. Folders can be renamed, moved, and removed.
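To illustrate the path conventions above, a client might normalize user-supplied folder paths before passing them to the API. The helper below is a hypothetical sketch (it is not part of dxpy) reflecting only the normalization rules described here:

    import re

    def normalize_folder_path(path):
        """Collapse consecutive "/" characters and drop trailing "/",
        mirroring how full folder paths are interpreted."""
        path = re.sub(r"/+", "/", path)           # runs of "/" become a single "/"
        if len(path) > 1 and path.endswith("/"):  # keep the root folder "/" intact
            path = path.rstrip("/")
        return path

    print(normalize_folder_path("/genomes//hg38/"))  # -> /genomes/hg38
    print(normalize_folder_path("///"))              # -> /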

Deletion of Data

When data objects are removed, removal is propagated to hidden linked objects that are no longer reachable from some other visible object (also in the data container) via one or more links. (For a full description of details, links, and visibility, see Details and Links and Visibility.) The objects removed can be in any state. Removing open objects may have additional consequences, such as invalidating URLs provided for uploading data. Objects can be removed using either the /class-xxxx/removeFolder or the /class-xxxx/removeObjects API methods.

API Methods

API method: /class-xxxx/newFolder

Specification

Creates a new folder in the data container using the given route.

Inputs

  • folder string The new folder to be created in the data container

  • parents boolean (optional, default false) Whether the parent folders should be created if they do not exist

Outputs

  • id string ID of the manipulated data container

Errors

  • InvalidInput (the input is not a hash, or folder is missing or is not a nonempty string starting with "/", or parents is present but is not a boolean)

  • ResourceNotFound (the specified data container does not exist, or parents is false but the parent folder of the folder specified in folder does not yet exist)

  • InvalidState (a folder with the given route already exists and parents was not set to true)

  • PermissionDenied (UPLOAD access required)
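
As a usage sketch, this route can be called through the dxpy Python bindings, which expose a thin wrapper per API method (assuming dxpy is installed and authenticated; the project ID and folder path below are placeholders):

    import dxpy

    project_id = "project-xxxx"  # placeholder data container ID

    # Create /experiments/run-01, also creating /experiments if it is missing.
    result = dxpy.api.project_new_folder(
        project_id,
        {"folder": "/experiments/run-01", "parents": True},
    )
    print(result["id"])  # ID of the manipulated data container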

API method: /class-xxxx/renameFolder

Specification

Renames an existing folder in the data container to the given name.

Inputs

  • folder string Folder to be renamed

  • name string The new basename of the folder (to replace the string after the last "/" in folder)

Outputs

  • id string ID of the manipulated data container

Errors

  • InvalidInput (the input is not a hash; folder is missing, is not a nonempty string starting with "/", or is "/"; name is missing or is not a nonempty string free of "/")

  • ResourceNotFound (the specified data container does not exist, or the specified folder does not exist)

  • InvalidState (a folder with the given route already exists)

  • PermissionDenied (CONTRIBUTE access required)
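
A minimal sketch of the corresponding dxpy call (placeholder project ID and paths; only the basename after the last "/" changes):

    import dxpy

    project_id = "project-xxxx"  # placeholder

    # Rename /experiments/run-01 to /experiments/run-01-qc.
    result = dxpy.api.project_rename_folder(
        project_id,
        {"folder": "/experiments/run-01", "name": "run-01-qc"},
    )
    print(result["id"])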

API method: /class-xxxx/listFolder

Specification

Lists the contents of a given data container's folder. Only folders or only objects can be specified to be returned, though this method returns both folders and objects by default. This method will not return hidden objects unless includeHidden is set to true.

Inputs

  • folder string (optional, default "/") The folder to be listed

  • describe boolean or mapping (optional, default false) False indicates that no extra metadata should be retrieved with the data object results. A mapping represents the input that would be used for calling the result's corresponding describe method; the value true is equivalent to the empty hash input.

  • only string (optional, default "all") Indicate what type of contents to return; one of the values "folders", "objects", or "all"

  • includeHidden boolean (optional, default false) Whether hidden objects should be returned in objects; applicable only if only is set to "objects" or "all".

  • hasSubfolderFlags boolean (optional, default false) Whether each entry in the folders output should also indicate whether that folder has subfolders

Outputs

  • objects array of mappings List of metadata for all data objects in the specified folder, each with the key/values:

    • id string The object ID

    If describe was set to true or a mapping:

    • describe mapping The output of the result's corresponding describe method

  • folders array of strings or array of arrays By default, this is a list of all folders directly under the specified folder. If the 'hasSubfolderFlags' input is set to true, the array will contain arrays of exactly two elements where the first element is a string of the folder name and the second element is a boolean that is set to true if the folder has subfolders and false otherwise.

Errors

  • InvalidInput (the input is not a hash; folder is not a nonempty string starting with "/"; only is given but is not one of "folders", "objects", or "all"; includeHidden is present but is not a boolean)

  • ResourceNotFound (the specified data container does not exist or a folder with the given route does not exist)

  • PermissionDenied (VIEW access required)
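
A sketch of listing a folder with dxpy, requesting describe output and subfolder flags (the project ID and folder path are placeholders):

    import dxpy

    project_id = "project-xxxx"  # placeholder

    listing = dxpy.api.project_list_folder(
        project_id,
        {
            "folder": "/experiments",
            "describe": True,          # attach describe output to each object
            "only": "all",
            "includeHidden": False,
            "hasSubfolderFlags": True,
        },
    )

    for entry in listing["objects"]:
        print(entry["id"], entry["describe"]["name"])

    # With hasSubfolderFlags, each folder entry is a [folder, has_subfolders] pair.
    for folder, has_subfolders in listing["folders"]:
        print(folder, "(has subfolders)" if has_subfolders else "")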

API method: /class-xxxx/removeFolder

Specification

Removes the given folder from the data container. Unless recurse is set to true, the folder must contain no visible data objects in order to be removed. Note that removing the root folder "/" with recurse set to true removes ALL of the contents of the data container, so recurse should be used carefully to prevent accidental deletion of data. Removing a folder removes all of the visible objects contained in it, together with all of their orphaned hidden links. Any remaining hidden objects that are not removed by this process are placed in the root folder. In particular, if recurse is false and the folder to be removed contains only hidden objects, then these objects are placed in the root folder and the specified folder is removed.

At most 10,000 data objects, regardless of visibility, may be removed in one API call. If a folder's contents exceed this threshold, set partial to true to allow a partial removal of the folder's contents up to this limit. A completed flag in the output then indicates whether or not the folder was completely removed. If completed is false, reissue the API call with the partial flag to continue removing the folder.

Inputs

  • folder string The folder to be removed from the data container

  • recurse boolean (optional, default false) Whether removal should propagate to its contained folders and objects

  • force boolean (optional, default false) If true, this operation should not throw an error even if the given folder does not exist

  • partial boolean (optional, default false) If true, this operation will attempt to remove objects from the folder even if the folder contains more objects than can be deleted at one time

Outputs

  • id string ID of the manipulated data container

    If partial was set to true:

  • completed boolean True if all objects, subfolders, and the specified folder have been removed; false if objects remain, in which case the user should reissue the call to make further progress

Errors

  • InvalidInput (the input is not a hash, folder is missing, folder is not set to a nonempty string starting with "/", folder is set to "/" but recurse is not specified or is false, recurse is provided and is not a boolean)

  • ResourceNotFound

    • The specified data container does not exist

    • A folder with the given route does not exist and force was not set to true

  • PermissionDenied (CONTRIBUTE access required)

  • InvalidState

    • The project contains a dbcluster in a non-terminal state

    • recurse is false and the specified folder contains either subfolders or visible objects

    • partial is false and the specified folder contains more objects than can be deleted in one API call. For more information, see Service Limits
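
A sketch of removing a large folder with dxpy using the partial flag, reissuing the call until the platform reports completion (the project ID and folder path are placeholders):

    import dxpy

    project_id = "project-xxxx"  # placeholder

    # Recursively remove /old-results. With "partial": True, each call removes
    # at most the per-call limit of objects; loop until "completed" is true.
    while True:
        result = dxpy.api.project_remove_folder(
            project_id,
            {"folder": "/old-results", "recurse": True, "partial": True},
        )
        if result.get("completed", True):
            break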

API method: /class-xxxx/move

Specification

Moves the specified objects and/or folders into the specified destination folder. Objects and folders to be moved can be in any state and in any folder of the data container. Objects and folders inside a folder listed in folders remain in that (moved) folder unless they are also explicitly listed in objects or folders, in which case they are removed from their parent folder and moved to the folder specified in destination. In addition, any objects explicitly listed in objects pull along the hidden objects they link to into the destination folder.

Inputs

  • objects array of strings (optional) List of data object IDs to be moved

  • folders array of strings (optional) List of folders to be moved

  • destination string The destination folder

Outputs

  • id string ID of the manipulated data container

Errors

  • InvalidInput (the input is not a hash, objects (if provided) is not an array of nonempty strings, folders (if provided) is not an array of nonempty strings starting with "/", two of the folders in folders have the same name but different paths, destination is missing or is not a nonempty string starting with "/")

  • ResourceNotFound (the specified data container does not exist, one of the specified objects and/or folders does not exist in the specified data container, or the destination folder does not exist)

  • PermissionDenied (CONTRIBUTE access required)

  • InvalidState (a folder listed in folders is or contains the destination folder)
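
A sketch of the corresponding dxpy call (all IDs and paths are placeholders):

    import dxpy

    project_id = "project-xxxx"  # placeholder

    # Move two data objects and one folder into /archive/2024.
    result = dxpy.api.project_move(
        project_id,
        {
            "objects": ["file-xxxx", "file-yyyy"],
            "folders": ["/experiments/run-01-qc"],
            "destination": "/archive/2024",
        },
    )
    print(result["id"])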

API method: /class-xxxx/removeObjects

Specification

Removes the specified objects from the data container. Removal is propagated to hidden linked objects that are no longer reachable from some other visible object (also in the data container) via one or more links. The objects removed can be in any state, and there is no way to undo this operation. Removing open objects may have additional consequences, such as invalidating URLs provided for uploading data. The only way to reinstate a closed object in the data container is for someone with the proper permissions to clone it back in from another data container in which a copy still exists; note that any previously set user-provided metadata will be lost and will instead be initialized to the metadata from the source data container. See the section on Cloning for more details.

Inputs

  • objects array of strings List of object IDs to be removed from the data container

  • force boolean (optional, default false) If true, this operation should not throw an error even if the given object does not exist

Outputs

  • id string ID of the manipulated data container

Errors

  • InvalidInput (the input is not a hash, or objects is missing or is not an array of nonempty strings)

  • ResourceNotFound

    • The specified data container does not exist

    • One of the specified objects does not exist, or does not exist in the specified data container, and force was not set to true

  • PermissionDenied (CONTRIBUTE/ADMINISTER access required, depending on PROTECTED flag)

  • InvalidState

    • One of the objects is a dbcluster in a non-terminal state
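
A sketch of the corresponding dxpy call (the object IDs and project ID are placeholders); note that removal is permanent:

    import dxpy

    project_id = "project-xxxx"  # placeholder

    # Permanently remove two objects; hidden objects reachable only through
    # them are removed as well.
    result = dxpy.api.project_remove_objects(
        project_id,
        {"objects": ["file-xxxx", "record-yyyy"], "force": True},
    )
    print(result["id"])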
