
Cloning


Objects can be cloned (copied) from one data container (including projects) to another once they are in the "closed" state. Cloning requires VIEW or greater access to the source container and UPLOAD or greater access to the destination container. Moreover, the source container must not have the RESTRICTED flag set. Cloning generates a new copy of the user-provided metadata in the destination container; access to that metadata and to the underlying data object is governed by the permission list of the destination container and is not associated with the original object in any way. For example, if the original object is removed from the source container, the cloned object is unaffected. Similarly, if the cloned object's metadata is modified, the original is unaffected; the cloned object should be thought of as a true copy.
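
As a minimal sketch of this behavior using the dxpy Python bindings (assuming dxpy is installed and the caller is authenticated; all IDs below are hypothetical placeholders), the high-level object handlers wrap the /class-xxxx/clone call described below:

    import dxpy

    # Hypothetical placeholder IDs; replace with real ones.
    SRC_PROJECT = "project-aaaa"   # caller needs VIEW or greater here
    DST_PROJECT = "project-bbbb"   # caller needs UPLOAD or greater here

    # Handler for a closed file in the source project.
    src_file = dxpy.DXFile("file-xxxx", project=SRC_PROJECT)

    # clone() issues the underlying clone API call and returns a handler bound
    # to the copy in the destination project; the source object is untouched.
    cloned = src_file.clone(DST_PROJECT, folder="/")
    print(cloned.get_id(), cloned.get_proj_id())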

API method: /class-xxxx/clone

Specification

Clones one or more objects and/or folders from a source container into a destination container (and optionally into a destination folder). Links in the details of the cloned objects are followed, and any hidden objects that they point to and that are available in the source container are also cloned. The process continues recursively if the hidden linked objects themselves have outgoing links to additional hidden objects. Note that if an object links to a hidden object that is not in the source container, and that hidden object in turn points to a hidden object that is available, this second hidden object is not cloned (the assumption is that it may lose its usefulness or meaning without its parent object). If a folder listed in "folders" is successfully cloned, a new folder with the same name is created in the destination folder, and its contents are set to the clones of the source folder's contents. Any hidden objects contained in a folder to be cloned are only cloned if a visible ancestor is also cloned.

For an applet, if its run specification includes links to objects in its bundledDepends field, these objects are also considered an essential part of the data object and are cloned with the applet.

If the root folder "/" is specified in "folders", then all of its contents will be cloned into the destination folder. Note that in all other cases, the contents of a folder (e.g. "/foo") specified to be cloned will be cloned into a new folder under the destination folder (e.g. "/destination/foo"), not into the destination folder itself.

If the same object appears multiple times in the "objects" input array, it is cloned only once. Similarly, if an extra object is encountered multiple times during the cloning process (for example, because it is a hidden object linked by multiple objects in the input), it is also cloned only once.

If a hidden object is to be cloned as part of a folder, it will be cloned to a destination that is the same relative to the cloned folder as it was in the source container. Otherwise, it will be placed in the destination folder.

If an object represented in "objects" already exists in the destination container, no action is taken for that object, and its ID is returned in the array "exists". Note that as a result, a request to clone a set of objects into a particular folder in the container may succeed (in that no error is returned) even though the destination folder does not contain all of the requested objects, because some of them already exist in another folder in the same container. If such an object needs to be in the specified folder, a "/move" call must be made afterwards; an object can appear in at most one folder in a container.
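
A hedged follow-up sketch for that case, using the low-level dxpy API wrappers (dxpy.api.project_clone and dxpy.api.project_move, assumed here to mirror the /clone and /move routes; IDs and paths are placeholders):

    import dxpy

    # Hypothetical placeholder IDs and paths.
    DST_PROJECT = "project-bbbb"
    DEST_FOLDER = "/analysis/inputs"

    result = dxpy.api.project_clone(
        "project-aaaa",                    # source container in the route
        {"objects": ["file-xxxx", "record-yyyy"],
         "project": DST_PROJECT,
         "destination": DEST_FOLDER,
         "parents": True})

    # Objects returned in "exists" were already present somewhere in the
    # destination container; move them if they must end up in DEST_FOLDER.
    if result["exists"]:
        dxpy.api.project_move(
            DST_PROJECT,
            {"objects": result["exists"], "destination": DEST_FOLDER})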

If a workflow is to be cloned, the result will depend upon the workflow's state. If the source workflow is open, a new replica workflow object will be created in the destination. If the source workflow is closed, the source workflow will be cloned to its destination.

Databases are an exception among data object classes and cannot be cloned. A database object belongs to a single project and can be relocated from the source project to a destination project.

If the clone operation involves archival objects, those objects can only be cloned if they are in projects with the same billTo entity and are in archivalState "live", "archival", or "archived". If any object is transitioning from the "archival" to the "archived" state, or is in the "unarchiving" state, cloning is not allowed. If the clone operation involves archival objects and clones between projects with different billTo entities, cloning is not allowed until all objects are in archivalState "live".
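
As a hedged pre-check for these archival constraints (assuming archivalState is available from the file describe call when a project context is given; IDs are placeholders):

    import dxpy

    # Hypothetical placeholder IDs.
    SRC_PROJECT = "project-aaaa"
    file_ids = ["file-xxxx", "file-yyyy"]

    # archivalState is tracked per project, so describe each file in the
    # context of the source project before attempting the clone.
    for fid in file_ids:
        desc = dxpy.api.file_describe(
            fid, {"project": SRC_PROJECT, "fields": {"archivalState": True}})
        if desc.get("archivalState") not in ("live", "archival", "archived"):
            raise RuntimeError(fid + " is being archived or unarchived; retry later")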

Inputs

  • objects array of strings (optional, required if folders is not provided) List of object IDs (strings of the form "class-xxxx") in the source container to be cloned

  • folders array of strings (optional, required if objects is not provided) List of folders in the source container to be cloned

  • project string ID of the destination container

  • destination string (optional, default "/") The destination folder in project

  • parents boolean (optional, default false) Whether the destination folder and/or parent folders should be created if they do not exist

Outputs

  • id string ID of the source container

  • project string ID of the modified destination container

  • exists array of strings List of object IDs that could not be cloned because they already exist in the destination container
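
As a hedged sketch tying these inputs and outputs together for a folder clone (placeholder IDs and paths; per the folder semantics above, "/inputs/batch1" is recreated as "/results/batch1" in the destination):

    import dxpy

    # Hypothetical placeholder IDs and paths.
    result = dxpy.api.project_clone(
        "project-aaaa",                  # source container
        {"folders": ["/inputs/batch1"],  # recreated as /results/batch1
         "project": "project-bbbb",
         "destination": "/results",
         "parents": True})               # create /results if it is missing

    print(result["id"])       # source container ID
    print(result["project"])  # destination container that was modified
    print(result["exists"])   # objects skipped because they already exist there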

Errors

  • InvalidInput

    • objects (if provided) is not an array of nonempty strings

    • folders (if provided) is not an array of nonempty strings

    • Two of the folders in folders have the same name but different paths

    • project is missing or is not a nonempty string that differs from the source container ID

    • destination (if provided) is not a nonempty string starting with "/"

    • parents if provided is not a boolean

    • project is not in the same region as the source container of the specified objects and folders

  • InvalidType

    • project is not a container ID

  • ResourceNotFound

    • The container specified in the URL does not exist

    • The container specified in project does not exist

    • One or more objects or folder routes in objects and folders were not found or were not in the source container

    • The folder specified in destination does not exist in the destination container

  • PermissionDenied

    • VIEW access is required in the source container

    • UPLOAD access is required in the destination container

    • RESTRICTED flag must not be set in the source container

  • InvalidState

    • One or more of the input objects is not in the "closed" state

    • One or more of the input objects is a database object, or one or more of the input folders contain a database object

    • There exists a folder in the destination folder destination which has the same name as a folder listed in folders

    • The destination project is third-party sponsored and one or more of the input objects is DNAnexus-sponsored

    • The destination project is third-party sponsored and one or more of the input objects is already third-party sponsored in another project. When this error is thrown, the error details are of the form {alreadySponsored: [{id: ..., project: ...}, ...]}, indicating the ID and sponsoring project of each object that was already third-party sponsored.

    • Cannot clone one or more objects as they are being archived or unarchived.

    • Cannot clone one or more objects as they are not in the live state, and the source project and destination project belong to different billTo orgs.

    • Cannot clone one or more objects in folder folder as they are being archived or unarchived.

    • Cannot clone one or more objects in folder folder as they are not in the live state, and the source project and destination project belong to different billTo orgs.

  • SpendingLimitExceeded

    • The billTo of the destination project has already reached its spending limit and is different from the billTo of the source project
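
A hedged error-handling sketch (assuming dxpy's DXAPIError exposes the API error type and message through its name and msg attributes; IDs are placeholders):

    import dxpy
    from dxpy.exceptions import DXAPIError

    try:
        dxpy.api.project_clone(
            "project-aaaa",
            {"objects": ["file-xxxx"], "project": "project-bbbb"})
    except DXAPIError as e:
        # e.name carries the API error type listed above.
        if e.name == "InvalidState":
            print("Object not closed, a database, or being (un)archived:", e.msg)
        elif e.name == "PermissionDenied":
            print("Check VIEW/UPLOAD access and the RESTRICTED flag:", e.msg)
        else:
            raise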

API method: /class-xxxx/listProjects

Specification

Returns a list of projects that contain the specified object and that the requesting user has permission to access. If the token used is not a full-scope token, the returned projects are additionally filtered to those to which the token has at least VIEW access.

Inputs

  • archivalInfoForOrg string (optional) The ID of an org for which the requesting user is an org admin

Outputs

  • key ID of a project that contains the specified object

  • value string Permission level that the user has in the project

  • liveProjects array of strings (only if archivalInfoForOrg is specified) An array of IDs of projects that contain a live copy of the file. These projects are either ones the requesting user has access to or ones whose billTo is the org specified by archivalInfoForOrg.
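
A hedged sketch of calling this method for a file through the dxpy API wrapper (the file ID is a placeholder; without archivalInfoForOrg the result is simply a mapping from project IDs to permission levels):

    import dxpy

    # Hypothetical placeholder ID.
    projects = dxpy.api.file_list_projects("file-xxxx", {})

    # Each key is a project ID and each value is the caller's permission level.
    for project_id, permission in projects.items():
        print(project_id, permission)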

Errors

  • ResourceNotFound

    • The specified object does not exist

  • PermissionDenied

    • Must be an ADMIN of the org specified in archivalInfoForOrg to return archival info

The source and destination containers must be in the same region. For more information about regions, see Regions.
