Instance Types
Learn about the full range of AWS and Azure instance types available on the DNAnexus Platform.
About Instance Types
Availability
This page provides a complete list of the AWS and Azure instance types that are available for use on the DNAnexus Platform.
To see a list of the instance types to which you have access, use the command-line interface (CLI) command dx run --instance-type-help. When using the user interface (UI) to configure a tool to run an analysis, you can see this list in the Stage Settings pane of the Run Analysis screen.
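For example, a minimal CLI session might look like the following sketch (app-my-analysis is a hypothetical executable name; substitute your own app or applet):

```sh
# List the instance types your account and project can use
dx run --instance-type-help

# Request one of them when launching an executable
# (app-my-analysis is a placeholder name)
dx run app-my-analysis --instance-type mem1_ssd1_v2_x8
```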
GPU Instance Types
AWS and Azure GPU instance types are available for use on the Platform. See the full list below and learn how to specify the NVIDIA driver version.
FPGA Instance Types
AWS FPGA instance types are available for use on the Platform. See the full list below and learn how to specify the FPGA driver version.
OS Support
Ubuntu Linux 24.04 and 20.04 are supported on all instance types in all regions.
Resource Usage
When using any instance, the Platform uses some resources to run processes that support your job and provide API services. For example, around 5% of the available storage will be reserved for Platform use. Your job's virtual file system can use the remaining space as local scratch. Some available memory will also be allocated to Platform processes.
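As a quick sanity check, assuming a standard Linux execution environment, a running job can inspect how much scratch space and memory remain after these reservations using ordinary system tools:

```sh
# Run inside a job's execution environment (standard Linux tools assumed)
df -h .      # free scratch space on the job's working filesystem
free -h      # memory currently available to the job
```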
New Versions of Existing Instance Types
DNAnexus regularly adds new versions of existing instance types, as AWS and Azure make them available. These new versions feature better hardware, such as a more powerful CPU, more memory, more local storage, or some combination of these elements.
On the Platform, each new instance type version has a version infix, such as _v2
, in its name, to distinguish it from the original version of that instance type. See the Instance Names section for more information on version infixes.
While the latest version delivers better performance, DNAnexus, as a rule, makes both the original and updated versions of an instance type available. For example, you can access both mem1_ssd1_x8 and mem1_ssd1_v2_x8. See the full list of available instance types for more information.
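For example, either version can be requested explicitly at launch time (a sketch; app-my-analysis is a hypothetical executable name):

```sh
# Run on the original version of the instance type
dx run app-my-analysis --instance-type mem1_ssd1_x8

# Run on the updated (v2) version
dx run app-my-analysis --instance-type mem1_ssd1_v2_x8
```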
Instance Type Names
Except for GPU and FPGA instance types, instance type names are constructed according to the following scheme:
Cloud provider prefix + memory infix + storage infix + version infix + core suffix

- Cloud provider prefix: Denotes the cloud provider. ∅ (AWS) or azure: (Azure).
- Memory infix: Denotes the memory capacity per core. mem1_ (<=2GiB/core), mem2_ (~4GiB/core), mem3_ (>=7GiB/core), mem4_ (~14GiB/core), or mem5_ (~28GiB/core).
- Storage infix: Denotes the local storage technology and capacity per core. ssd1_ (<=20GB/core), ssd2_ (~32GB-128GB/core), ssd3_ (>600GB/core), hdd1_ (>100GB/core), or hdd2_ (>100GB/core). ssd represents a solid-state drive, whereas hdd represents a regular hard disk drive.
- Version infix (optional): Denotes the version of the instance type. ∅ (version 1) or v2_ (version 2).
- Core suffix: Denotes the number of cores. x1, x2, x4, x8, x16, x20, x32, x36, x40, x48, x64, x96, or x128.
For example, mem1_ssd1_v2_x8 is the second version of an instance type available on AWS, featuring 8 cores, 16GiB of memory (2GiB/core), and 200GB of solid-state drive storage (25GB/core). Similarly, azure:mem1_ssd1_x8 is an instance type available on Azure, featuring 8 cores, 15.7GiB of memory (~1.9GiB/core), and 128GB of solid-state drive storage (16GB/core).
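As an illustrative sketch (not part of the Platform tooling), the scheme for standard instance types can be captured in a regular expression and used to pull a name apart:

```sh
# Illustrative only: split a standard (non-GPU/FPGA) instance type name
# into its components, following the naming scheme described above.
parse_instance_type() {
  echo "$1" | sed -E \
    's/^(azure:)?(mem[0-9])_((ssd|hdd)[0-9])_(v[0-9]_)?x([0-9]+)$/provider=\1 memory=\2 storage=\3 version=\5 cores=\6/'
}

parse_instance_type mem1_ssd1_v2_x8
# provider= memory=mem1 storage=ssd1 version=v2_ cores=8
parse_instance_type azure:mem1_ssd1_x8
# provider=azure: memory=mem1 storage=ssd1 version= cores=8
```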
GPU and FPGA Instance Type Names
GPU and FPGA instance type names follow the scheme described above, but with an additional infix. This infix appears before the version infix (if present) and the core suffix, and indicates whether the instance includes GPUs or FPGAs. Many such infixes also include a number, indicating the number of GPUs or FPGAs included in the instance.
For example, the AWS FPGA instance type mem3_ssd2_fpga1_x8
includes 1 FPGA. The AWS GPU instance type mem2_ssd1_gpu4_x48
, meanwhile, includes 4 GPUs.
For more information, see the detailed lists of GPU instance types and FPGA instance types.
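For example, a GPU instance type can be requested at launch time like any other instance type (a sketch; app-my-analysis is a hypothetical executable name, and main is assumed to be its entry point):

```sh
# Run every entry point on a GPU instance type
dx run app-my-analysis --instance-type mem2_ssd1_gpu4_x48

# Or target a single entry point ("main" here) with a GPU instance type
dx run app-my-analysis --instance-type '{"main": "mem2_ssd1_gpu4_x48"}'
```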
Available Instance Types
Standard AWS Instance Types
| Name | Cores | Memory (GiB) | Storage (GB) |
| --- | --- | --- | --- |
| mem1_hdd1_x2 | 2 | 3.75 | 200 |
| mem1_hdd1_x4 | 4 | 7.5 | 400 |
| mem1_hdd1_x8 | 8 | 15 | 800 |
| mem1_hdd1_x16 | 16 | 30 | 1600 |
| mem1_hdd1_x36 | 36 | 60 | 3200 |
| mem1_hdd1_v2_x2 | 2 | 4 | 200 |
| mem1_hdd1_v2_x4 | 4 | 8 | 400 |
| mem1_hdd1_v2_x8 | 8 | 16 | 800 |
| mem1_hdd1_v2_x16 | 16 | 32 | 1600 |
| mem1_hdd1_v2_x36 | 36 | 72 | 3600 |
| mem1_hdd1_v2_x72 | 72 | 144 | 7200 |
| mem1_hdd1_v2_x96 | 96 | 192 | 9600 |
| mem1_ssd1_x2 | 2 | 3.8 | 40 |
| mem1_ssd1_x4 | 4 | 7.5 | 80 |
| mem1_ssd1_x8 | 8 | 15 | 160 |
| mem1_ssd1_x16 | 16 | 30 | 320 |
| mem1_ssd1_x32 | 32 | 60 | 640 |
| mem1_ssd1_x36 | 36 | 72 | 900 |
| mem1_ssd1_v2_x2 | 2 | 4 | 50 |
| mem1_ssd1_v2_x4 | 4 | 8 | 100 |
| mem1_ssd1_v2_x8 | 8 | 16 | 200 |
| mem1_ssd1_v2_x16 | 16 | 32 | 400 |
| mem1_ssd1_v2_x36 | 36 | 72 | 900 |
| mem1_ssd1_v2_x72 | 72 | 144 | 1,800 |
| mem1_ssd2_x2 | 2 | 3.8 | 160 |
| mem1_ssd2_x4 | 4 | 7.5 | 320 |
| mem1_ssd2_x8 | 8 | 15 | 640 |
| mem1_ssd2_x16 | 16 | 30 | 1,280 |
| mem1_ssd2_x36 | 36 | 60 | 2,880 |
| mem1_ssd2_v2_x2 | 2 | 4 | 160 |
| mem1_ssd2_v2_x4 | 4 | 8 | 320 |
| mem1_ssd2_v2_x8 | 8 | 16 | 640 |
| mem1_ssd2_v2_x16 | 16 | 32 | 1,280 |
| mem1_ssd2_v2_x36 | 36 | 72 | 2,880 |
| mem1_ssd2_v2_x72 | 72 | 144 | 5,760 |
| mem1_hdd2_x1 | 1 | 1.7 | 160 |
| mem1_hdd2_x8 | 8 | 7 | 1,680 |
| mem1_hdd2_x32 | 32 | 60.5 | 3,360 |
| mem2_ssd1_x2 | 2 | 7.5 | 40 |
| mem2_ssd1_x4 | 4 | 15 | 80 |
| mem2_ssd1_x8 | 8 | 30 | 160 |
| mem2_ssd1_v2_x2 | 2 | 8 | 75 |
| mem2_ssd1_v2_x4 | 4 | 16 | 150 |
| mem2_ssd1_v2_x8 | 8 | 32 | 300 |
| mem2_ssd1_v2_x16 | 16 | 64 | 600 |
| mem2_ssd1_v2_x32 | 32 | 128 | 1,200 |
| mem2_ssd1_v2_x48 | 48 | 192 | 1,800 |
| mem2_ssd1_v2_x64 | 64 | 256 | 2,400 |
| mem2_ssd1_v2_x96 | 96 | 384 | 3,600 |
| mem2_ssd2_x2 | 2 | 8 | 160 |
| mem2_ssd2_x4 | 4 | 16 | 320 |
| mem2_ssd2_x8 | 8 | 32 | 1280 |
| mem2_ssd2_x16 | 16 | 64 | 2560 |
| mem2_ssd2_x40 | 40 | 160 | 3200 |
| mem2_ssd2_x64 | 64 | 256 | 5120 |
| mem2_ssd2_v2_x2 | 2 | 8 | 160 |
| mem2_ssd2_v2_x4 | 4 | 16 | 320 |
| mem2_ssd2_v2_x8 | 8 | 32 | 640 |
| mem2_ssd2_v2_x16 | 16 | 64 | 1280 |
| mem2_ssd2_v2_x32 | 32 | 128 | 2560 |
| mem2_ssd2_v2_x48 | 48 | 192 | 3840 |
| mem2_ssd2_v2_x64 | 64 | 256 | 5120 |
| mem2_ssd2_v2_x96 | 96 | 384 | 7480 |
| mem2_hdd2_x1 | 1 | 3.8 | 410 |
| mem2_hdd2_x2 | 2 | 7.5 | 840 |
| mem2_hdd2_x4 | 4 | 15 | 1,680 |
| mem2_hdd2_v2_x2 | 2 | 8 | 1,000 |
| mem2_hdd2_v2_x4 | 4 | 16 | 2,000 |
| mem3_ssd1_x2 | 2 | 15 | 40 |
| mem3_ssd1_x4 | 4 | 30.5 | 80 |
| mem3_ssd1_x8 | 8 | 61 | 160 |
| mem3_ssd1_x16 | 16 | 122 | 320 |
| mem3_ssd1_x32 | 32 | 244 | 640 |
| mem3_ssd1_v2_x2 | 2 | 16 | 75 |
| mem3_ssd1_v2_x4 | 4 | 32 | 150 |
| mem3_ssd1_v2_x8 | 8 | 64 | 300 |
| mem3_ssd1_v2_x16 | 16 | 128 | 600 |
| mem3_ssd1_v2_x32 | 32 | 256 | 1,200 |
| mem3_ssd1_v2_x48 | 48 | 384 | 1,800 |
| mem3_ssd1_v2_x64 | 64 | 512 | 3,200 |
| mem3_ssd1_v2_x96 | 96 | 768 | 3,600 |
| mem3_ssd2_x4 | 4 | 30.5 | 800 |
| mem3_ssd2_x8 | 8 | 61 | 1,600 |
| mem3_ssd2_x16 | 16 | 122 | 3,200 |
| mem3_ssd2_x32 | 32 | 244 | 6,400 |
| mem3_ssd2_v2_x2 | 2 | 15.25 | 475 |
| mem3_ssd2_v2_x4 | 4 | 30.5 | 950 |
| mem3_ssd2_v2_x8 | 8 | 61 | 1,900 |
| mem3_ssd2_v2_x16 | 16 | 122 | 3,800 |
| mem3_ssd2_v2_x32 | 32 | 244 | 7,600 |
| mem3_ssd2_v2_x64 | 64 | 488 | 15,200 |
| mem3_ssd3_x2 | 2 | 16 | 1,250 |
| mem3_ssd3_x4 | 4 | 32 | 2,500 |
| mem3_ssd3_x8 | 8 | 64 | 5,000 |
| mem3_ssd3_x12 | 12 | 96 | 7,500 |
| mem3_ssd3_x24 | 24 | 192 | 15,000 |
| mem3_ssd3_x48 | 48 | 384 | 30,000 |
| mem3_ssd3_x96 | 96 | 768 | 60,000 |
| mem3_hdd2_x2 | 2 | 17.1 | 420 |
| mem3_hdd2_x4 | 4 | 34.2 | 850 |
| mem3_hdd2_x8 | 8 | 68.4 | 1,680 |
| mem3_hdd2_v2_x2 | 2 | 16 | 500 |
| mem3_hdd2_v2_x4 | 4 | 32 | 1,000 |
| mem3_hdd2_v2_x8 | 8 | 64 | 2,000 |
| mem4_ssd1_x128 | 128 | 1,952 | 3,840 |
Standard Azure Instance Types
| Name | Cores | Memory (GiB) | Storage (GB) |
| --- | --- | --- | --- |
| azure:mem1_ssd1_x2 | 2 | 3.9 | 32 |
| azure:mem1_ssd1_x4 | 4 | 7.8 | 64 |
| azure:mem1_ssd1_x8 | 8 | 15.7 | 128 |
| azure:mem1_ssd1_x16 | 16 | 31.4 | 254 |
| azure:mem2_ssd1_x1 | 1 | 3.5 | 128 |
| azure:mem2_ssd1_x2 | 2 | 7 | 128 |
| azure:mem2_ssd1_x4 | 4 | 14 | 128 |
| azure:mem2_ssd1_x8 | 8 | 28 | 256 |
| azure:mem2_ssd1_x16 | 16 | 56 | 512 |
| azure:mem3_ssd1_x2 | 2 | 14 | 128 |
| azure:mem3_ssd1_x4 | 4 | 28 | 128 |
| azure:mem3_ssd1_x8 | 8 | 56 | 256 |
| azure:mem3_ssd1_x16 | 16 | 112 | 512 |
| azure:mem3_ssd1_x20 | 20 | 140 | 640 |
| azure:mem4_ssd1_x2 | 2 | 28 | 128 |
| azure:mem4_ssd1_x4 | 4 | 56 | 128 |
| azure:mem4_ssd1_x8 | 8 | 112 | 256 |
| azure:mem4_ssd1_x16 | 16 | 224 | 512 |
| azure:mem4_ssd1_x32 | 32 | 448 | 1024 |
| azure:mem5_ssd2_x64 ¹ | 64 | 1,792 | 8,192 |
| azure:mem5_ssd2_x128 ¹ | 128 | 3,892 | 16,384 |
Notes:
¹ High memory instance types are expensive. Use them with caution.
GPU Instance Types
The following table shows all the GPU instance types available on AWS and Azure.
| Name | Cores | Memory (GiB) | Storage (GB) | GPUs | Total GPU Memory (GB) |
| --- | --- | --- | --- | --- | --- |
| mem2_ssd1_gpu_x16 | 16 | 64 | 225 | 1 NVIDIA T4 | 16 |
| mem2_ssd1_gpu_x32 | 32 | 128 | 900 | 1 NVIDIA T4 | 16 |
| mem2_ssd1_gpu_x48 | 48 | 192 | 900 | 4 NVIDIA T4 | 64 |
| mem2_ssd1_gpu_x64 | 64 | 256 | 900 | 1 NVIDIA T4 | 16 |
| mem2_ssd1_gpu1_x32 | 32 | 128 | 900 | 1 NVIDIA T4 | 16 |
| mem2_ssd1_gpu1_x64 | 64 | 256 | 900 | 1 NVIDIA T4 | 16 |
| mem2_ssd1_gpu4_x48 | 48 | 192 | 900 | 4 NVIDIA T4 | 64 |
| mem2_ssd2_gpu1_x4 ² | 4 | 16 | 250 | 1 NVIDIA A10G | 24 |
| mem2_ssd2_gpu1_x8 ² | 8 | 32 | 450 | 1 NVIDIA A10G | 24 |
| mem2_ssd2_gpu1_x16 ² | 16 | 64 | 600 | 1 NVIDIA A10G | 24 |
| mem2_ssd2_gpu1_x32 ² | 32 | 128 | 900 | 1 NVIDIA A10G | 24 |
| mem2_ssd2_gpu1_x64 ² | 64 | 256 | 1900 | 1 NVIDIA A10G | 24 |
| mem2_ssd2_gpu4_x48 ² | 48 | 192 | 3800 | 4 NVIDIA A10G | 96 |
| mem2_ssd2_gpu4_x96 ² | 96 | 384 | 3800 | 4 NVIDIA A10G | 96 |
| mem2_ssd2_gpu8_x192 ¹ ² | 192 | 768 | 7600 | 8 NVIDIA A10G | 192 |
| mem2_ssd2_gpu1_v2_x4 ² | 4 | 16 | 250 | 1 NVIDIA L4 | 24 |
| mem2_ssd2_gpu1_v2_x8 ² | 8 | 32 | 450 | 1 NVIDIA L4 | 24 |
| mem2_ssd2_gpu1_v2_x16 ² | 16 | 64 | 600 | 1 NVIDIA L4 | 24 |
| mem2_ssd2_gpu1_v2_x32 ² | 32 | 128 | 900 | 1 NVIDIA L4 | 24 |
| mem2_ssd2_gpu1_v2_x64 ² | 64 | 256 | 1880 | 1 NVIDIA L4 | 24 |
| mem2_ssd2_gpu4_v2_x48 ² | 48 | 192 | 3760 | 4 NVIDIA L4 | 96 |
| mem2_ssd2_gpu4_v2_x96 ² | 96 | 384 | 3760 | 4 NVIDIA L4 | 96 |
| mem2_ssd2_gpu8_v2_x192 ¹ ² | 192 | 768 | 7520 | 8 NVIDIA L4 | 192 |
| mem3_ssd1_gpu1_x16 ² | 16 | 128 | 600 | 1 NVIDIA L4 | 24 |
| mem3_ssd1_gpu1_x32 ² | 32 | 256 | 900 | 1 NVIDIA L4 | 24 |
| mem3_ssd1_gpu_x8 | 8 | 61 | 160 | 1 NVIDIA V100 | 16 |
| mem3_ssd1_gpu_x32 ¹ | 32 | 244 | 640 | 4 NVIDIA V100 | 64 |
| mem3_ssd1_gpu_x64 ¹ | 64 | 488 | 1,280 | 8 NVIDIA V100 | 128 |
| azure:mem3_ssd2_gpu4_x64 ¹ | 64 | 488 | 2,048 | 4 NVIDIA V100 | 64 |
Notes:
¹ High memory instance types are expensive. Use them with caution.
² Available only to special licensed users.
FPGA Instance Types
The following table shows all the FPGA instance types available on AWS.
| Name | FPGAs | Cores | Memory (GiB) | Storage (GB) |
| --- | --- | --- | --- | --- |
| mem3_ssd2_fpga1_x8 | 1 | 8 | 122 | 470 |
| mem3_ssd2_fpga1_x16 | 1 | 16 | 244 | 940 |
| mem3_ssd2_fpga1_x64 | 1 | 64 | 976 | 3760 |