# Apollo Apps

## Spark Applications

{% hint style="info" %}
A license is required to access Spark functionality on the DNAnexus Platform. [Contact DNAnexus Sales](mailto:sales@dnanexus.com) for more information.
{% endhint %}

Spark applications extend the existing app(let) framework. App(let)s have a [specification](https://documentation.dnanexus.com/developer/api/running-analyses/io-and-run-specifications) for their VM (instance type, OS, packages). That specification has been extended with an optional [cluster specification](https://documentation.dnanexus.com/developer/apps/developing-spark-apps#cluster-specifications) with `type=dxspark`.
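
As an illustration, a `dxapp.json` might declare such a cluster specification inside `systemRequirements`. The field values below (Spark version, instance count, ports, bootstrap script name) are placeholders for the sketch; consult the cluster specification documentation linked above for the authoritative schema:

```json
"systemRequirements": {
  "*": {
    "instanceType": "mem1_ssd1_x4",
    "clusterSpec": {
      "type": "dxspark",
      "version": "2.4.4",
      "initialInstanceCount": 3,
      "bootstrapScript": "install.sh"
    }
  }
}
```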

* Calling `/app(let)-xxxx/run` for a Spark app creates a Spark cluster along with a master VM.
* The master VM (where the app shell code runs) acts as the driver node for Spark.
* Code in the master VM leverages the Spark infrastructure.
* Job mechanisms (monitoring, termination, and management) are the same for Spark apps as for any other regular app(let)s on the Platform.
* Spark apps use the same platform `dx` communication between the master VM and the DNAnexus API servers as other apps.
* There's a new log collection [mechanism](https://documentation.dnanexus.com/developer/apps/developing-spark-apps#collecting-cluster-logs) to collect logs from all nodes.
* You can use the [Spark UI](https://documentation.dnanexus.com/developer/apps/developing-spark-apps#monitoring-the-spark-ui) to monitor a running job using SSH tunneling.

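As a sketch of the SSH-tunneling approach mentioned above, you can forward a local port to the Spark UI port on the master VM. The host address and the UI port shown here are placeholders, not confirmed values; see the Spark UI documentation linked above for the actual connection procedure and port:

```
# Forward local port 8081 to the same port on the master VM (-N: no remote
# command, -L: local port forward). <master-vm-address> and the port number
# are placeholders.
ssh -N -L 8081:localhost:8081 dnanexus@<master-vm-address>

# Then open http://localhost:8081 in a browser to view the Spark UI.
```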
When launched, a Spark app distributes its computation across the nodes of its Spark cluster.
