You can launch a Spark application distributed across a cluster of worker nodes. Because Spark is tightly integrated with the rest of the platform, Spark jobs inherit the capabilities of ordinary jobs: you can monitor a job's progress, SSH into a job instance to debug it, and use the usual features of dx-toolkit and the platform web UI. In addition, you have access to logs from the workers and can monitor the job in the Spark UI.
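As a rough sketch of that workflow, the standard dx-toolkit commands for running, watching, and connecting to a job apply to Spark jobs as well. The app name and instance settings below are hypothetical placeholders, not part of the original text:

```shell
# Launch a Spark app on a cluster of workers (app name and
# instance count/type here are illustrative placeholders).
dx run app-my-spark-app \
    --instance-count 4 \
    --instance-type mem1_ssd1_v2_x8 \
    -y

# Stream the job's log output as it runs
# (substitute the job ID printed by `dx run`).
dx watch job-xxxx

# SSH into the running job's driver instance to debug;
# the job must have been launched with SSH access enabled.
dx ssh job-xxxx
```

From inside the SSH session you can inspect worker logs and, with a port forwarded, open the Spark UI in a browser, just as you would for any Spark deployment.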