
Related Tags

spark
cluster

How to run on a cluster in Spark

Educative Answers Team

Spark applications run as independent sets of processes on a cluster, coordinated by the SparkContext object in the driver program. Once connected, Spark acquires executors on the cluster's nodes, which are the processes that run the application's computations.

Connect an application to the cluster

To run an application on the Spark cluster, pass the spark://IP:PORT URL of the master to the SparkContext constructor.
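For example, a minimal Scala sketch of a driver program connecting to a standalone cluster might look like the following. The master URL spark://10.0.0.1:7077 and the application name are placeholders; 7077 is the default port of a standalone master.

    import org.apache.spark.{SparkConf, SparkContext}

    // Point the application at the cluster's master.
    // spark://10.0.0.1:7077 is a placeholder URL; replace it with
    // the address printed by your own master.
    val conf = new SparkConf()
      .setAppName("ClusterExample") // placeholder application name
      .setMaster("spark://10.0.0.1:7077")

    // The SparkContext in the driver program coordinates the executors.
    val sc = new SparkContext(conf)

    // A trivial job to confirm that the executors perform the work.
    println(sc.parallelize(1 to 100).reduce(_ + _))

    sc.stop()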

An interactive shell can also be run against the cluster by typing ./bin/spark-shell --master spark://IP:PORT.

The --total-executor-cores <number of cores> option can be passed to control the total number of cores that the spark-shell uses across the cluster, as in the example below.
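For instance, the following invocation (with a placeholder master address and an arbitrary total of 4 cores) connects a shell to the cluster and caps its core usage:

    ./bin/spark-shell --master spark://10.0.0.1:7077 --total-executor-cores 4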
