Execution of a Spark Application

Get familiar with the roles of the DAG scheduler and the task scheduler in the context of Spark.

As discussed earlier, a Spark job is initiated when an action is performed. Internally, this invokes the SparkContext object's runJob(...) method, and the call is passed on to the scheduler. The scheduler runs as part of the driver and consists of two parts:

  • DAG Scheduler

  • Task Scheduler
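To see why a job begins only at an action, here is a minimal pure-Python sketch (not Spark itself, and not the actual runJob(...) implementation) of the lazy-evaluation idea: transformations merely record lineage, and only an action such as collect() triggers execution, analogous to the call into SparkContext.runJob(...).

```python
class LazyDataset:
    """Toy stand-in for an RDD: records transformations, runs them on an action."""

    def __init__(self, data, ops=None):
        self.data = data
        self.ops = ops or []            # deferred transformations (the lineage)

    def map(self, f):                   # transformation: lazy, returns a new dataset
        return LazyDataset(self.data, self.ops + [("map", f)])

    def filter(self, p):                # transformation: lazy as well
        return LazyDataset(self.data, self.ops + [("filter", p)])

    def collect(self):                  # action: this is what kicks off the "job"
        out = list(self.data)
        for kind, fn in self.ops:       # replay the recorded lineage now
            if kind == "map":
                out = [fn(x) for x in out]
            else:
                out = [x for x in out if fn(x)]
        return out

ds = LazyDataset(range(5)).map(lambda x: x * x).filter(lambda x: x > 3)
# Nothing has executed yet; only the collect() action runs the pipeline.
print(ds.collect())  # → [4, 9, 16]
```

In real Spark, the same deferred lineage is what the DAG scheduler cuts into stages before the task scheduler ships tasks to executors.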
