jaceklaskowski
Mastering Apache Spark


Manish (@manishatgit) started discussion #113

a year ago · 1 comment


A Spark driver (aka an application’s driver process) is a JVM process that hosts SparkContext for a Spark application. It is the master node in a Spark application.

Driver

I am a little bit confused by the last statement. Is it always the master node that runs the driver process, or can it run anywhere in the cluster, as decided by the resource manager?

Jacek Laskowski @jaceklaskowski commented a year ago

You're right. It's a bit confusing, and I'm going to fix the confusion. When I said "the master node" I didn't mean to refer to the Spark infrastructure (the master and workers) but to the distinction between the driver and executors. You're right that the driver can be hosted on any worker node in a cluster if you use the --deploy-mode cluster command-line option.
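To illustrate the distinction, here is a minimal sketch of the two deploy modes with spark-submit. The master URL, class name, and jar file are hypothetical placeholders:

```shell
# Client mode (the default): the driver runs inside the spark-submit JVM,
# i.e. on the machine where you launch the command.
spark-submit \
  --master spark://master:7077 \
  --deploy-mode client \
  --class com.example.MyApp \
  app.jar

# Cluster mode: the cluster manager picks a worker node to host the driver,
# so the driver can end up anywhere in the cluster.
spark-submit \
  --master spark://master:7077 \
  --deploy-mode cluster \
  --class com.example.MyApp \
  app.jar
```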

Thanks for bringing it up! Very appreciated.

