How Spark executes a job on the cluster

In cluster mode, when a job is submitted for execution, it is sent to the driver (or master) node. The driver node creates a DAG for the job and decides which executor (or worker) nodes will run specific tasks.
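As a rough illustration of this flow, here is a minimal Scala sketch (the application name and HDFS paths are hypothetical): the transformations only describe the lineage that the driver turns into a DAG, and the final action is what actually triggers the driver to split the DAG into stages and tasks and schedule them on the executors.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("word-count"))

    // Transformations are lazy: they only describe the DAG the driver will build.
    val counts = sc.textFile("hdfs:///tmp/input.txt")   // hypothetical input path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    // The action triggers the job: the driver turns the DAG into stages and
    // tasks and schedules them on the executor nodes.
    counts.saveAsTextFile("hdfs:///tmp/output")          // hypothetical output path

    sc.stop()
  }
}
```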

The driver then instructs the workers to execute their tasks and to return the results when done. Before that happens, however, the driver prepares each task's closure: a set of variables and methods present on the driver that the worker needs to execute its task on the RDD.

This set of variables and methods is inherently static within the executors' context; that is, each executor gets a copy of the variables and methods from the driver. If, while running the task, the executor alters these variables or overwrites the methods, it does so without affecting either the other executors' copies or the originals on the driver. This can lead to unexpected behavior and runtime bugs that are sometimes very hard to track down.
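The classic example of this pitfall is mutating a driver-side variable from inside a task. The sketch below (assuming a local or cluster SparkContext is available) shows that each executor increments its own copy of the counter, so the driver's value never changes; aggregating with an action such as reduce is the reliable alternative.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ClosureExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("closure-example"))

    var counter = 0
    val rdd = sc.parallelize(1 to 100)

    // Each executor receives its own copy of `counter` inside the task closure.
    // The increments happen on those copies, not on the driver's variable.
    rdd.foreach(x => counter += x)

    // Still 0 on the driver: the executors' updates never travel back.
    println(s"counter = $counter")

    // To aggregate results on the driver, use an action instead.
    val sum = rdd.reduce(_ + _)
    println(s"sum = $sum")

    sc.stop()
  }
}
```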
