What are Spark tasks and jobs?

A Spark task is the smallest unit of work in a Spark application; each task runs a stage's computation on a single partition of data.

A Spark job is a collection of tasks with dependencies between them, triggered by an action. Each job is divided into one or more stages at shuffle boundaries; within a stage, the tasks are independent of each other and run in parallel, one per partition. Because transformations are lazy, it is each RDD action, not each transformation, that triggers a new job.
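A minimal sketch in Scala illustrates how these pieces fit together (the app name, data, and partition counts are illustrative, not from the source): `map` and `reduceByKey` are lazy transformations, the shuffle introduced by `reduceByKey` marks a stage boundary, and only the `collect` action actually submits a job.

```scala
import org.apache.spark.sql.SparkSession

// Sketch: how transformations, stages, tasks, and actions relate.
// Names and data below are illustrative assumptions.
object JobsAndTasksSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("jobs-and-tasks-sketch")
      .master("local[4]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Transformations are lazy: no job is submitted by these lines.
    val words  = sc.parallelize(Seq("a", "b", "a", "c", "b", "a"), numSlices = 4)
    val pairs  = words.map(w => (w, 1))   // narrow transformation, same stage
    val counts = pairs.reduceByKey(_ + _) // shuffle: marks a stage boundary

    // The action triggers one job. Spark splits it into two stages
    // (before and after the shuffle) and runs one task per partition.
    val result = counts.collect()
    result.foreach { case (w, n) => println(s"$w -> $n") }

    spark.stop()
  }
}
```

Running this locally, the Spark UI would show a single job for the `collect` call, split into two stages, with one task per partition in each stage.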
