What is a Spark cluster, and what is Spark Standalone mode?
A Spark cluster is a group of machines working together to run Spark applications. In a Spark cluster, one main node (the Master) is responsible for coordinating and managing the worker nodes (Workers), which provide the resources on which tasks actually execute.
Spark Standalone mode is the cluster deployment mode that ships with Spark itself, so no external resource manager such as YARN, Mesos, or Kubernetes is required: a cluster is formed simply by launching a Master process and one or more Worker processes. In Standalone mode, the Master manages resource allocation and application scheduling across the cluster, while the Workers launch executors that carry out the specific tasks. Users run Spark applications in Standalone mode by submitting them to the Master node.
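As a minimal sketch of what submitting to a Standalone cluster looks like in application code, the snippet below points a SparkSession at a Standalone Master. The URL `spark://master-host:7077` is a placeholder; substitute your Master's actual address (7077 is the default Master port, and the URL is shown on the Master's web UI).

```scala
import org.apache.spark.sql.SparkSession

object StandaloneExample {
  def main(args: Array[String]): Unit = {
    // Connect to the Standalone Master; it allocates executors on the Workers.
    // "spark://master-host:7077" is a placeholder for your Master's URL.
    val spark = SparkSession.builder()
      .appName("StandaloneExample")
      .master("spark://master-host:7077")
      .getOrCreate()

    // The driver coordinates the job; the tasks run on executors
    // launched by the Worker nodes.
    val evenCount = spark.range(0, 1000000).filter("id % 2 = 0").count()
    println(s"Even numbers: $evenCount")

    spark.stop()
  }
}
```

In practice the master URL is usually left out of the code and passed at submission time instead, e.g. `spark-submit --master spark://master-host:7077 ...`, which keeps the same application portable across Standalone, YARN, and Kubernetes deployments.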