Installing Spark
When you install Spark, two directories will be created:
/usr/hdp/current/spark-client   for submitting Spark jobs
/usr/hdp/current/spark-history  for launching Spark master processes, such as the Spark history server
Search for Spark in the HDP repo:
For RHEL or CentOS:
yum search spark
For SLES:
zypper search spark
For Ubuntu and Debian:
apt-cache search spark
This lists all the versions of Spark available in the repo. For example:
spark_2_3_2_0_2950-master.noarch : Server for Spark master
spark_2_3_2_0_2950-python.noarch : Python client for Spark
spark_2_3_2_0_2950-worker.noarch : Server for Spark worker
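The version stamp in these package names appears to be the HDP version with dots and dashes replaced by underscores (e.g. HDP 2.3.2.0-2950 becomes spark_2_3_2_0_2950). Assuming that mapping holds, a small helper (the function name is hypothetical) can derive the package suffix from an HDP version string:

```shell
# Sketch: derive the spark_<version> package suffix from an HDP version.
# Assumption: dots and dashes in the HDP version map to underscores,
# as in the example package names above.
hdp_pkg_suffix() {
  echo "$1" | tr '.-' '__'   # 2.3.2.0-2950 -> 2_3_2_0_2950
}

hdp_pkg_suffix "2.3.2.0-2950"
```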
Install the version corresponding to the HDP version you currently have installed.
For RHEL or CentOS:
yum install spark_<version>-master spark_<version>-python
For SLES:
zypper install spark_<version>-master spark_<version>-python
For Ubuntu and Debian:
apt-get install spark_<version>-master spark_<version>-python
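The install step differs only in the package manager used per platform. As a sketch, a helper (hypothetical, not part of HDP) could select the right command for a given distro family, with the version suffix filled in as shown above:

```shell
# Sketch: build the Spark install command for a distro family.
# The package names follow the text above; the function name and
# distro identifiers are assumptions for illustration.
spark_install_cmd() {
  local os="$1" ver="$2"
  case "$os" in
    rhel|centos)   echo "yum install spark_${ver}-master spark_${ver}-python" ;;
    sles)          echo "zypper install spark_${ver}-master spark_${ver}-python" ;;
    ubuntu|debian) echo "apt-get install spark_${ver}-master spark_${ver}-python" ;;
    *)             echo "unsupported distro: $os" >&2; return 1 ;;
  esac
}

spark_install_cmd centos 2_3_2_0_2950
```

Running the printed command (as root or via sudo) installs the Spark master and Python client packages for that HDP version.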

