You can start a Spark Standalone master (aka standalone Master) using `sbin/start-master.sh` and stop it using `sbin/stop-master.sh`.

The `sbin/start-master.sh` script starts a Spark master on the machine the script is executed on:

```
./sbin/start-master.sh
```

The script prepares the command line to start the class `org.apache.spark.deploy.master.Master` and by default runs it as follows:

```
org.apache.spark.deploy.master.Master \
  --ip japila.local --port 7077 --webui-port 8080
```
Note: The command sets the SPARK_PRINT_LAUNCH_COMMAND environment variable to print out the launch command to standard error output. Refer to Print Launch Command of Spark Scripts.
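As a quick check, you can set the variable inline when starting the master (a usage sketch; it assumes you run it from the home directory of a Spark distribution):

```shell
# Print the launch command to standard error before starting the master
SPARK_PRINT_LAUNCH_COMMAND=1 ./sbin/start-master.sh
```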
The script supports starting Tachyon using the `--with-tachyon` command-line option. It assumes that the `tachyon/bin/tachyon` command is available in Spark's home directory.
The script uses the following helper scripts:

- `sbin/spark-config.sh`
- `bin/load-spark-env.sh`
- `conf/spark-env.sh` - contains environment variables of a Spark executable.
Ultimately, the script calls `sbin/spark-daemon.sh start` to kick off `org.apache.spark.deploy.master.Master` with parameter `1` and the `--ip`, `--port`, and `--webui-port` command-line options.
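The wiring above can be sketched in plain shell. The defaults below (the local hostname, port 7077, web UI port 8080) match the values shown earlier; the `echo` merely stands in for the real `sbin/spark-daemon.sh` call, so this is an illustration of how the command line is assembled, not the actual script:

```shell
#!/bin/sh
# Sketch (not the real start-master.sh): assemble the spark-daemon.sh
# invocation from the usual SPARK_MASTER_* environment variables,
# falling back to the documented defaults when they are unset.
CLASS="org.apache.spark.deploy.master.Master"
SPARK_MASTER_HOST="${SPARK_MASTER_HOST:-$(hostname)}"
SPARK_MASTER_PORT="${SPARK_MASTER_PORT:-7077}"
SPARK_MASTER_WEBUI_PORT="${SPARK_MASTER_WEBUI_PORT:-8080}"

# Parameter 1 is the daemon instance number; the options follow it.
LAUNCH_CMD="sbin/spark-daemon.sh start $CLASS 1 --ip $SPARK_MASTER_HOST --port $SPARK_MASTER_PORT --webui-port $SPARK_MASTER_WEBUI_PORT"
echo "$LAUNCH_CMD"
```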
You can use the following command-line options:

- `--host` or `-h` - the hostname to listen on; overrides SPARK_MASTER_HOST.
- `--ip` or `-i` - (deprecated) the IP to listen on.
- `--port` or `-p` - command-line version of SPARK_MASTER_PORT that overrides it.
- `--webui-port` - command-line version of SPARK_MASTER_WEBUI_PORT that overrides it.
- `--properties-file` (default: `$SPARK_HOME/conf/spark-defaults.conf`) - the path to a custom Spark properties file.
- `--help` - prints out help.
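For example, the options can be passed to `sbin/start-master.sh`, which forwards them to the Master class (a hypothetical invocation; the port value is an example):

```shell
# Example: bind the master's web UI to port 8081 instead of the default 8080
./sbin/start-master.sh --webui-port 8081

# Stop the master again
./sbin/stop-master.sh
```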