Setup Steps for Spark on YARN
Posted: 2024-03-05 15:41:49
Thanks for your question! Here is a brief outline of the steps to set up Spark on YARN:
1. Download and extract Spark:
wget http://mirrors.tuna.tsinghua.edu.cn/apache/spark/spark-3.0.0-preview2/spark-3.0.0-preview2-bin-hadoop2.7.tgz
tar -zxvf spark-3.0.0-preview2-bin-hadoop2.7.tgz
2. Configure the environment variables:
export SPARK_HOME=/usr/local/spark/spark-3.0.0-preview2-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin
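To make these variables survive a new shell session, they are usually appended to a startup file. A minimal sketch, assuming a bash login shell and the install path from step 1 (adjust both to your environment):

```shell
# Persist SPARK_HOME and the updated PATH in ~/.bashrc (path is an assumption
# matching the extraction directory used above; change it to your install dir).
echo 'export SPARK_HOME=/usr/local/spark/spark-3.0.0-preview2-bin-hadoop2.7' >> ~/.bashrc
echo 'export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin' >> ~/.bashrc
# Reload the file so the current shell picks up the new variables.
source ~/.bashrc
```

After reloading, running `spark-submit --version` should print the Spark version banner if the paths are correct.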
3. Edit $SPARK_HOME/conf/spark-env.sh:
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
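In practice spark-env.sh often carries a few more entries than the single line above. A fuller sketch, where every path is an assumption to be matched against your own hosts:

```shell
# spark-env.sh sketch -- all paths below are assumptions; adjust to your layout.
export JAVA_HOME=/usr/local/java                       # JDK used by Spark processes
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop    # directory with core-site.xml, hdfs-site.xml
export YARN_CONF_DIR=$HADOOP_CONF_DIR                  # directory with yarn-site.xml
```

HADOOP_CONF_DIR (or YARN_CONF_DIR) is what lets spark-submit locate the ResourceManager, so it must point at the same configuration your Hadoop cluster actually uses.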
4. Configure the YARN-related parameters:
cp $SPARK_HOME/conf/spark-defaults.conf.template $SPARK_HOME/conf/spark-defaults.conf
vim $SPARK_HOME/conf/spark-defaults.conf
# Add the following settings
spark.master yarn
spark.driver.memory 1g
spark.executor.instances 2
spark.executor.memory 1g
spark.executor.cores 1
spark.yarn.queue default
spark.driver.extraJavaOptions -Dlog4j.configuration=file:/usr/local/spark/spark-3.0.0-preview2-bin-hadoop2.7/conf/log4j.properties
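Note that YARN grants each executor more memory than spark.executor.memory alone: Spark adds an off-heap overhead, by default max(384 MB, 10% of the executor memory), controlled by spark.executor.memoryOverhead. A small sketch of that arithmetic for the 1g setting above:

```python
# Sketch of how YARN sizes an executor container for the config above.
# Default overhead = max(384 MB, 10% of spark.executor.memory).

def yarn_container_mb(executor_memory_mb: int,
                      overhead_factor: float = 0.10,
                      min_overhead_mb: int = 384) -> int:
    """Total memory YARN must grant per executor container, in MB."""
    overhead = max(min_overhead_mb, int(executor_memory_mb * overhead_factor))
    return executor_memory_mb + overhead

# With spark.executor.memory=1g (1024 MB), 10% is 102 MB, so the 384 MB
# floor applies and each container needs 1024 + 384 MB.
print(yarn_container_mb(1024))  # 1408
```

If yarn.scheduler.maximum-allocation-mb on the cluster is below this container size, YARN will reject the request, which is a common first-run failure.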
5. Start HDFS and YARN, then submit a Spark application:
start-dfs.sh
start-yarn.sh
spark-submit --class com.xxx.xxx.xxx.xxx --master yarn --deploy-mode cluster --num-executors 2 --executor-memory 1g --executor-cores 1 xxx.jar
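Before submitting your own jar, it can help to smoke-test the cluster with the SparkPi example shipped in the distribution. A sketch, assuming the examples jar name used by this release (verify the exact filename under $SPARK_HOME/examples/jars first):

```shell
# Smoke test: run the bundled SparkPi example on YARN in cluster mode.
# The jar path/name below is an assumption for this release; check
# $SPARK_HOME/examples/jars for the actual filename.
spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn \
  --deploy-mode cluster \
  $SPARK_HOME/examples/jars/spark-examples_2.12-3.0.0-preview2.jar
```

In cluster mode the driver runs inside YARN, so the "Pi is roughly ..." output appears in the application logs, retrievable with `yarn logs -applicationId <appId>` once the job finishes.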
That covers the basic setup steps for Spark on YARN. If you have any other questions, feel free to ask and I will do my best to answer.