Spark on YARN Environment Configuration

1. Preparation

See the Spark Standalone HA environment configuration tutorial.

2. Modify the Configuration Files

1. Modify spark-env.sh

cd /export/server/spark/conf
vim /export/server/spark/conf/spark-env.sh
# Add the following lines
HADOOP_CONF_DIR=/export/server/hadoop-3.3.0/etc/hadoop/
YARN_CONF_DIR=/export/server/hadoop-3.3.0/etc/hadoop/
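In spark-env.sh, HADOOP_CONF_DIR and YARN_CONF_DIR tell Spark where to find the cluster's client configuration (core-site.xml, hdfs-site.xml, yarn-site.xml), which is how spark-submit locates the ResourceManager and HDFS. A quick sanity check, assuming the Hadoop layout used above, is to confirm those files are actually present:

# Sanity check: the directory Spark was pointed at should contain the client configs.
ls /export/server/hadoop-3.3.0/etc/hadoop/ | grep -E 'core-site|hdfs-site|yarn-site'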

# Distribute the updated spark-env.sh to the other nodes
cd /export/server/spark/conf
scp -r spark-env.sh node2:$PWD
scp -r spark-env.sh node3:$PWD

2. Modify Hadoop's yarn-site.xml

cd /export/server/hadoop-3.3.0/etc/hadoop/
vim /export/server/hadoop-3.3.0/etc/hadoop/yarn-site.xml
# Content to modify

<configuration>
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>node1</value>
    </property>
</configuration>
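The snippet above only covers editing yarn-site.xml on node1. On a multi-node cluster the modified file typically also needs to reach the other nodes, and YARN has to be restarted before the change takes effect; a sketch assuming the same /export/server layout and node1/node2/node3 hostnames used in this guide:

# Distribute the updated yarn-site.xml (paths and hostnames assumed from this guide).
cd /export/server/hadoop-3.3.0/etc/hadoop/
scp yarn-site.xml node2:$PWD
scp yarn-site.xml node3:$PWD
# Restart YARN on the ResourceManager host (node1); the scripts live in
# $HADOOP_HOME/sbin if they are not already on your PATH.
stop-yarn.sh
start-yarn.sh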

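Once the configuration is distributed and YARN is running, a quick way to confirm that Spark can talk to YARN is to submit the bundled SparkPi example. The jar path below is an assumption about where the examples jar sits in this install; adjust it to your Spark version and layout.

# Minimal smoke test, assuming Spark is installed under /export/server/spark
# and that the glob matches exactly one examples jar.
/export/server/spark/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  /export/server/spark/examples/jars/spark-examples_*.jar \
  10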