1. Modify spark-defaults.conf

vim conf/spark-defaults.conf
Add the following lines:
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://hadoop1:9000/directory
spark.yarn.historyServer.address hadoop1:18080
spark.history.ui.port            18080
Then create the event-log directory in HDFS:
hadoop fs -mkdir /directory
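A note on syntax: Spark loads spark-defaults.conf through a Java properties reader, so a key may be separated from its value by whitespace or by `=` (both forms appear in configs in the wild, and both resolve to the same property). A minimal, illustrative sketch of that key/value splitting, assuming the simple common cases only (the real java.util.Properties loader also handles escapes and line continuations):

```python
# Sketch of how properties-style lines in spark-defaults.conf resolve to
# key/value pairs: the key ends at the first '=', ':', or run of whitespace.
# Illustrative only; Spark uses java.util.Properties under the hood.
import re

def parse_defaults(text):
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        # split the key from the value on the first '=', ':', or whitespace
        m = re.match(r"([^=:\s]+)\s*[=:\s]\s*(.*)", line)
        if m:
            props[m.group(1)] = m.group(2)
    return props

conf = parse_defaults("""
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://hadoop1:9000/directory
spark.yarn.historyServer.address=hadoop1:18080
spark.history.ui.port=18080
""")
print(conf["spark.eventLog.dir"])     # hdfs://hadoop1:9000/directory
print(conf["spark.history.ui.port"])  # 18080
```

Note how the whitespace-separated and `=`-separated lines parse identically, which is why the mixed styles both work in spark-defaults.conf.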

2. Configure spark-env.sh

vim conf/spark-env.sh
export SPARK_HISTORY_OPTS="-Dspark.history.ui.port=18080 -Dspark.history.fs.logDirectory=hdfs://hadoop1:9000/directory -Dspark.history.retainedApplications=30"
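SPARK_HISTORY_OPTS is simply a whitespace-separated list of `-Dkey=value` JVM system properties passed to the history server: here the UI port, the HDFS directory to read event logs from, and the number of application UIs kept cached in memory (older on-disk logs remain available and are reloaded on demand). A small sketch of how such a string is assembled, using a hypothetical helper that is not part of Spark:

```python
# Hypothetical helper: render a dict of Spark properties into the
# "-Dkey=value -Dkey=value ..." form expected in SPARK_HISTORY_OPTS.
def to_history_opts(props):
    return " ".join(f"-D{key}={value}" for key, value in props.items())

opts = to_history_opts({
    "spark.history.ui.port": 18080,
    "spark.history.fs.logDirectory": "hdfs://hadoop1:9000/directory",
    "spark.history.retainedApplications": 30,
})
print(opts)
# -Dspark.history.ui.port=18080 -Dspark.history.fs.logDirectory=hdfs://hadoop1:9000/directory -Dspark.history.retainedApplications=30
```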

3. Start the history server

sbin/start-history-server.sh

4. Submit a job

bin/spark-submit \
--class org.apache.spark.examples.SparkPi \
--master yarn \
--deploy-mode client \
./examples/jars/spark-examples_2.12-3.1.3.jar \
10
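The SparkPi example estimates π by Monte Carlo sampling: each task throws random points into a square and counts how many land inside the inscribed circle; the trailing argument (10) controls how many parallel slices the sampling is split into. A minimal non-Spark sketch of the same estimate, so you can see what the submitted job computes:

```python
# Monte Carlo pi estimate -- the same idea SparkPi parallelizes across
# tasks: sample points in [-1, 1]^2 and count those inside the unit circle.
import random

def estimate_pi(samples, seed=42):
    rng = random.Random(seed)  # fixed seed for a reproducible estimate
    inside = 0
    for _ in range(samples):
        x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
        if x * x + y * y <= 1:
            inside += 1
    # area ratio of circle to square is pi/4
    return 4.0 * inside / samples

print(estimate_pi(100_000))  # roughly 3.14
```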

5. View the job history

Open the YARN web UI at http://hadoop1:8088/cluster and follow the History link of a finished application, or go straight to the history server UI at http://hadoop1:18080 (the port configured above).
Last modified 2022-07-26 18:54:24