Spark Configuration Options

./bin/spark-shell --master yarn --deploy-mode client
./bin/spark-submit --master yarn --deploy-mode cluster --class <main-class> <app-jar>

(Interactive shells such as spark-shell only run in client mode; launching in cluster mode requires spark-submit with a packaged application.)

There are two deploy modes that can be used to launch Spark applications on YARN.
In cluster mode, the Spark driver runs inside an application master process which is managed by YARN on the cluster, and the client can go away after initiating the application.
In client mode, the driver runs in the client process, and the application master is only used for requesting resources from YARN.
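To double-check which mode a session actually launched in, the effective settings can be read back from the SparkContext. A minimal sketch, run inside a spark-shell started as above (sc is the SparkContext the shell provides; the printed values are illustrative):

// Inside spark-shell: sc is already created by the shell.
println(sc.master)                                  // e.g. yarn
println(sc.getConf.get("spark.submit.deployMode"))  // e.g. client
println(sc.applicationId)                           // the YARN application id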

spark.yarn.am.cores (default: 1): Number of cores to use for the YARN Application Master in client mode. In cluster mode, use spark.driver.cores instead.
spark.executor.instances (default: 2): The number of executors for static allocation. With spark.dynamicAllocation.enabled, the initial set of executors will be at least this large.
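These properties can be set in conf/spark-defaults.conf, passed with --conf on the command line, or set programmatically before the session starts. A minimal Scala sketch, assuming an application launched through spark-submit in client mode; the app name and the chosen values (1 AM core, 4 executors) are only illustrative:

import org.apache.spark.sql.SparkSession

// Properties that size the YARN application master and the executors must be
// set before the SparkContext starts, so they go on the builder.
val spark = SparkSession.builder()
  .appName("yarn-conf-demo")                // hypothetical app name
  .config("spark.yarn.am.cores", "1")       // client-mode AM cores
  .config("spark.executor.instances", "4")  // static executor count
  .getOrCreate()

// Read an effective value back at runtime.
println(spark.conf.get("spark.executor.instances"))

spark.stop()

The same settings can equally be passed on the command line, for example:
./bin/spark-submit --master yarn --deploy-mode client --conf spark.executor.instances=4 --class <main-class> <app-jar>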
