Spark OOM

一、Background

        A Spark SQL job suddenly failed with an OOM error in production, even though it had run fine in testing. The error was:

Exception in thread "broadcast-exchange-12" 
java.lang.OutOfMemoryError: 
Not enough memory to build and broadcast the 
table to all worker nodes. As a workaround, 
you can either disable broadcast by setting 
spark.sql.autoBroadcastJoinThreshold to -1 
or increase the spark driver memory by 
setting spark.driver.memory to a higher value

二、Attempted Solutions:

①. Try the parameter suggested in the error message above.

spark.sql.autoBroadcastJoinThreshold=-1

Setting this to -1 disables automatic broadcast joins, so Spark no longer tries to build the table on the driver and ship it to every executor. This solved the problem.
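For reference, here is a minimal sketch of the common places to apply this setting; the script name `your-job.py` is a placeholder for your actual job:

```shell
# Hedged sketch, assuming a standard Spark deployment.
# "your-job.py" is a placeholder, not a real file from this post.

# 1) Per job, at submission time:
spark-submit \
  --conf spark.sql.autoBroadcastJoinThreshold=-1 \
  your-job.py

# 2) Per session, inside the Spark SQL CLI or a SQL script:
#    SET spark.sql.autoBroadcastJoinThreshold=-1;

# 3) Globally, in conf/spark-defaults.conf:
#    spark.sql.autoBroadcastJoinThreshold  -1
```

Note the trade-off: the default threshold is 10 MB, and -1 turns broadcast joins off entirely, so Spark falls back to shuffle-based joins (e.g. sort-merge join), which can be slower for genuinely small tables. The alternative mentioned in the error message, increasing spark.driver.memory, keeps broadcast joins enabled instead.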
