PySpark dependency environment setup

PySpark exception

py4j.protocol.Py4JJavaError: An error occurred while calling o117.sql.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3) (***bigdata-host executor 1): java.io.IOException: Cannot run program "python3": error=13, Permission denied
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
    at org.apache.spark.api.python.PythonWorkerFactory.startDaemon(PythonWorkerFactory.scala:222)
    at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:134)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:107)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:124)
    at org.apache.spark.api.python.BasePythonRunner.compute(PythonRunner.scala:166)
    at org.ap
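The root cause here is `error=13` (EACCES): the executor process tried to launch `python3` but the interpreter it found was not executable by the user running the Spark executor. One common fix is to verify the interpreter's permissions and point Spark at a known-good interpreter through the `PYSPARK_PYTHON` / `PYSPARK_DRIVER_PYTHON` environment variables (these must be set before the SparkSession is created). The sketch below assumes the interpreter is reachable as `python3` on `PATH`; the fallback path `/usr/bin/python3` is a placeholder, substitute a path that exists and is executable on every worker node.

```python
import os
import shutil

# error=13 is EACCES: the executor user cannot execute "python3".
# Resolve the interpreter and check that it is actually executable.
# "/usr/bin/python3" is a placeholder fallback path.
interpreter = shutil.which("python3") or "/usr/bin/python3"
assert os.access(interpreter, os.X_OK), f"{interpreter} is not executable"

# Tell both the driver and the executors which Python to launch.
# These must be set BEFORE SparkSession/SparkContext is created.
os.environ["PYSPARK_PYTHON"] = interpreter
os.environ["PYSPARK_DRIVER_PYTHON"] = interpreter
```

The same interpreter can also be supplied to `spark-submit` via the equivalent configuration keys, e.g. `--conf spark.pyspark.python=/usr/bin/python3`, which is useful when you cannot modify the job's environment.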
