Fixing the EOFException thrown when a Spark job dispatches tasks to executors

18/11/28 17:15:23 ERROR Inbox: Ignoring error
java.io.EOFException
at java.io.DataInputStream.readFully(DataInputStream.java:197)
at java.io.DataInputStream.readUTF(DataInputStream.java:609)
at java.io.DataInputStream.readUTF(DataInputStream.java:564)
at org.apache.spark.scheduler.TaskDescription$$anonfun$deserializeStringLongMap$1.apply$mcVI$sp(TaskDescription.scala)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$receive$1.applyOrElse(CoarseGrainedExecutorBackend.scala:96)
at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:117)
at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:205)
at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101)
at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:221)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

The Spark jars used by the application must be the same version as the Spark running on the cluster. Otherwise the driver and executors cannot deserialize each other's RPC messages (here, the TaskDescription), which surfaces as the EOFException shown above.
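As a minimal sketch (the original note does not show any build configuration), a build.sbt along these lines keeps the application's Spark dependencies aligned with the cluster and marked "provided", so the executors load the cluster's own jars instead of a bundled, mismatched copy. The version string "2.3.1" and the module list are assumptions for illustration; set the version to whatever `spark-submit --version` reports on the cluster.

// build.sbt -- sketch only; sparkVersion is an assumed placeholder,
// replace it with the version reported by `spark-submit --version` on the cluster.
val sparkVersion = "2.3.1"

libraryDependencies ++= Seq(
  // "provided": compile against these APIs but do not package them into the
  // application jar, so at runtime the executors use the cluster's own Spark jars
  // and the TaskDescription wire format matches on both sides.
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
)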
