Importing packages in the Spark shell: how to run a function from an external jar

I created a jar package from a project with this file tree:

build.sbt
src/main
src/main/scala
src/main/scala/Tester.scala
src/main/scala/main.scala

where Tester is a class with a function (named print()) and main.scala has an object that runs it and prints "Hi!" (from the Spark documentation).
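For reference, a minimal sketch of what those two files could look like, reconstructed from the description above (the exact names and bodies in the original project are assumptions):

// src/main/scala/Tester.scala (assumed contents)
class Tester {
  def print(): Unit = println("Hi!")  // the function the question refers to as print()
}

// src/main/scala/main.scala (assumed contents)
object main {
  def main(args: Array[String]): Unit = new Tester().print()
}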

I built a jar file with sbt successfully, and it worked well with spark-submit.

Now I want to add it to spark-shell and use the Tester class to create objects, and so on.

I added the jar file to spark-defaults.conf, but:

scala> val t = new Tester();
<console>:23: error: not found: type Tester
       val t = new Tester();
                   ^

Solution

You can try providing the jars with the --jars argument, as below:

./spark-shell --jars pathOfJarsCommaSeparated
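Once spark-shell starts with the jar on its classpath, the class should resolve. A sketch of the expected session, assuming the Tester class from the layout above (exact REPL output varies by Spark and Scala version):

scala> val t = new Tester()
t: Tester = Tester@...

scala> t.print()
Hi!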

Or you can add the following configuration to your spark-defaults.conf, but remember to remove the .template suffix from the end of spark-defaults.conf.template. Note that spark.driver.extraClassPath is a standard JVM classpath, so its entries are separated by the platform path separator (a colon on Linux/macOS, a semicolon on Windows) rather than by commas:

spark.driver.extraClassPath pathOfJarsColonSeparated
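If it still fails, a quick way to check whether the driver JVM actually picked up the jar is to look the class up by name from the REPL (this assumes Tester lives in the default package, as in the file tree above):

scala> Class.forName("Tester")
res0: Class[_] = class Tester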
