Problems encountered while installing Spark

1.
Error:
spark Caused by: java.lang.ClassNotFoundException: com.hadoop.compression.lzo.LzoCodec
or
spark ERROR lzo.LzoCodec: Cannot load native-lzo without native-hadoop
Reference: http://www.linuxidc.com/Linux/2013-09/90012.htm
Solution: point spark-shell at the native Hadoop library directories and the hadoop-lzo jar via --driver-library-path:
./spark-shell --driver-library-path :/usr/local/hadoop-1.1.2/lib/native/Linux-i386-32:/usr/local/hadoop-1.1.2/lib/native/Linux-amd64-64:/usr/local/hadoop-1.1.2/lib/hadoop-lzo-0.4.17-SNAPSHOT.jar
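An alternative worth noting (a sketch, assuming a Spark version that reads conf/spark-defaults.conf and the same Hadoop 1.1.2 paths as in the command above) is to set the equivalent options once in the defaults file, so every spark-shell session picks them up without the long command line. Spark distinguishes the native-library search path from the JVM classpath, so the native directory and the LZO jar go into separate settings:

```
# conf/spark-defaults.conf — hypothetical fragment; paths copied from the command above
# Native liblzo/libhadoop search path (pick the directory matching your architecture)
spark.driver.extraLibraryPath  /usr/local/hadoop-1.1.2/lib/native/Linux-amd64-64
# The hadoop-lzo jar providing com.hadoop.compression.lzo.LzoCodec
spark.driver.extraClassPath    /usr/local/hadoop-1.1.2/lib/hadoop-lzo-0.4.17-SNAPSHOT.jar
```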
