$ tar -zxvf hadoop-2.8.1.tar.gz
core-site.xml:
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://0.0.0.0:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/Users/eleme/Documents/ProgramFiles/apache-software-foundation/hadoop-2.7.3/temp</value>
  </property>
</configuration>
hdfs-site.xml:
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/Users/eleme/Documents/ProgramFiles/apache-software-foundation/hadoop-2.7.3/tmp/hdfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <!-- note: file:/Users/... with a single slash; file://Users/... would make "Users" the URI authority -->
    <value>file:/Users/eleme/Documents/ProgramFiles/apache-software-foundation/hadoop-2.7.3/tmp/hdfs/data</value>
  </property>
  <property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>localhost:9001</value>
  </property>
  <property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
  </property>
</configuration>
mapred-site.xml:
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>mapreduce.admin.user.env</name>
    <value>HADOOP_MAPRED_HOME=$HADOOP_COMMON_HOME</value>
  </property>
  <property>
    <name>yarn.app.mapreduce.am.env</name>
    <value>HADOOP_MAPRED_HOME=$HADOOP_COMMON_HOME</value>
  </property>
</configuration>
yarn-site.xml:
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>
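A malformed *-site.xml file will make the daemons fail at startup, so it can be worth sanity-checking the files before formatting anything. A minimal sketch, assuming python3 is installed (any XML parser would do; the file written to /tmp here is just a stand-in for your real config):

```shell
# Write a sample yarn-site.xml and verify it is well-formed XML.
cat > /tmp/yarn-site.xml <<'EOF'
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>
EOF
python3 -c "import xml.etree.ElementTree as ET; ET.parse('/tmp/yarn-site.xml'); print('OK')"
```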
hdfs namenode -format
Start HDFS and YARN:
start-dfs.sh
start-yarn.sh
hdfs dfs -mkdir -p /user/yonglu/input
hdfs dfs -put $HADOOP_HOME/etc/hadoop/*.xml input
hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.0.0-alpha1.jar grep input output 'dfs[a-z.]+'
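The grep example job counts every match of the regular expression 'dfs[a-z.]+' in the input files. The same pattern can be tried locally with plain grep to see what it will match; a quick sketch on sample text, not part of the Hadoop job itself:

```shell
# Apply the job's regex to sample property names. Note that the hyphen stops
# the match, which is why dfs.namenode.secondary.http-address appears
# truncated as "dfs.namenode.secondary.http" in the job output below.
printf '%s\n' 'dfs.replication' 'dfs.namenode.secondary.http-address' \
  | grep -oE 'dfs[a-z.]+'
```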
View the results:
hdfs dfs -cat output/part-r-00000
16/12/17 12:56:25 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
dfsadmin
dfs.webhdfs.enabled
dfs.replication
dfs.namenode.secondary.http
dfs.namenode.name.dir
dfs.datanode.data.dir
If the following warning appears during startup, the dynamic libraries under lib/native need to be recompiled and reinstalled:
Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Install protobuf, Maven, and CMake (search online for installation instructions for your platform), then compile the native libraries with:
mvn package -Pdist,native -DskipTests -Dtar
On macOS, point the build at Homebrew's OpenSSL before running mvn:
export OPENSSL_ROOT_DIR=/usr/local/Cellar/openssl/1.0.2k
export OPENSSL_INCLUDE_DIR=/usr/local/Cellar/openssl/1.0.2k/include
Copy the compiled native libraries into the corresponding directory of the downloaded binary Hadoop 2.7.0 distribution.
The compiled native libraries are located at:
hadoop-2.7.0-src/hadoop-dist/target/hadoop-2.7.0/lib/native
Copy them into the binary Hadoop 2.7.0 directory:
hadoop-2.7.0/lib/native
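The copy step above can be sketched end-to-end. The mkdir/touch lines below only simulate the build output tree so the snippet runs on its own; in a real build, mvn produces these files, and libhadoop.so here stands in for whatever the build actually emits:

```shell
# Simulate the native libraries produced under the build output tree
mkdir -p hadoop-2.7.0-src/hadoop-dist/target/hadoop-2.7.0/lib/native
touch hadoop-2.7.0-src/hadoop-dist/target/hadoop-2.7.0/lib/native/libhadoop.so

# Copy the freshly built native libraries into the binary distribution
mkdir -p hadoop-2.7.0/lib/native
cp -R hadoop-2.7.0-src/hadoop-dist/target/hadoop-2.7.0/lib/native/. hadoop-2.7.0/lib/native/
ls hadoop-2.7.0/lib/native
```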