Problems encountered while using Hadoop

1. Error when running bin/hadoop fs -put conf input

WARN hdfs.DFSClient: DataStreamer Exception: java.lang.NumberFormatException: For input string: "0:0:0:0:0:0:1:50010"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
        at java.lang.Integer.parseInt(Integer.java:458)
        at java.lang.Integer.parseInt(Integer.java:499)
        at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:155)
        at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:129)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:855)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:820)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:427)
put: DFSOutputStream is closed

 

Cause: 0:0:0:0:0:0:1:50010 is an IPv6 address (the IPv6 loopback ::1 followed by the DataNode port 50010), but Hadoop does not yet support IPv6. Because the IPv6 address itself contains colons, NetUtils.createSocketAddr cannot split it into host and port, and Integer.parseInt fails on the whole string. The machine needs to be switched to IPv4.

Fix: disable IPv6 support in /etc/hosts by commenting out the following lines:

# The following lines are desirable for IPv6 capable hosts
#::1 localhost ip6-localhost ip6-loopback
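As an alternative (or in addition) to editing /etc/hosts, a commonly recommended workaround is to force the JVM itself to prefer the IPv4 stack. A minimal sketch, assuming a default installation where Hadoop reads its environment from conf/hadoop-env.sh:

```shell
# conf/hadoop-env.sh
# Force the JVM to use IPv4 only, so Hadoop never resolves
# addresses like ::1 (0:0:0:0:0:0:1) for the DataNode.
export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"
```

After changing either file, restart the Hadoop daemons so the new setting takes effect.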
