Spark job fails: The directory item limit of /spark_dir/spark_eventLogs is exceeded: limit=1048576 items=1048576

Error:

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.FSLimitException$MaxDirectoryItemsExceededException): The directory item limit of /spark_dir/spark_eventLogs is exceeded: limit=1048576 items=1048576

Cause:

The number of entries (files and subdirectories) in /spark_dir/spark_eventLogs has reached the HDFS per-directory item limit (dfs.namenode.fs-limits.max-directory-items, default 1048576). Each Spark application writes its event log into this directory, so the directory fills up over time.
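
To confirm which directory hit the limit, you can count its direct children and check the limit currently configured on the NameNode. These are standard HDFS CLI commands; the path is the one from the error message:

hdfs dfs -ls /spark_dir/spark_eventLogs | wc -l     # direct children ("Found N items" header adds one line)
hdfs getconf -confKey dfs.namenode.fs-limits.max-directory-items     # current limit, default 1048576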

Solution:

Edit hdfs-site.xml to raise the per-directory item limit, then restart the NameNode and DataNodes:

<property>
  <name>dfs.namenode.fs-limits.max-directory-items</name>
  <value>3200000</value>
  <description>Defines the maximum number of items that a directory may
      contain. Cannot set the property to a value less than 1 or more than
      6400000.</description>
</property>
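
Raising the limit only postpones the problem, because every new Spark application adds another event log to this directory. Assuming the directory is served by the Spark History Server, a complementary fix is to let it purge old event logs automatically. The properties below are the standard Spark history cleaner settings; the retention values are only example choices:

# spark-defaults.conf on the History Server host
spark.history.fs.cleaner.enabled    true
spark.history.fs.cleaner.interval   1d    # how often the cleaner checks for expired logs
spark.history.fs.cleaner.maxAge     7d    # event logs older than this are deleted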
