Persisting HDFS + HBase to Azure Storage
HBase version: hbase-2.5.10-hadoop3-bin.tar.gz
Hadoop version: hadoop-3.3.6.tar.gz
Environment variable configuration: vim /etc/profile
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64 (OpenJDK 8 specifically is not mandatory)
export HBASE_HOME=/usr/local/hbase
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_OPTIONAL_TOOLS=hadoop-azure (required for Hadoop to persist to Azure)
#export HBASE_CLASSPATH=/usr/local/hadoop/share/hadoop/tools/lib/hadoop-azure-3.3.6.jar:/usr/local/hadoop/share/hadoop/tools/lib/hadoop-azure-datalake-3.3.6.jar:$HBASE_CLASSPATH
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HBASE_HOME/bin
export CLASSPATH=$CLASSPATH:/usr/local/hadoop/share/hadoop/tools/lib:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
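After editing, reload the profile so the variables take effect in the current shell:
source /etc/profile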
Hadoop setup:
tar xf hadoop-3.3.6.tar.gz -C /usr/local/
cd /usr/local/
mv hadoop-3.3.6 hadoop
cd hadoop
vim etc/hadoop/hdfs-site.xml
(The original property block was lost here; it set the replica count, i.e. dfs.replication.)
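A minimal sketch of what this hdfs-site.xml plausibly contained; the value 1 is an assumption for a single-node setup, not the original value:
<configuration>
  <!-- replica count -->
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>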
vim etc/hadoop/core-site.xml
(The original property blocks were lost here; they set the authentication type to OAuth and the client credentials taken from the App registration: in the portal, open Azure Active Directory, then App registrations.)
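A sketch of the OAuth configuration this file carried, using the standard hadoop-azure (ABFS) property names; fs.defaultFS and all the <...> placeholders are assumptions to be replaced with values from your own App registration:
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value> <!-- assumed single-node default -->
  </property>
  <!-- authentication type: OAuth -->
  <property>
    <name>fs.azure.account.auth.type</name>
    <value>OAuth</value>
  </property>
  <property>
    <name>fs.azure.account.oauth.provider.type</name>
    <value>org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider</value>
  </property>
  <!-- from the portal: Azure Active Directory -> App registrations -->
  <property>
    <name>fs.azure.account.oauth2.client.id</name>
    <value><application-client-id></value>
  </property>
  <property>
    <name>fs.azure.account.oauth2.client.secret</name>
    <value><client-secret></value>
  </property>
  <property>
    <name>fs.azure.account.oauth2.client.endpoint</name>
    <value>https://login.microsoftonline.com/<tenant-id>/oauth2/token</value>
  </property>
</configuration>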
vim etc/hadoop/hadoop-env.sh
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
If HDFS is started as root, add the following to sbin/start-dfs.sh and sbin/stop-dfs.sh:
HDFS_DATANODE_USER=root
HDFS_DATANODE_SECURE_USER=root
HDFS_NAMENODE_USER=root
HDFS_SECONDARYNAMENODE_USER=root
HDFS_ZKFC_USER=root
HDFS_JOURNALNODE_USER=root
Start HDFS:
./sbin/start-dfs.sh
Stop HDFS:
./sbin/stop-dfs.sh
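A quick sanity check after startup (assuming jps from the JDK is on the PATH); the three HDFS daemons should be listed:
jps
# 12345 NameNode            (pids will differ)
# 12346 DataNode
# 12347 SecondaryNameNode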
Install HBase:
tar xf hbase-2.5.10-hadoop3-bin.tar.gz -C /usr/local/
cd /usr/local/
mv hbase-2.5.10-hadoop3 hbase
cd hbase
vim conf/hbase-site.xml
(The original property blocks were lost here; they set the storage path, i.e. hbase.rootdir, and related cluster settings.)
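A sketch of what this hbase-site.xml plausibly contained, pointing the HBase root at Azure Storage; hbase.rootdir and hbase.cluster.distributed are the standard property names, while the container/account placeholders and the distributed flag are assumptions:
<configuration>
  <!-- storage path: HBase root on an ADLS Gen2 container -->
  <property>
    <name>hbase.rootdir</name>
    <value>abfss://<container>@<account>.dfs.core.windows.net/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
</configuration>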
Because HBase data is persisted to Azure Storage and the connection must be authenticated, a core-site.xml is needed on the HBase side as well:
vim conf/core-site.xml
(Same as above: the lost property blocks set the OAuth client credentials from the App registration; in the portal, open Azure Active Directory, then App registrations.)
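Since these are the same fs.azure.* OAuth properties already set in Hadoop's core-site.xml, one workable approach (an assumption, not a step from the original) is simply to copy that file over:
cp /usr/local/hadoop/etc/hadoop/core-site.xml /usr/local/hbase/conf/core-site.xml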
Add the dependencies HBase needs to store to Azure (copy rather than move, so Hadoop's own HADOOP_OPTIONAL_TOOLS loading of hadoop-azure keeps working):
cp /usr/local/hadoop/share/hadoop/tools/lib/hadoop-azure-* /usr/local/hbase/lib/
Without these two jars, startup fails with a missing-dependency error like:
ERROR [main] regionserver.HRegionServer: Failed construction RegionServer
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/statistics/IOStatisticsSource
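A quick check that the jars landed where HBase picks them up (the names below match the versions used here):
ls /usr/local/hbase/lib/ | grep hadoop-azure
# hadoop-azure-3.3.6.jar
# hadoop-azure-datalake-3.3.6.jar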
vim conf/hbase-env.sh
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HBASE_CLASSPATH=$HBASE_CLASSPATH:/usr/local/hadoop/share/hadoop/common/hadoop-common-3.3.6.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-shaded-guava-1.1.1.jar
Without these classpath entries, the master or regionserver fails to start with:
ERROR [main] regionserver.HRegionServer: Failed construction RegionServer
java.lang.NoClassDefFoundError: org/apache/hadoop/thirdparty/com/google/common/base/Preconditions
at ...
If the regionserver/master still will not start after all of the above,
try deleting /hbase from ZooKeeper first:
./bin/zkCli.sh
deleteall /hbase
Then reformat the namenode:
./bin/hdfs namenode -format
delete Hadoop's datanode data directory (see the example below),
then start HDFS and HBase.
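For example, assuming the default location under hadoop.tmp.dir (the exact path depends on the dfs.datanode.data.dir / hadoop.tmp.dir settings in your setup):
rm -rf /tmp/hadoop-root/dfs/data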
Start HBase: ./bin/start-hbase.sh
Stop HBase: ./bin/stop-hbase.sh
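To confirm the whole chain works, a quick smoke test in the HBase shell (the table and column family names are arbitrary examples):
./bin/hbase shell
status
create 't1', 'cf'
put 't1', 'row1', 'cf:a', 'value1'
scan 't1'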