Hudi 0.14.0 Integration with Spark 3.2.3 (Spark SQL approach)

1 Integrating with Hive

For users who have Spark-Hive integration in their environment, this guide assumes that you have the appropriate settings configured to allow Spark to create tables and register in Hive Metastore.

Following the **{HIVE_HOME}/lib** approach from the earlier post "Summary of Ways to Add Third-Party JARs to Hive", we add the third-party jar as follows:

  1. Import hudi-hadoop-mr-bundle into Hive: move hudi-hadoop-mr-bundle-0.14.0.jar into {HIVE_HOME}/lib. The bundle jar is built under packaging/hudi-hadoop-mr-bundle/target.
  2. Restart hiveserver2 (a shell sketch of both steps follows this list).
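
A minimal sketch of the two steps above, assuming a source build of Hudi; the HUDI_SRC and HIVE_HOME paths are hypothetical placeholders for your environment, and the restart command depends on how you run HiveServer2:

```shell
# Assumed locations -- adjust to your environment
export HUDI_SRC=/opt/hudi-0.14.0   # hypothetical Hudi source checkout
export HIVE_HOME=/opt/hive         # hypothetical Hive install directory

# Step 1: copy the hadoop-mr bundle built under packaging/hudi-hadoop-mr-bundle/target
cp $HUDI_SRC/packaging/hudi-hadoop-mr-bundle/target/hudi-hadoop-mr-bundle-0.14.0.jar \
   $HIVE_HOME/lib/

# Step 2: restart hiveserver2 so the new jar is picked up
# (stop the running HiveServer2 process first, then start it again, e.g.)
nohup $HIVE_HOME/bin/hiveserver2 > /tmp/hiveserver2.log 2>&1 &
```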

2 Startup

# For Spark versions: 3.2 - 3.4
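
The launch command referenced by the comment above would look roughly like the following. This follows the Hudi 0.14.0 quickstart pattern for Spark SQL; the hudi-spark3.2-bundle_2.12 coordinates are an assumption for a Spark 3.2.3 / Scala 2.12 environment:

```shell
# Launch spark-sql with the Hudi Spark bundle and the configs Hudi requires
spark-sql --packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.14.0 \
  --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
  --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension' \
  --conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
  --conf 'spark.kryo.registrator=org.apache.spark.HoodieSparkKryoRegistrar'
```

If the machine cannot reach Maven Central, the same bundle jar can instead be passed with `--jars` after downloading or building it locally.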
