Integrating Hudi 0.14.0 with Spark 3.2.3 (Spark Shell)

1 Startup

1.1 Starting the Spark Shell

# For Spark versions: 3.2 - 3.4
spark-shell --jars /path/to/jars/hudi-spark3.2-bundle_2.12-0.14.0.jar \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension' \
--conf 'spark.kryo.registrator=org.apache.spark.HoodieSparkKryoRegistrar'
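Once the shell is up, a quick way to verify the bundle is working is to write a small DataFrame as a Hudi table and read it back. The sketch below uses the standard `hoodie.datasource.write.*` options; the table name, base path, and sample rows are illustrative assumptions, not from the original post.

```scala
// Run inside spark-shell, where `spark` and spark.implicits._ are
// already in scope. Table name and path are hypothetical.
import org.apache.spark.sql.SaveMode

val tableName = "hudi_trips"
val basePath  = "file:///tmp/hudi_trips"

// A tiny DataFrame to write.
val df = Seq(
  ("id1", "rider-A", 27.70, "san_francisco"),
  ("id2", "rider-B", 33.90, "sao_paulo")
).toDF("uuid", "rider", "fare", "city")

// Write it as a Hudi table, keyed on `uuid` and partitioned by `city`.
df.write.format("hudi").
  option("hoodie.datasource.write.recordkey.field", "uuid").
  option("hoodie.datasource.write.partitionpath.field", "city").
  option("hoodie.table.name", tableName).
  mode(SaveMode.Overwrite).
  save(basePath)

// Read it back as a snapshot query.
val tripsDF = spark.read.format("hudi").load(basePath)
tripsDF.select("uuid", "rider", "fare", "city").show()
```

If the jar and the `spark.sql.extensions`/catalog settings from the command above are in effect, the final `show()` prints the two rows that were written.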
