Spark: FileSourceScanExec Operator

Test case: read a Parquet file with whole-stage codegen disabled and inspect the physical plan, which shows the FileSourceScanExec (printed as FileScan) node.
test("SPARK decoder without codegen") {
    withSQLConf(SQLConf.WHOLESTAGE_CODEGEN_ENABLED.key -> "false") {
      spark.catalog.createTable("variance", "/mnt/DP_disk1/string_variance_value.gz.parquet", "parquet")
      val df = sql("select * from variance")
      df.show(4)
      df.explain(false)
    }
  }
# without codegen
== Physical Plan ==
FileScan parquet default.variance[col0_str#218] Batched: false, DataFilters: [], Format: Parquet, Location: InMemoryFileIndex[file:/mnt/DP_disk1/string_variance_value.gz.parquet], PartitionFilters: [], PushedFilters: [], ReadSchema: struct<col0_str:string>
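
For comparison, a minimal sketch of the same read with whole-stage codegen left enabled; the test name and the table name variance_codegen below are hypothetical and not from the original post. With codegen and the default vectorized Parquet reader, the FileScan node would typically report Batched: true for an atomic-type schema like this one, though the exact plan output depends on the Spark version.

// Sketch (assumed counterpart to the test above): same scan with codegen enabled,
// to compare the Batched flag of the FileScan node.
test("SPARK decoder with codegen") {
  withSQLConf(SQLConf.WHOLESTAGE_CODEGEN_ENABLED.key -> "true") {
    // Hypothetical table name, chosen to avoid clashing with the earlier "variance" table
    spark.catalog.createTable("variance_codegen", "/mnt/DP_disk1/string_variance_value.gz.parquet", "parquet")
    val df = sql("select * from variance_codegen")
    df.show(4)
    df.explain(false)   // expect the scan to appear inside a WholeStageCodegen stage
  }
}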
