Spark metrics

  • peak memory

Both HashAggregateExec and SortExec register "peakMemory" as a size metric (grep results from the Spark source tree):

./sql/core/src/main/scala/org/apache/spark/sql/execution/aggregate/HashAggregateExec.scala: "peakMemory" -> SQLMetrics.createSizeMetric(sparkContext, "peak memory"),
./sql/core/src/main/scala/org/apache/spark/sql/execution/SortExec.scala: "peakMemory" -> SQLMetrics.createSizeMetric(sparkContext, "peak memory"),
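
For context, here is a minimal, self-contained sketch of creating and updating such a metric. SQLMetrics.createSizeMetric and SQLMetric.set are the same Spark APIs used above; the object name, the local-mode session, and the 32 MiB figure are illustrative only.

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.execution.metric.SQLMetrics

  object PeakMemoryMetricSketch {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder().master("local[1]").appName("metric-sketch").getOrCreate()

      // Same call as in HashAggregateExec / SortExec: a SIZE-typed SQLMetric,
      // rendered in the UI with byte units (B, KiB, MiB, ...).
      val peakMemory = SQLMetrics.createSizeMetric(spark.sparkContext, "peak memory")

      // An operator records the largest memory footprint seen during the task,
      // e.g. the hash map's or the external sorter's peak bytes.
      peakMemory.set(32L * 1024 * 1024) // pretend the task peaked at 32 MiB

      println(s"peak memory = ${peakMemory.value} bytes")
      spark.stop()
    }
  }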

  • In src/test/scala/org/apache/spark/sql/execution/metric/SQLMetricsTestUtils.scala, the test utilities validate the rendered value of a size metric against a regex, sizeMetricPattern:
  // Pattern of size SQLMetric value, e.g. "\n96.2 MiB (32.1 MiB, 32.1 MiB, 32.1 MiB (stage 0.0:
  // task 4))" OR "\n96.2 MiB (32.1 MiB, 32.1 MiB, 32.1 MiB)"
  protected val sizeMetricPattern 
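
The val's body is truncated above. Below is a plausible reconstruction wrapped in a small runnable object that checks both example renderings; it is hedged from the comment's two examples, and the authoritative regex lives in Spark's SQLMetricsTestUtils.scala and may differ across versions.

  object SizeMetricPatternSketch {
    // Three byte quantities "total (min, med, max)", each "<number> <unit>",
    // plus an optional "(stage N.N: task N)" annotation after the max.
    private val bytes = "([0-9]+(\\.[0-9]+)?) (EiB|PiB|TiB|GiB|MiB|KiB|B)"
    private val maxMetrics = "\\(stage [0-9]+\\.[0-9]+: task [0-9]+\\)"
    val sizeMetricPattern = s"\\n$bytes \\($bytes, $bytes, $bytes( $maxMetrics)?\\)"

    def main(args: Array[String]): Unit = {
      val withStage = "\n96.2 MiB (32.1 MiB, 32.1 MiB, 32.1 MiB (stage 0.0: task 4))"
      val plain = "\n96.2 MiB (32.1 MiB, 32.1 MiB, 32.1 MiB)"
      println(withStage.matches(sizeMetricPattern)) // expect: true
      println(plain.matches(sizeMetricPattern))     // expect: true
    }
  }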
