Setup via Code Change
This approach configures and registers the listener programmatically. Include the library as a dependency in your project.

Use this approach when the JVM is launched before you have a chance to set your properties on it — which appears to be the case on Databricks.
```scala
import com.amadeus.perfgazer.{JsonSink, PerfGazer, PerfGazerConfig}

val jsonSink = new JsonSink(
  JsonSink.Config(
    destination = "/dbfs/perfgazer/v1/",
    writeBatchSize = 100,
    fileSizeLimit = 10L * 1024
  ),
  spark.conf
)

val perfGazerConfig = PerfGazerConfig(
  sqlEnabled = true,
  jobsEnabled = true,
  stagesEnabled = true,
  tasksEnabled = false,
  maxCacheSize = 100
)

val perfGazer = new PerfGazer(perfGazerConfig, jsonSink)

// Register the listener
spark.sparkContext.addSparkListener(perfGazer)

// Your Spark code here ...

// At the end of your application, remove the listener and close it properly
spark.sparkContext.removeSparkListener(perfGazer)
perfGazer.close()
```
Note: a shutdown hook is registered automatically on construction, so the listener will be closed on JVM termination even if you omit the explicit `removeSparkListener`/`close()` calls. That said, calling them explicitly at the end of your application is still good practice to ensure a clean, predictable teardown.
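To make that teardown run even when the job throws, the registration and removal shown above can be wrapped in a `try`/`finally`. A minimal sketch, assuming the same `spark` session and `perfGazer` instance constructed in the example:

```scala
// Assumes `spark` and the `perfGazer` listener built as in the example above.
spark.sparkContext.addSparkListener(perfGazer)
try {
  // Your Spark code here ...
} finally {
  // Runs on success and failure alike, so the listener is always
  // deregistered and the sink flushed/closed deterministically.
  spark.sparkContext.removeSparkListener(perfGazer)
  perfGazer.close()
}
```

This keeps the explicit cleanup from being skipped by an exception in the job body, rather than relying solely on the automatic shutdown hook.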