LucaCanali / sparkMeasure

This is the development repository for sparkMeasure, a tool and library designed for efficient analysis and troubleshooting of Apache Spark jobs. It focuses on easing the collection and examination of Spark metrics, making it a practical choice for both developers and data engineers.
Apache License 2.0

jobId field only comes as "0" or "1" in stage and task Metrics. #22

Closed Amittyagi007 closed 5 years ago

Amittyagi007 commented 5 years ago

Hi,

I am evaluating sparkMeasure for my use case, but I always get the jobId field as "0" or a series of "1"s. Is this expected behavior, or am I missing something here?

Below are the outputs of a spark2-submit Pi job with sparkMeasure and of a spark-shell job, respectively.

```
+-----+--------+-------+--------------------+
|jobId|jobGroup|stageId|                name|
+-----+--------+-------+--------------------+
|    0|    null|      0|reduce at SparkPi...|
+-----+--------+-------+--------------------+
```

```
+-----+--------+
|jobId|jobGroup|
+-----+--------+
|    0|    null|
|    1|    null|
|    1|    null|
|    1|    null|
|    1|    null|
|    1|    null|
|    1|    null|
```

Also, how can we figure out which task a particular metric belongs to?
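For reference, here is a minimal sketch of how I am collecting the per-task metrics in spark-shell (the package coordinates/version and the exact column names are assumptions on my side; the DataFrame schema may differ across sparkMeasure releases):

```scala
// Minimal spark-shell sketch; assumes sparkMeasure is on the classpath, e.g. via
// --packages ch.cern.sparkmeasure:spark-measure_2.11:0.13
// (coordinates and version are an assumption; adjust for your Spark/Scala versions).
val taskMetrics = ch.cern.sparkmeasure.TaskMetrics(spark)

taskMetrics.begin()
spark.sparkContext.parallelize(1 to 1000, 8).map(_ * 2).reduce(_ + _)
taskMetrics.end()

// One row per task; I would expect the "index" column (the task's index within
// its stage) to identify individual tasks, if I read the schema correctly.
val df = taskMetrics.createTaskMetricsDF()
df.select("jobId", "jobGroup", "stageId", "index", "duration").show()
```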

Thanks Amit