- Type: Task
- Resolution: Unresolved
- Priority: Minor - P4
- Affects Version/s: None
- Component/s: None
 
Hello, the mongo-spark connector does not report the task metrics for records and bytes read and written. I tried overriding the compute method of the MongoRDD class to set the metric values, but they are not accessible for writing.
Is there any workaround for this?
I am using a SparkListener to capture the metrics.
Thank you!
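
For reference, here is a minimal sketch of the kind of listener I mean (the class name MetricsListener is just illustrative; it only relies on Spark's standard SparkListener/onTaskEnd API):

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Logs the per-task input/output metrics after each task finishes.
// With the mongo-spark connector the records/bytes read and written
// counters are not populated, which is the behaviour described above.
class MetricsListener extends SparkListener {
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    val metrics = taskEnd.taskMetrics
    if (metrics != null) {
      println(
        s"stage=${taskEnd.stageId} task=${taskEnd.taskInfo.taskId} " +
          s"recordsRead=${metrics.inputMetrics.recordsRead} " +
          s"bytesRead=${metrics.inputMetrics.bytesRead} " +
          s"recordsWritten=${metrics.outputMetrics.recordsWritten} " +
          s"bytesWritten=${metrics.outputMetrics.bytesWritten}")
    }
  }
}
```

It is registered with spark.sparkContext.addSparkListener(new MetricsListener) before running the read/write job.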