I'm using the same runtime configurations, tested with the following Databricks runtimes:
- 17.3.4 (17.3 LTS, includes Apache Spark 4.0.0, Scala 2.13)
- 16.4.16 (16.4 LTS, includes Apache Spark 3.5.2, Scala 2.12)
and the MongoDB Spark Connector in the following versions:
- org.mongodb.spark:mongo-spark-connector_2.12:10.6.0
- org.mongodb.spark:mongo-spark-connector_2.13:10.6.0
- org.mongodb.spark:mongo-spark-connector_2.13:10.3.0
Results for the above-mentioned combinations:
- 17.3 + 10.6 -> Logging Bug
- 16.4 + 10.6 -> Logging Bug
- 16.4 + 10.3 -> OK
Logging bug: the Spark connector logs gigabytes of INFO messages to the Databricks logs, and the duration of the same process increases by roughly a factor of 20.
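As a possible workaround (not a fix for the underlying regression), the connector's INFO chatter could be suppressed by raising the log level for the `org.mongodb.spark` logger. This is a sketch assuming a log4j2-based Databricks runtime; the logger name `mongo` is arbitrary, and the lines would need to be appended to the cluster's log4j2 configuration (e.g. via an init script):

```properties
# Hypothetical mitigation: raise the MongoDB Spark Connector's log level
# so INFO messages are dropped; WARN and above are still emitted.
logger.mongo.name = org.mongodb.spark
logger.mongo.level = warn
```

This only reduces the log volume; whether it also removes the factor-20 slowdown would need to be verified, since the time may be spent producing the messages rather than writing them.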
I've attached two screenshots: 10_3.png, where the issue does not appear, and 10_6.png (with the bug).
Regards,
Alexandru