Version 10.6 with Databricks Runtime 17.3.4 (Spark 4.0.0) logs all BSON conversions


    • Type: Bug
    • Resolution: Fixed
    • Priority: Unknown
    • Fix Version/s: 10.6.1, 11.0.1
    • Affects Version/s: 10.6.0
    • Component/s: None
    • Java Drivers
    • Not Needed

      I'm using the same runtime configurations, tested with:

      • 17.3.4 (17.3 LTS, includes Apache Spark 4.0.0, Scala 2.13)
      • 16.4.16 (16.4 LTS, includes Apache Spark 3.5.2, Scala 2.12)

      and the Spark connector in the following versions:

      • org.mongodb.spark:mongo-spark-connector_2.12:10.6.0
      • org.mongodb.spark:mongo-spark-connector_2.13:10.6.0
      • org.mongodb.spark:mongo-spark-connector_2.13:10.3.0

      In the above-mentioned configurations:

      • 17.3 + 10.6 -> Logging Bug
      • 16.4 + 10.6 -> Logging Bug
      • 16.4 + 10.3 -> OK

      Logging bug: the Spark connector happily logs gigabytes of INFO messages in Databricks, and the duration of the same process increases by a factor of 20.
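
      Until a fixed connector version is in use, one possible mitigation is to raise the log level for the connector's logger in the cluster's log4j2 configuration so the INFO-level conversion messages are suppressed. This is only a sketch: the logger name `com.mongodb.spark` and the logger id `mongospark` are assumptions and may need adjusting for a given setup.

      ```properties
      # Hedged workaround sketch (log4j2.properties fragment):
      # suppress INFO-level messages from the MongoDB Spark connector.
      # Assumption: the connector logs under the "com.mongodb.spark" logger name.
      logger.mongospark.name = com.mongodb.spark
      logger.mongospark.level = WARN
      ```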

      I've attached the 10_3.png screenshot, where the issue does not appear, and 10_6.png (with the bug).

      Regards,

      Alexandru

        1. 1_GB_Log.png (74 kB)
        2. 10_3.png (282 kB)
        3. 10_6.png (368 kB)

            Assignee:
            Ross Lawley
            Reporter:
            Alexandru Htc
            Nabil Hachicha
            Votes:
            0 Vote for this issue
            Watchers:
            3 Start watching this issue

              Created:
              Updated:
              Resolved: