Spark Connector / SPARK-371

Handling of missing array field in the new Spark connector is not consistent with previous version

    • Type: Bug
    • Resolution: Won't Fix
    • Priority: Major - P3
    • Affects Version/s: None
    • Component/s: None

      A customer opened a case for Spark connector 10.0.4 related to a change in behavior.
       
      Previously, an empty document {} was accepted by the MongoDB Spark connector as an aggregate result when reading a dataframe with a schema like StructType(StructField("foo", ArrayType(String))). In such a case, the read dataframe was something like:
      foo
      ---
      []
       
      Now, a strict mode seems to be the default, so the same read raises an exception. This is a breaking change for the customer when migrating an existing Spark application.
       
      I'm not sure what, if anything, can be done for the customer. Any suggestions?
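One possible direction, sketched here as an assumption rather than a confirmed fix: normalize the missing field on the server side with an aggregation pipeline (MongoDB's `$ifNull`), so every document carries a `"foo"` array before the connector applies the schema. The field name `foo` comes from the report above; the `aggregation.pipeline` read option and the commented `spark.read` call are illustrative and should be checked against the connector version in use.

```python
import json

# Hypothetical workaround (not confirmed in the ticket): have the server
# substitute [] for a missing "foo" array, so the connector's stricter
# schema enforcement never sees an absent field.
pipeline = [{"$set": {"foo": {"$ifNull": ["$foo", []]}}}]
pipeline_json = json.dumps(pipeline)

# The JSON string would then be passed as a read option, along the lines of:
#   spark.read.format("mongodb") \
#       .option("aggregation.pipeline", pipeline_json) \
#       .schema(schema) \
#       .load()
print(pipeline_json)
```

This keeps the Spark application unchanged apart from the extra read option, at the cost of a small amount of server-side work per document.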
       

            Assignee:
            robert.walters@mongodb.com Robert Walters (Inactive)
            Reporter:
            shawn.hawkins@mongodb.com Shawn Hawkins
            Votes:
            0
            Watchers:
            2

              Created:
              Updated:
              Resolved: