Spark Connector / SPARK-280

Enhance save(RDD) to avoid duplicate key exception


    Details

    • Type: New Feature
    • Status: Closed
    • Priority: Major - P3
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 2.4.3, 2.3.5, 2.2.9, 2.1.8, 3.0.1
    • Component/s: None
    • Labels: None

    Description

      A customer is hitting a duplicate key exception when executing MongoSpark.save(RDD, writeConfig). There is a workaround that involves manually executing the bulk write operations, as sketched below. It may be possible to alter the function to check for an _id field and honor the replaceDocument flag, similar to save(Dataset).
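
      The workaround is not spelled out in the ticket itself, so the following is a minimal sketch of the manual approach, assuming the Spark Connector 2.x Scala API (MongoConnector, WriteConfig, withCollectionDo) and a Java driver version that accepts UpdateOptions on ReplaceOneModel. The helper name saveWithReplace and the batch size of 512 are illustrative, not part of the connector.

{code:scala}
import scala.collection.JavaConverters._

import com.mongodb.client.MongoCollection
import com.mongodb.client.model.{InsertOneModel, ReplaceOneModel, UpdateOptions, WriteModel}
import com.mongodb.spark.MongoConnector
import com.mongodb.spark.config.WriteConfig
import org.apache.spark.rdd.RDD
import org.bson.Document

// Hypothetical stand-in for MongoSpark.save(rdd, writeConfig): documents that carry
// an _id are upserted via replaceOne, so saving them twice no longer raises a
// duplicate key exception; documents without an _id are inserted as before.
def saveWithReplace(rdd: RDD[Document], writeConfig: WriteConfig): Unit =
  rdd.foreachPartition { iter =>
    if (iter.nonEmpty) {
      val connector = MongoConnector(writeConfig.asOptions)
      connector.withCollectionDo(writeConfig, { collection: MongoCollection[Document] =>
        // 512 is an arbitrary bulk-write batch size, not a connector default.
        iter.grouped(512).foreach { batch =>
          val requests: Seq[WriteModel[Document]] = batch.map { doc =>
            Option(doc.get("_id")) match {
              case Some(id) =>
                // Replace the whole document, creating it if it does not exist yet.
                new ReplaceOneModel[Document](
                  new Document("_id", id), doc, new UpdateOptions().upsert(true))
              case None =>
                new InsertOneModel[Document](doc)
            }
          }
          collection.bulkWrite(requests.asJava)
        }
      })
    }
  }
{code}

      The call site then changes from MongoSpark.save(rdd, writeConfig) to saveWithReplace(rdd, writeConfig). Honoring the replaceDocument flag as the description suggests would mean choosing between replacing the matched document and updating its fields, driven by that WriteConfig setting.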


    People

    Assignee: Ross Lawley
    Reporter: Steffan Mejia
    Votes: 0
    Watchers: 4
