  Spark Connector / SPARK-335

Duplicate key error does not show duplicate key id in description

    • Type: Improvement
    • Resolution: Cannot Reproduce
    • Priority: Unknown
    • Affects Version/s: None
    • Component/s: Writes

      We are facing an issue with unique indexes.
      We have defined a unique index on the "field_id" field of a collection using:

      db.client.createIndex({"field_id": 1}, {unique: true})
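
      For reference, the same index can also be created from the JVM with the MongoDB Java driver. A minimal sketch (the connection string and the "test" database name are placeholders for our deployment):

      import com.mongodb.client.MongoClients
      import com.mongodb.client.model.{IndexOptions, Indexes}

      object CreateUniqueIndex {
        def main(args: Array[String]): Unit = {
          // Placeholder connection string; adjust to the actual deployment.
          val client = MongoClients.create("mongodb://localhost:27017")
          try {
            // Equivalent of: db.client.createIndex({"field_id": 1}, {unique: true})
            client.getDatabase("test")
              .getCollection("client")
              .createIndex(Indexes.ascending("field_id"), new IndexOptions().unique(true))
          } finally {
            client.close()
          }
        }
      }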

      If two rows with the same "field_id" value are inserted, the insert operation fails, as expected. However, the error message is not helpful at all:

      com.mongodb.MongoBulkWriteException: [...] Write errors: [BulkWriteError{index=0, code=11000, message='E11000 duplicate key error collection: client index: field_id_1', details={}}]

      As you can see, the error description does not specify which duplicate key is causing the issue.

      Looking around, we have seen that other code 11000 errors include a "dup key" element in the error description, indicating which key value caused the failure. Why are we not getting this detail in the error description?
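
      In case it helps reproduce, the same E11000 can be triggered and inspected directly with the MongoDB Java driver, which exposes each BulkWriteError's code, message, and details. A minimal sketch (connection string, database name, and field values are illustrative):

      import com.mongodb.MongoBulkWriteException
      import com.mongodb.client.MongoClients
      import org.bson.Document
      import scala.collection.JavaConverters._

      object DupKeyProbe {
        def main(args: Array[String]): Unit = {
          // Placeholder connection string; adjust to the actual deployment.
          val client = MongoClients.create("mongodb://localhost:27017")
          val collection = client.getDatabase("test").getCollection("client")
          try {
            // Two documents with the same field_id value trip the unique index.
            collection.insertMany(Seq(
              new Document("field_id", 1),
              new Document("field_id", 1)
            ).asJava)
          } catch {
            case e: MongoBulkWriteException =>
              e.getWriteErrors.asScala.foreach { err =>
                // code 11000 = duplicate key; this prints whatever detail
                // the server actually returned in the write error.
                println(s"code=${err.getCode} message='${err.getMessage}' details=${err.getDetails}")
              }
          } finally {
            client.close()
          }
        }
      }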

      We are using this expression to write to the collection:

      df.write.format("mongo").mode("append").option("uri", uri).save()
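
      As a stopgap on the Spark side, we de-duplicate on the indexed field before the write, so at least the batch cannot collide with itself (this does not help when the conflicting document already exists in the collection):

      // Keep one arbitrary row per field_id value before appending.
      val deduped = df.dropDuplicates("field_id")
      deduped.write.format("mongo").mode("append").option("uri", uri).save()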

      We have tried different spark-connector versions but still get the same result:

      mongo-spark-connector_2.11-2.4.3

      mongo-spark-connector_2.12-3.0.1

            Assignee:
            ross@mongodb.com Ross Lawley
            Reporter:
            jaalsina@nttdata.com Javier Alsina
            Votes:
            0
            Watchers:
            2
