Kafka Connector / KAFKA-305

Duplicate Key Errors


Details

    • Type: Question
    • Resolution: Works as Designed
    • Priority: Major - P3

    Description

      Hello team!

      I am testing whether Retryable Writes work on the new version. I have a Docker Compose setup with a PSA (Primary-Secondary-Arbiter) replica set. I shut down the secondary and try to write to two topics.

      I get the following error:

      //  com.mongodb.MongoBulkWriteException: Bulk write operation error on server mongo2:27017. Write errors: [BulkWriteError{index=0, code=11000, message='E11000 duplicate key error collection: kafka.schema index: _id_ dup key: { _id: "113" }', details={}}].
      connect            | com.mongodb.kafka.connect.sink.dlq.WriteException: v=1, code=11000, message=E11000 duplicate key error collection: kafka.schema index: _id_ dup key: { _id: "113" }, details={}

      https://www.mongodb.com/docs/manual/core/retryable-writes/#duplicate-key-errors-on-upsert

      What I assumed was that the driver would retry the write. I am using MongoDB 4.4 (MongoDB shell version v4.4.13).
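The linked section describes how a retried upsert can fail with a duplicate-key error when the first attempt already committed on the old primary but its acknowledgment was lost; the retry then collides with the document the first attempt created. The general collision can be illustrated with a toy, self-contained Python sketch (no MongoDB involved; the in-memory store, the error class, and the "lost ack" flag are all stand-ins, not driver APIs):

```python
class DuplicateKeyError(Exception):
    """Stands in for MongoDB's E11000 duplicate key error."""


class FakeCollection:
    """Toy store enforcing a unique _id, standing in for a collection."""

    def __init__(self):
        self.docs = {}

    def insert_one(self, doc):
        if doc["_id"] in self.docs:
            raise DuplicateKeyError(f"E11000 dup key: _id: {doc['_id']!r}")
        self.docs[doc["_id"]] = doc


def write_with_retry(coll, doc, ack_lost_once=True):
    """Attempt 1 commits, but the client never sees the acknowledgment,
    so it blindly retries the same write and hits the unique _id index."""
    coll.insert_one(doc)              # attempt 1: committed server-side
    if ack_lost_once:                 # client saw a network error instead
        try:
            coll.insert_one(doc)      # retry collides with attempt 1
        except DuplicateKeyError as e:
            return f"retry failed: {e}"
    return "ok"


coll = FakeCollection()
print(write_with_retry(coll, {"_id": "113"}))
```

This is only the generic hazard of retrying a non-idempotent write: for plain inserts, MongoDB's retryable writes attach a transaction number precisely so the server can recognize the retry and return the first attempt's result instead of re-executing it; the documented duplicate-key case is specific to upserts.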

      Could you help me?

      Should I handle this via the DLQ, or will the driver retry it?
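If the answer is to route such failures to a dead letter queue, the standard Kafka Connect error-handling settings can be used on the sink connector. A minimal config sketch, assuming the standard Connect DLQ properties; the connector name, topics, connection URI, and DLQ topic name below are placeholders, not values from this ticket:

```properties
# Hypothetical MongoDB sink connector config (names are illustrative).
name=mongo-sink
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
topics=topic-a,topic-b
connection.uri=mongodb://mongo1:27017,mongo2:27017/?replicaSet=rs0
database=kafka
collection=schema

# Standard Kafka Connect error handling: tolerate record-level failures
# (e.g. E11000 write errors) and route the failed records to a DLQ topic
# instead of failing the task.
errors.tolerance=all
errors.deadletterqueue.topic.name=mongo-sink-dlq
errors.deadletterqueue.context.headers.enable=true
```

With `errors.deadletterqueue.context.headers.enable=true`, Connect adds headers describing the failure to each DLQ record, which helps distinguish duplicate-key errors from other write failures when inspecting the DLQ topic.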

      Regards,

      Juan

       

      People

        ross@mongodb.com Ross Lawley
        juan.soto@mongodb.com Juan Soto (Inactive)