Spark Connector / SPARK-130

Spark connector may need update/replace/delete functions when saving data


Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major - P3
    • Resolution: Works as Designed
    • Affects Version/s: 2.2.0
    • Fix Version/s: None
    • Component/s: API

Description

    In our project, we need to update data stored in MongoDB. At first I used the mongo-hadoop-core jar, but I hit a "directory item limit" exception and would have had to raise that limit for the whole cluster. I then switched to the mongo-spark connector, and everything went well until I needed to update data: the connector does not support updates when saving an RDD, so I implemented that function in the save method myself, modelled on the DataFrame update behaviour (see the sketch below). I therefore think the RDD API should provide store/update/replace/delete functions matching MongoDB's insert/update/replace/delete operations, for anyone who needs them.
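
    A minimal sketch, assuming Scala and connector 2.x, of the kind of RDD-level save-with-replace described above. It leans on the connector's MongoConnector.withCollectionDo helper together with the Java driver's bulk-write API; the saveWithReplace name, the 512-document batch size, and matching on _id are illustrative assumptions, not the connector's actual save implementation:

        import scala.collection.JavaConverters._
        import org.apache.spark.rdd.RDD
        import org.bson.Document
        import com.mongodb.spark.MongoConnector
        import com.mongodb.spark.config.WriteConfig
        import com.mongodb.client.MongoCollection
        import com.mongodb.client.model.{InsertOneModel, ReplaceOneModel, UpdateOptions, WriteModel}

        // Hypothetical helper: save an RDD[Document], replacing (with upsert) any
        // document that already carries an _id and plainly inserting the rest.
        def saveWithReplace(rdd: RDD[Document], writeConfig: WriteConfig): Unit = {
          val connector = MongoConnector(writeConfig.asOptions)
          rdd.foreachPartition { partition =>
            if (partition.nonEmpty) {
              connector.withCollectionDo(writeConfig, { collection: MongoCollection[Document] =>
                partition.grouped(512).foreach { batch => // 512 is an arbitrary batch size
                  val requests: Seq[WriteModel[Document]] = batch.map { doc =>
                    Option(doc.get("_id")) match {
                      // Match on _id and replace the whole document, creating it if absent.
                      case Some(id) => new ReplaceOneModel[Document](
                        new Document("_id", id), doc, new UpdateOptions().upsert(true))
                      case None => new InsertOneModel[Document](doc)
                    }
                  }
                  collection.bulkWrite(requests.asJava)
                }
              })
            }
          }
        }

    With something like this in place, saveWithReplace(rdd, WriteConfig(sc)) behaves as an upsert-by-_id save; a delete variant could be sketched the same way with DeleteOneModel. (For DataFrames, MongoSpark.save already replaces documents that carry an _id.)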

People

    Assignee: Ross Lawley (ross@mongodb.com)
    Reporter: Davy Song (hevensun)
    Votes: 0
    Watchers: 5

Dates

    Created:
    Updated:
    Resolved: