Spark connector may need update/replace/delete functions when saving data


    • Type: Improvement
    • Resolution: Works as Designed
    • Priority: Major - P3
    • Affects Version/s: 2.2.0
    • Component/s: API

      In our project we need to update data stored in MongoDB. At first I used the mongo-hadoop-core jar, but I hit a "directory item limit" exception and would have had to raise that limit on the cluster. I then switched to the mongo-spark connector; everything went well until I needed to update data, because the connector does not support updates when saving an RDD. I implemented that behaviour myself in the save method, following the DataFrame update logic (a sketch of the workaround is shown below). I therefore think the RDD API should offer store/update/replace/delete functions matching MongoDB's insert/update/replace/delete operations, for anyone who needs them.
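      For reference, a minimal sketch of the kind of workaround described above, assuming the RDD holds org.bson.Document values keyed by _id and using the plain MongoDB Java driver inside foreachPartition (the RddUpsert object, the upsert helper, the batch size of 512, and the uri/db/coll parameters are illustrative placeholders, not part of the connector API):

      import com.mongodb.{MongoClient, MongoClientURI}
      import com.mongodb.client.model.{ReplaceOneModel, UpdateOptions, WriteModel}
      import org.apache.spark.rdd.RDD
      import org.bson.Document

      import scala.collection.JavaConverters._

      object RddUpsert {

        // Hypothetical helper: replace-or-insert each document of an RDD, matching on _id.
        def upsert(rdd: RDD[Document], uri: String, db: String, coll: String): Unit = {
          rdd.foreachPartition { partition =>
            // One client per partition keeps connection counts bounded.
            val client = new MongoClient(new MongoClientURI(uri))
            try {
              val collection = client.getDatabase(db).getCollection(coll)
              // Write in bounded batches so each bulkWrite stays small.
              partition.grouped(512).foreach { batch =>
                val models: Seq[WriteModel[Document]] = batch.map { doc =>
                  new ReplaceOneModel[Document](
                    new Document("_id", doc.get("_id")), // match on _id
                    doc,                                 // full replacement document
                    new UpdateOptions().upsert(true))    // insert when no match exists
                }
                if (models.nonEmpty) collection.bulkWrite(models.asJava)
              }
            } finally {
              client.close()
            }
          }
        }
      }

      Swapping ReplaceOneModel for UpdateOneModel or DeleteOneModel in the same bulkWrite call would give update and delete semantics; the point of the request is that the connector's RDD save path could expose these modes directly, as the DataFrame path already does for replace-with-upsert.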

              Assignee:
              Ross Lawley
              Reporter:
              Davy Song
              Votes:
              0
              Watchers:
              5

                Created:
                Updated:
                Resolved: