In our project we need to update data stored in MongoDB. At first I used the mongo-hadoop-core jar, but I hit a "directory item limit" exception and would have had to raise that limit for the whole cluster. So I switched to the mongo-spark connector, and everything went well until I needed to update data: the connector does not support an update function for RDDs. I implemented that function in the save method myself, following the DataFrame update behavior. I therefore think RDDs should get insert/update/replace/delete functions matching MongoDB's insert/update/replace/delete operations, for anyone who needs them.
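For reference, the DataFrame-based update path mentioned above can be sketched roughly as follows, assuming the mongo-spark connector 2.x Scala API. The URI, database, and collection names are placeholders; when the saved rows carry an `_id`, the connector replaces the matching documents (upsert) instead of inserting duplicates, which is the behavior an RDD-level update would mirror.

```scala
import com.mongodb.spark.MongoSpark
import com.mongodb.spark.config.WriteConfig
import org.apache.spark.sql.SparkSession

object MongoUpdateSketch {
  def main(args: Array[String]): Unit = {
    // Placeholder connection settings; adjust to your deployment.
    val spark = SparkSession.builder()
      .appName("mongo-update-sketch")
      .config("spark.mongodb.output.uri", "mongodb://localhost/test.items")
      .getOrCreate()

    import spark.implicits._

    // Rows that carry an _id: existing documents with the same _id
    // are replaced rather than duplicated.
    val updated = Seq((1, "new value")).toDF("_id", "value")

    // replaceDocument = true (the default) swaps the whole document;
    // set it to "false" to update only the fields present in the row.
    val writeConfig = WriteConfig(Map(
      "uri" -> "mongodb://localhost/test.items",
      "replaceDocument" -> "true"))

    MongoSpark.save(updated.write.mode("append"), writeConfig)
    spark.stop()
  }
}
```

This only covers the DataFrame API; the point of the request is that nothing equivalent exists for plain RDD saves.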
- Assignee:
- Ross Lawley
- Reporter:
- Davy Song
- Votes:
- 0
- Watchers:
- 5
- Created:
- Updated:
- Resolved: