- Type: New Feature
- Resolution: Unresolved
- Priority: Unknown
- Affects Version/s: None
- Component/s: None
Hey everyone.
Currently, the MongoDB Spark connector only supports insert, replace, and update (via the $set operator) operations when writing, i.e. only a subset of the update operators available in MongoDB. Would it be possible to add support for other update operators ($inc in our case), or for pushing custom BSON documents to the database (as you can when reading via the aggregation.pipeline option)?
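For context, this is roughly our current write path with the V10 connector; as far as we can tell, operationType only accepts insert/replace/update. Option names are per the V10 write configuration as we understand it, and the URI, database, collection, and field names below are illustrative:

```scala
// Sketch of our current write path with the V10 connector.
// "operationType" appears limited to insert / replace / update,
// so there is no way to request $inc semantics here.
// URI, database, collection, and column names are illustrative.
statsDf.write
  .format("mongodb")
  .mode("append")
  .option("connection.uri", "mongodb://host/db")
  .option("database", "lake")
  .option("collection", "tableStats")
  .option("operationType", "update")  // "update" translates to $set on the matched document
  .option("idFieldList", "tableId")   // field(s) used to match existing documents, as we read the docs
  .save()
```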
Use case:
We're trying to stream statistics about our data lake (in this case, added/removed row counts), ideally via the $inc operator in MongoDB. Previously, in V3.0, we could work around the lack of built-in support by writing BSON documents directly to the database via the (now removed) MongoConnector object, but that is no longer possible; a sketch of what we mean follows.
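To make the request concrete, here is a minimal sketch of the per-document $inc write we used to issue in the V3.0 days (from memory, via MongoConnector(...).withCollectionDo(writeConfig, collection => ...)), rewritten here against the plain MongoDB Java sync driver since MongoConnector is gone. All names (URI, database, collection, fields) are hypothetical:

```scala
// Minimal sketch of the $inc semantics we're asking for, using the plain
// Java sync driver from inside a Spark job (hypothetical names throughout).
import com.mongodb.client.MongoClients
import com.mongodb.client.model.{Filters, UpdateOptions, Updates}
import org.apache.spark.sql.{DataFrame, Row}

def incrementRowCounts(statsDf: DataFrame, uri: String): Unit =
  statsDf.rdd.foreachPartition { rows: Iterator[Row] =>
    val client = MongoClients.create(uri)  // one client per partition
    try {
      val coll = client.getDatabase("lake").getCollection("tableStats")
      rows.foreach { row =>
        coll.updateOne(
          Filters.eq("_id", row.getAs[String]("tableId")),
          Updates.combine(
            Updates.inc("rowsAdded", row.getAs[Long]("rowsAdded")),
            Updates.inc("rowsRemoved", row.getAs[Long]("rowsRemoved"))
          ),
          new UpdateOptions().upsert(true)  // create the counter document on first sight
        )
      }
    } finally client.close()
  }
```

Native support could be as simple as letting operationType (or a new write option) accept a configurable update operator, or a write-side analogue of aggregation.pipeline that takes a raw update document.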