- Type: Bug
- Resolution: Fixed
- Priority: Blocker - P1
- Affects Version/s: None
- Component/s: None
- Java Drivers
- Not Needed
When migrating to the latest Mongo Spark Connector 10.2.1, we are experiencing a significant performance regression while saving to MongoDB. With the same functionality, overall execution time has increased by a factor of 2.5.
We are currently blocked on this update because of this performance issue.
In addition, overall memory consumption while saving through the Mongo Spark Connector is roughly twice as high with V10.
We have created a showcase that reproduces the issue we are experiencing with a simple implementation.
Could you please help us determine whether the issue is due to the way we are using the V10 connector (which is essentially the same way we used V3), or whether something needs to be improved in the connector itself?
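For reference, a minimal sketch of how a V10-style save typically looks (the URI, database, collection names, and data are placeholders for illustration, not taken from the attached showcase; the actual implementation is in the zip):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SaveShowcase {
    public static void main(String[] args) {
        // Placeholder connection URI; the real one lives in the showcase config.
        SparkSession spark = SparkSession.builder()
            .appName("mongo-save-showcase")
            .config("spark.mongodb.write.connection.uri", "mongodb://localhost:27017")
            .getOrCreate();

        // Placeholder data set standing in for the showcase's real DataFrame.
        Dataset<Row> df = spark.range(100000).toDF("id");

        // In 10.x the data source short name is "mongodb"
        // (V3 used format("mongo") or MongoSpark.save).
        df.write()
            .format("mongodb")
            .mode("append")
            .option("database", "test")
            .option("collection", "showcase")
            .save();

        spark.stop();
    }
}
```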
Please find attached the zip of our showcase. You can see the output produced in the readme.md.
You can run it by simply typing ./run_showcase.sh (either in a Bash shell on Linux or in Git Bash on Windows).
A case has also been opened with MongoDB support: https://support.mongodb.com/case/01260503