- Type: Bug
- Resolution: Done
- Priority: Major - P3
- None
- Affects Version/s: None
- Component/s: Spark Connector
- Labels:
- Environment:
*Location*: https://docs.mongodb.com/spark-connector/faq/
*User-Agent*: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 Safari/537.36
*Referrer*: https://www.mongodb.com/products/spark-connector
*Screen Resolution*: 1440 x 900
Hi Support,

I have a question about the MongoDB connector for Spark. If I read a large collection from MongoDB whose data size exceeds the total memory of the Spark cluster, how is that handled? Could it throw an OOM error? If so, how can I avoid it? Is there some configuration I can add in ReadConfig?

Thanks,
Yin
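For reference, the kind of ReadConfig tuning the question is asking about might look like the sketch below. This assumes the connector's Scala API (`com.mongodb.spark.config.ReadConfig` in the 2.x connector); the URI and the `partitionSizeMB` value are illustrative placeholders, not a confirmed fix, and running it requires a live Spark cluster and MongoDB deployment:

```scala
import com.mongodb.spark.MongoSpark
import com.mongodb.spark.config.ReadConfig
import org.apache.spark.sql.SparkSession

// Sketch only: the connector streams one partition per task rather than
// loading the whole collection, so smaller partitions reduce the amount of
// data each Spark task holds in memory at once.
val spark = SparkSession.builder()
  .appName("mongo-read-example")
  .getOrCreate()

val readConfig = ReadConfig(
  Map(
    "uri" -> "mongodb://localhost:27017/test.myCollection", // hypothetical URI
    "partitioner" -> "MongoSamplePartitioner",
    "partitionSizeMB" -> "32" // smaller partitions -> smaller per-task footprint
  )
)

val rdd = MongoSpark.load(spark.sparkContext, readConfig)
println(rdd.count())
```

The general idea is that total collection size exceeding cluster memory is fine as long as each individual partition fits in a task's memory, so partition sizing is the usual knob to reach for.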