Documentation / DOCS-9166

How to handle loading data larger than Spark memory

    • Type: Bug
    • Resolution: Done
    • Priority: Major - P3
    • Fix Version/s: None
    • Affects Version/s: None
    • Component/s: Spark Connector
    • Environment:

      Hi Support,

      I have a question about the MongoDB connector with Spark. If I have a large collection in MongoDB whose data size is larger than the total memory of the Spark cluster, how is that handled? Could it throw an OOM error? If so, how can I solve it? Should I add some configuration in ReadConfig?

      Thanks
      Yin
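
      A minimal sketch of the kind of ReadConfig tuning being asked about, assuming the Spark Connector 2.x Scala API; the partitioner and partitionSizeMB settings below are the connector's standard read options, while the URI, database, and collection names are placeholders. The connector splits the collection into partitions, so Spark works through the data a partition at a time rather than holding the whole collection in executor memory.

          import com.mongodb.spark.MongoSpark
          import org.apache.spark.sql.SparkSession

          object LargeCollectionRead {
            def main(args: Array[String]): Unit = {
              val spark = SparkSession.builder()
                .appName("large-collection-read")
                // Placeholder URI; point this at the real deployment.
                .config("spark.mongodb.input.uri", "mongodb://host:27017/mydb.mycoll")
                // Smaller partitions keep each task's working set small;
                // MongoSamplePartitioner is the connector's default partitioner.
                .config("spark.mongodb.input.partitioner", "MongoSamplePartitioner")
                .config("spark.mongodb.input.partitionerOptions.partitionSizeMB", "32")
                .getOrCreate()

              // Loading is lazy; nothing is read from MongoDB until an action runs.
              val df = MongoSpark.load(spark)

              // Aggregate or write out instead of calling collect(); collect() pulls
              // the whole result to the driver and is a common source of OOM errors.
              println(df.count())

              spark.stop()
            }
          }

      In this scenario OOM errors usually come from collect(), caching the full dataset, or large shuffles, rather than from the load itself.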

            Assignee:
            Unassigned
            Reporter:
            xgen-internal-docs Docs Collector User (Inactive)
            Votes:
            0
            Watchers:
            4

              Resolved:
              2 years, 48 weeks, 2 days ago