Documentation / DOCS-9166

How to handle loading data larger than Spark memory


Details

    • Type: Bug
    • Resolution: Done
    • Priority: Major - P3
    • Component: Spark Connector

    Description

      Hi Support,

      I have a question about the MongoDB connector for Spark. If I have a large collection in MongoDB whose data size is larger than the total memory of the Spark cluster, how is that handled? Could it throw an OOM error? If so, how can I solve it? Should I add some configuration in ReadConfig?

      Thanks
      Yin
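
      For illustration only, a minimal sketch of what "adding some configuration in ReadConfig" could look like with the connector's Scala API: tuning the partitioner so the collection is read as many small partitions rather than pulled in at once. The connection string, database, collection name, and the 64 MB partition size below are placeholder assumptions, not values from this issue.

      import com.mongodb.spark.MongoSpark
      import com.mongodb.spark.config.ReadConfig
      import org.apache.spark.sql.SparkSession

      object LargeCollectionReadSketch {
        def main(args: Array[String]): Unit = {
          // Placeholder connection string pointing at a hypothetical database.collection.
          val spark = SparkSession.builder()
            .appName("large-collection-read")
            .config("spark.mongodb.input.uri", "mongodb://host:27017/mydb.mycoll")
            .getOrCreate()
          val sc = spark.sparkContext

          // Override the default read settings: a sample-based partitioner with a
          // modest partition size yields many small Spark partitions, so each task
          // only materializes a slice of the collection at a time.
          val readConfig = ReadConfig(
            Map(
              "partitioner"                        -> "MongoSamplePartitioner",
              "partitionerOptions.partitionSizeMB" -> "64"
            ),
            Some(ReadConfig(sc))
          )

          val rdd = MongoSpark.load(sc, readConfig)

          // Aggregate or write out incrementally; collecting everything to the
          // driver would pull the whole collection into driver memory.
          println(rdd.count())

          spark.stop()
        }
      }

      These settings only control how the connector splits the collection into partitions. Whether the job stays within memory then depends on how the data is processed: Spark tasks work on one partition at a time and can spill shuffle data to disk, but collecting a very large result to the driver can still run out of memory.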

          People

            Assignee: Unassigned
            Reporter: xgen-internal-docs (Docs Collector User, Inactive)
