Spark Connector / SPARK-129

Is there an option to split the connector across several interfaces?

    • Type: Task
    • Resolution: Works as Designed
    • Priority: Major - P3
    • Affects Version/s: 2.2.0
    • Component/s: Performance
    • Labels: None
    • Environment: zSeries and all others

I would like to speed up data requests from Spark via the Connector to a MongoDB.
What is the scenario?
Let's assume a very simple SQL query in Spark like SELECT * FROM TEMP.
Due to the single connection I have to the database, my request is serialized.
Technically I could serve 6 GB/s if I could split the request across more channels to the MongoDB; in fact I do not exceed approx. 1 GB/s, and both systems (MongoDB and Spark) are almost idle. The bottleneck is Spark waiting for data from the MongoDB.

So, bottom line: is there any config option to get higher performance out of this?
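Since the issue was closed as "Works as Designed", the relevant knob is most likely the connector's partitioner configuration rather than a dedicated "split across interfaces" option. The sketch below is a minimal Scala example, assuming the 2.2.x connector's documented ReadConfig options ("partitioner", "partitionerOptions.partitionSizeMB"); the host, database, and collection names are placeholders. It shows how a single collection read can be split into many Spark partitions, each pulled over its own cursor by a separate task, instead of being serialized through one connection.

    import org.apache.spark.sql.SparkSession
    import com.mongodb.spark.MongoSpark
    import com.mongodb.spark.config.ReadConfig

    object ParallelMongoRead {
      def main(args: Array[String]): Unit = {
        // Hypothetical host/database/collection; replace with the real deployment.
        val spark = SparkSession.builder()
          .appName("mongo-parallel-read")
          .config("spark.mongodb.input.uri", "mongodb://mongo-host:27017/mydb.temp")
          .getOrCreate()

        // The partitioner splits one collection scan into many Spark partitions;
        // each partition is read by its own task over its own cursor, so the
        // transfer is no longer funneled through a single connection.
        val readConfig = ReadConfig(
          Map(
            "partitioner" -> "MongoSamplePartitioner",
            // Smaller partitions -> more partitions -> more concurrent readers.
            "partitionerOptions.partitionSizeMB" -> "32"
          ),
          Some(ReadConfig(spark))
        )

        val df = MongoSpark.load(spark, readConfig)
        df.createOrReplaceTempView("TEMP")
        spark.sql("SELECT * FROM TEMP").show()
      }
    }

Whether this actually raises throughput depends on the number of executor cores available to run the partitions in parallel and on the MongoDB deployment's capacity; tuning partitionSizeMB (or choosing a different partitioner, e.g. a sharded-cluster-aware one) controls how many concurrent readers are created.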

            Assignee: Ross Lawley (ross@mongodb.com)
            Reporter: Michael Höller (akazia)
            Votes: 0
            Watchers: 2
