Spark Connector / SPARK-93

MongoSpark ignoring ReadConfigs for Multiple Collections


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major - P3
    • Resolution: Fixed
    • Affects Version/s: 2.0.0-rc1
    • Fix Version/s: 2.0.0
    • Component/s: Configuration
    • Labels: None
    • Environment: Windows 10, Spark 2.0.0, Scala 2.11.2
    • # Replies: 3
    • Last comment by Customer: false

      Description

      I have a few collections that I need to read in. I've tried following the documentation on how to set ReadConfig, but I'm still coming up blank.

      I've attached a Scala object where I try connecting to 2 different collections. MongoSpark ignores my read config in the second instance and continues to read from the first collection.

      I've spent about 4 hours trying to figure this out. I've tried many permutations, which all yield errors.

      The attached files were produced with:

      • Mongo settings NOT specified in spark-defaults.conf
      • No other --conf options passed to my Spark master/driver
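
      The attached MultipleCollections.scala is not reproduced in this ticket, but the pattern described above presumably looks something like the minimal sketch below. The URI, database, collection names, and object name are placeholders, not the reporter's actual code, and it assumes the mongo-spark-connector 2.0.x Scala API (MongoSpark.load plus ReadConfig overrides):

      import org.apache.spark.sql.SparkSession
      import com.mongodb.spark.MongoSpark
      import com.mongodb.spark.config.ReadConfig

      object MultipleCollectionsSketch {

        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder()
            .master("local[*]")
            .appName("MultipleCollectionsSketch")
            // The default input collection comes from the connection URI
            .config("spark.mongodb.input.uri", "mongodb://localhost:27017/test.collectionA")
            .getOrCreate()

          // First read: uses the default ReadConfig derived from the URI (collectionA)
          val dfA = MongoSpark.load(spark)

          // Second read: override only the collection name, inheriting the rest
          // of the settings from the default ReadConfig
          val readConfigB = ReadConfig(
            Map("collection" -> "collectionB"),
            Some(ReadConfig(spark.sparkContext)))
          val dfB = MongoSpark.load(spark, readConfigB)

          // With the bug described in this ticket, dfB would still contain
          // collectionA's documents despite readConfigB naming collectionB
          println(s"collectionA count: ${dfA.count()}")
          println(s"collectionB count: ${dfB.count()}")

          spark.stop()
        }
      }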

        Attachments

        1. multiplecollections.log (39 kB)
        2. MultipleCollections.scala (1 kB)

          Activity

            People

            • Assignee: Ross Lawley (ross.lawley)
            • Reporter: Neville Dipale (nevi_me)
            • Participants:
            • Last commenter: Rathi Gnanasekaran
            • Votes: 0
            • Watchers: 2

              Dates

              • Created:
              • Updated:
              • Resolved:
              • Days since reply: 3 years, 22 weeks, 5 days ago
              • Date of 1st Reply: