Spark Connector / SPARK-264

Multiple servers for input URI and output URI are not supported


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major - P3
    • Resolution: Works as Designed
    • Affects Version/s: 2.1.5
    • Fix Version/s: None
    • Component/s: None
    • Labels:
      None
    • Environment:
      Centos7

      Description

I want to read data from one collection on a sharded cluster and, after some transformations, write it to another collection on a standalone server. The Spark config is as follows:

```
String inputUri = "mongodb://usr:pwd@ip1:port1,ip2:port2,ip3:port3,ip4:port4,ip5:port5,ip6:port6/db.col?maxPoolSize=40&authSource=admin&readPreference=secondary";

String outputUri = "mongodb://usr2:pw2@ip7:port7/db2.col2?maxPoolSize=40&authSource=admin&readPreference=secondary";

SparkSession sparkR = BUILDER.appName("myapp")
    .config("spark.mongodb.input.uri", inputUri)
    .config("spark.mongodb.output.uri", outputUri)
    .config("spark.mongodb.input.partitionerOptions.partitionKey", "id")
    .config("spark.mongodb.input.partitioner", "MongoSamplePartitioner")
    .getOrCreate();
```

An exception occurred when executing the task: Command failed with error 13: 'not authorized on db2 to execute command { update: "col2", ordered: true, writeConcern: { w: 1 }, $db: "db2", $clusterTime: { clusterTime: Timestamp(1574246888, 5), signature: { hash: BinData(0, DB1EDCE1A9DA107BFE21A700396763ADBF09D528), keyId: 6710554364668280834 } }, lsid: { id: UUID("952698d2-b4ae-40b8-beff-354de32a9f8c") } }' on server ip2:port2.

Then I changed the code to put the output URI in a WriteConfig:

      ```

      Map<String, String> writeOverrides = new HashMap<>();
      writeOverrides.put("writeConcern.w", "1");
      writeOverrides.put("replaceDocument", "true");
      writeOverrides.put("uri", outputUri);
      WriteConfig writeConfig = WriteConfig.create(JSC).withOptions(writeOverrides);

      ```

It still failed, with com.mongodb.MongoInterruptedException: Interrupted acquiring a permit to retrieve an item from the pool.

The connector version is:

      ```

      compile 'org.mongodb.spark:mongo-spark-connector_2.11:2.1.5'

      ```
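Since the issue was closed as "Works as Designed", the per-operation override pattern the reporter tried is the supported route in the 2.x connector: pass an explicit WriteConfig (with its own complete "uri") to MongoSpark.save, rather than relying on the session-level output URI alone. The sketch below illustrates building the override map; the class and helper names are mine, not from the ticket, and the URI is a placeholder.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of per-operation WriteConfig overrides for the
// 2.x MongoDB Spark Connector. Names here are hypothetical.
public class MongoOverrides {

    // The "uri" override must be a complete connection string (credentials,
    // host, database.collection), since it replaces the session-level
    // spark.mongodb.output.uri for this one save operation.
    public static Map<String, String> writeOverrides(String outputUri) {
        Map<String, String> overrides = new HashMap<>();
        overrides.put("uri", outputUri);
        overrides.put("writeConcern.w", "1");
        overrides.put("replaceDocument", "true");
        return overrides;
    }

    public static void main(String[] args) {
        Map<String, String> w =
            writeOverrides("mongodb://usr2:pw2@ip7:port7/db2.col2?authSource=admin");
        System.out.println(w.get("uri"));
        // With a live JavaSparkContext jsc and a Dataset<Row> ds,
        // these overrides feed the connector like so:
        //   WriteConfig writeConfig = WriteConfig.create(jsc).withOptions(w);
        //   MongoSpark.save(ds, writeConfig);
    }
}
```

WriteConfig.create(jsc).withOptions(...) and MongoSpark.save(...) are existing 2.x connector APIs, but whether this resolves the pool-interruption error depends on the deployment; treat the above as a sketch of the override pattern, not a confirmed fix for this ticket.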


            People

            Assignee:
            ross.lawley Ross Lawley
            Reporter:
            yangr_yr@outlook.com rui yang
Votes:
0
Watchers:
2
