Documentation / DOCS-15666

[Spark] Description of giving partitioner options via SparkConf is not right

    • Type: Task
    • Resolution: Fixed
    • Priority: Major - P3
    • Affects Version/s: None
    • Component/s: Spark Connector

      Hello,

      In the MongoDB Spark Connector documentation, there seems to be a mistake. We are asked to use the prefix spark.mongodb.read.partitionerOptions instead of partitioner.options when configuration is provided via SparkConf.

      https://www.mongodb.com/docs/spark-connector/current/configuration/read/

      So for example:

      partitioner.options.partition.field -> spark.mongodb.read.partitionerOptions.partition.field

      But looking at the code, this seems to be wrong. The documentation should read:

      If you use SparkConf to set the connector's read configurations, prefix each property with
      spark.mongodb.read.partitioner.options. instead of partitioner.options.
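
      For illustration, a minimal sketch of how the corrected prefix would look when the partition field option is set through SparkConf. The connection URI, database, collection, and field values below are placeholders, not taken from the ticket:

      import org.apache.spark.sql.SparkSession

      object PartitionerOptionsExample {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder()
            .appName("partitioner-options-example")
            // Placeholder connection settings.
            .config("spark.mongodb.read.connection.uri", "mongodb://localhost:27017")
            .config("spark.mongodb.read.database", "test")
            .config("spark.mongodb.read.collection", "coll")
            // partitioner.options.partition.field, prefixed for SparkConf with
            // "spark.mongodb.read.partitioner.options." (not "spark.mongodb.read.partitionerOptions.").
            .config("spark.mongodb.read.partitioner.options.partition.field", "_id")
            .getOrCreate()

          val df = spark.read.format("mongodb").load()
          df.printSchema()

          spark.stop()
        }
      }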

            Assignee: chris.cho@mongodb.com Christopher Cho (Inactive)
            Reporter: mehdi.elhajami.pro@gmail.com Mehdi El Hajami
            Votes: 0
            Watchers: 2

              Created:
              Updated:
              Resolved: 1 year, 42 weeks ago