Spark Connector / SPARK-290

An error occurred while trying to schedule the MongoSpark.save(sparkDocuments, writeConfig) call

    • Type: Task
    • Resolution: Works as Designed
    • Priority: Major - P3
    • Affects Version/s: None
    • Component/s: None
    • Labels: None

      My project's Scala, Hadoop, and Spark versions are as follows:

      <scala.version>2.12.10</scala.version>
      <hadoop.version>2.7.3</hadoop.version>
      <spark.version>3.0.0</spark.version>

      The relevant dependency in pom.xml is:

       <dependency>
         <groupId>org.mongodb.spark</groupId>
         <artifactId>mongo-spark-connector_2.12</artifactId>
         <version>3.0.0</version>
       </dependency>
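
      For context, com.mongodb.client.result.InsertManyResult is a class from the 4.x MongoDB Java driver, which the 3.0.0 connector is built against, so this ClassNotFoundException usually means an older 3.x driver jar is shadowing the 4.x classes at runtime (mvn dependency:tree -Dincludes=org.mongodb shows which driver versions Maven actually resolves). Below is a minimal sketch of pinning the 4.x sync driver explicitly; the 4.0.5 version is an illustrative assumption, not taken from this ticket:

       <!-- Hypothetical addition: pin the 4.x Java driver so an older
            3.x driver on the classpath cannot shadow its classes
            (version 4.0.5 is an illustrative assumption). -->
       <dependency>
         <groupId>org.mongodb</groupId>
         <artifactId>mongodb-driver-sync</artifactId>
         <version>4.0.5</version>
       </dependency>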
      

      When I run MongoSpark.save(sparkDocuments, writeConfig), the job fails with: 'Caused by: java.lang.ClassNotFoundException: com.mongodb.client.result.InsertManyResult'

      Reference: https://docs.mongodb.com/spark-connector/master/java/write-to-mongodb
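
      For completeness, the failing call follows the write pattern on that page; here is a minimal Scala sketch, assuming an active SparkContext sc with spark.mongodb.output.uri set in the Spark configuration (the collection name and document contents are illustrative; only the names sparkDocuments and writeConfig come from this ticket):

       import com.mongodb.spark.MongoSpark
       import com.mongodb.spark.config.WriteConfig
       import org.bson.Document

       // Build a WriteConfig from the SparkContext defaults, overriding the
       // target collection (the collection name here is illustrative).
       val writeConfig = WriteConfig(
         Map("collection" -> "spark", "writeConcern.w" -> "majority"),
         Some(WriteConfig(sc)))

       // An RDD of BSON documents to write, standing in for `sparkDocuments`.
       val sparkDocuments = sc.parallelize((1 to 10).map(i => Document.parse(s"{spark: $i}")))

       // The failing call: connector 3.0.0 runs on the 4.x Java driver, where
       // insertMany returns com.mongodb.client.result.InsertManyResult.
       MongoSpark.save(sparkDocuments, writeConfig)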

      Could you help me? Thanks!

            Assignee:
            ross@mongodb.com Ross Lawley
            Reporter:
            bwang2009@yeah.net wang bin
            Votes:
            0
            Watchers:
            1
