Spark Connector / SPARK-277

Spark 3.0 Connector


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major - P3
    • Resolution: Done
    • Affects Version/s: 3.0.0
    • Fix Version/s: None
    • Component/s: Writes
    • Labels: None

      Description

      We have a Structured Streaming application that we are upgrading to Spark 3.0, so we upgraded the MongoDB connector dependencies to the following:


      "org.mongodb.scala" %% "mongo-scala-driver" % "4.1.0",
      "org.mongodb.spark" %% "mongo-spark-connector" % "3.0.0"
      "org.mongodb" % "mongo-java-driver" % "3.11.2"

      When we run the application, we get the error below:


      Caused by: java.lang.NoSuchMethodError: com.mongodb.client.MongoCollection.insertMany(Ljava/util/List;Lcom/mongodb/client/model/InsertManyOptions;)Lcom/mongodb/client/result/InsertManyResult;
        at com.mongodb.spark.MongoSpark$.$anonfun$save$3(MongoSpark.scala:121)
        at scala.collection.Iterator.foreach(Iterator.scala:943)
        at scala.collection.Iterator.foreach$(Iterator.scala:943)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
        at com.mongodb.spark.MongoSpark$.$anonfun$save$2(MongoSpark.scala:119)
        at com.mongodb.spark.MongoSpark$.$anonfun$save$2$adapted(MongoSpark.scala:118)
        at com.mongodb.spark.MongoConnector.$anonfun$withCollectionDo$1(MongoConnector.scala:186)
        at com.mongodb.spark.MongoConnector.$anonfun$withDatabaseDo$1(MongoConnector.scala:171)
        at com.mongodb.spark.MongoConnector.withMongoClientDo(MongoConnector.scala:154)
        at com.mongodb.spark.MongoConnector.withDatabaseDo(MongoConnector.scala:171)
        at com.mongodb.spark.MongoConnector.withCollectionDo(MongoConnector.scala:184)
        at com.mongodb.spark.MongoSpark$.$anonfun$save$1(MongoSpark.scala:118)
        at com.mongodb.spark.MongoSpark$.$anonfun$save$1$adapted(MongoSpark.scala:117)
        at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2(RDD.scala:994)
        at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2$adapted(RDD.scala:994)
        at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2133)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:127)
        at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:444)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:447)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
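      A `NoSuchMethodError` like this usually indicates an incompatible jar on the classpath rather than a code bug: `com.mongodb.client.result.InsertManyResult` is a 4.x Java driver type, while the pinned `mongo-java-driver` 3.11.2 provides an older `insertMany` signature and can shadow the 4.x sync driver that `mongo-spark-connector` 3.0.0 was compiled against. As a hedged sketch of a possible fix (not a confirmed resolution from this ticket), the build could drop the legacy 3.x driver and let the connector pull in its own driver:

      ```scala
      // build.sbt (sketch): remove the pinned 3.x mongo-java-driver so that
      // mongo-spark-connector 3.0.0 resolves the 4.x sync driver it expects.
      libraryDependencies ++= Seq(
        "org.mongodb.scala" %% "mongo-scala-driver" % "4.1.0",   // 4.x Scala driver (version as in the report)
        "org.mongodb.spark" %% "mongo-spark-connector" % "3.0.0"
        // "org.mongodb" % "mongo-java-driver" % "3.11.2"        // removed: 3.x driver lacks InsertManyResult
      )
      ```

      If code elsewhere still needs the legacy API, aligning all MongoDB artifacts on a single 4.x driver line is the usual alternative to mixing major driver versions.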


      Can someone assist?


        Attachments

          Activity

            People

            Assignee:
            ross.lawley Ross Lawley
            Reporter:
            wajidp@gmail.com Wajid P
            Votes:
            0
            Watchers:
            2

              Dates

              Created:
              Updated:
              Resolved: