Type: Question
Resolution: Declined
Priority: Major - P3
Affects Version/s: None
Component/s: None
Labels: None
I am new to Apache Spark, which I am learning with Scala and MongoDB, following the Scala API guide at https://docs.mongodb.com/spark-connector/current/scala-api/. I am trying to read an RDD from my MongoDB database; my notebook script is as follows:

import com.mongodb.spark.config._
import com.mongodb.spark._

val readConfig = ReadConfig(Map("uri" -> "mongodb+srv://$USER:$PASSWORD@mongodbcluster.td5gp.mongodb.net/test_database.test_collection?retryWrites=true&w=majority"))
val testRDD = MongoSpark.load(sc, readConfig)
print(testRDD.collect)
At the print(testRDD.collect) line, I get this error:
java.lang.NoSuchMethodError: com.mongodb.internal.connection.Cluster.selectServer(Lcom/mongodb/selector/ServerSelector;)Lcom/mongodb/internal/connection/Server;
followed by more than 10 "at ..." stack-trace lines.
Used libraries:
org.mongodb.spark:mongo-spark-connector_2.12:3.0.1
org.mongodb.scala:mongo-scala-driver_2.12:4.2.3
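For reference, here is how these dependencies might look in sbt form (assuming sbt is the build tool, which is an assumption on my part). A NoSuchMethodError on an internal class like com.mongodb.internal.connection.Cluster usually suggests two different versions of the underlying MongoDB Java driver on the classpath, so one possible sketch is to declare only the Spark connector and let it pull in its own matching driver:

```scala
// build.sbt sketch (hedged, assuming sbt): declaring only the Spark connector
// avoids mixing in mongo-scala-driver 4.2.3, whose newer driver core may change
// internal APIs such as Cluster.selectServer and trigger the error above.
libraryDependencies ++= Seq(
  "org.mongodb.spark" %% "mongo-spark-connector" % "3.0.1"
)
```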
Is this a problem in MongoDB's internal libraries, or how can I fix it?

Many thanks.