Type: Task
Resolution: Works as Designed
Priority: Major - P3
Affects Version/s: None
Component/s: None
Labels: None
My project's Scala, Hadoop, and Spark versions are as follows:
<scala.version>2.12.10</scala.version>
<hadoop.version>2.7.3</hadoop.version>
<spark.version>3.0.0</spark.version>
pom.xml is:
<dependency>
  <groupId>org.mongodb.spark</groupId>
  <artifactId>mongo-spark-connector_2.12</artifactId>
  <version>3.0.0</version>
</dependency>
When I run MongoSpark.save(sparkDocuments, writeConfig), I get this error: 'Caused by: java.lang.ClassNotFoundException: com.mongodb.client.result.InsertManyResult'
reference: https://docs.mongodb.com/spark-connector/master/java/write-to-mongodb
Could you help me? Thanks!
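For context: com.mongodb.client.result.InsertManyResult is a class from the 4.x line of the MongoDB Java driver, so a ClassNotFoundException for it usually means an older 3.x driver ended up on the runtime classpath instead of the 4.x driver that mongo-spark-connector 3.0.0 builds against. A possible sketch of a fix, assuming the conflict comes from a transitive dependency (the exact driver version 4.0.5 below is an illustrative assumption, not taken from this ticket), is to pin the 4.x sync driver explicitly in pom.xml:

```xml
<!-- Hypothetical fix: pin a 4.x mongodb-driver-sync so that
     InsertManyResult is present at runtime; the version number
     4.0.5 is an assumption chosen for illustration. -->
<dependency>
  <groupId>org.mongodb</groupId>
  <artifactId>mongodb-driver-sync</artifactId>
  <version>4.0.5</version>
</dependency>
```

Running `mvn dependency:tree` can show whether another dependency is pulling in an older org.mongodb driver that shadows the 4.x classes.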