Type: Bug
Resolution: Done
Priority: Major - P3
Affects Version/s: 0.1
Component/s: None
Labels: None
Environment: Vagrant Ubuntu trusty64, 8G RAM
Performing the 'first' action on an empty collection causes the Spark shell to throw a java.lang.UnsupportedOperationException: empty collection. Given the following command line:
./bin/spark-shell --packages org.mongodb.spark:mongo-spark-connector_2.10:0.1 --conf "spark.mongodb.input.uri=mongodb://127.0.0.1/SFCab.wangdoodle"
The following commands cause the exception:
scala> import com.mongodb.spark._
import com.mongodb.spark._

scala> val rdd = sc.loadFromMongoDB()
rdd: com.mongodb.spark.rdd.MongoRDD[org.bson.Document] = MongoRDD[0] at RDD at MongoRDD.scala:160

scala> println(rdd.count)
16/04/05 10:31:45 WARN DefaultMongoPartitioner: Could not find collection (logscoll), using single partition
0

scala> println(rdd.first)
java.lang.UnsupportedOperationException: empty collection
Interestingly, the count and take operations work fine:
scala> println(rdd.count)
0

scala> println(rdd.take(1))
[Lorg.bson.Document;@4a15fe7
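Until first is guarded against an empty RDD, a caller can avoid the exception by building on take(1), which the transcript above shows returns an empty array instead of throwing. A minimal sketch of that workaround, using a plain Seq as a stand-in for the Spark RDD (the helper name safeFirst is hypothetical, not part of the connector API):

```scala
// Workaround sketch: take(1) yields an empty sequence on an empty collection
// rather than throwing, so wrapping it in Option gives a safe "first".
// safeFirst is a hypothetical helper; Seq stands in for the RDD here.
def safeFirst[A](xs: Seq[A]): Option[A] =
  xs.take(1).headOption

// Empty collection: returns None, no UnsupportedOperationException.
println(safeFirst(Seq.empty[Int]))
// Non-empty collection: returns the first element wrapped in Some.
println(safeFirst(Seq(1, 2, 3)))
```

The same take(1)-then-check pattern applies directly on the MongoRDD from the shell session above.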