Spark Connector / SPARK-193

Reading from MongoDB views using Spark Connector 2.2 throws 'Namespace xyz is a view, not a collection' error

    • Type: Bug
    • Resolution: Works as Designed
    • Priority: Major - P3
    • Affects Version/s: None
    • Component/s: None
    • Labels: None

      Environment:
      Spark 2.2.1
      Spark Connector 2.2.2
      Scala 2.11.8

      Reading from a `View` in the Spark shell using MongoDB Spark Connector 2.2 throws this error:

      Command failed with error 166: 'Namespace DBNAME.VIEWNAME is a view, not a collection' on server 127.0.0.1:36211. The full response is { "ok" : 0.0, "errmsg" : "Namespace DBNAME.VIEWNAME is a view, not a collection", "code" : 166, "codeName" : "CommandNotSupportedOnView" }
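
      The stack traces below show where this comes from: the connector's default partitioner (MongoSamplePartitioner) runs the collStats command to size partitions, and MongoDB refuses collStats against a view (error 166, CommandNotSupportedOnView), which is consistent with the "Works as Designed" resolution. A minimal sketch (plain Java driver from Scala, host/port taken from the repro below) reproduces the same server-side rejection outside Spark:

      import com.mongodb.{MongoClient, MongoCommandException}
      import org.bson.Document

      // Run the same command the partitioner runs; MongoDB rejects
      // collStats on a view, which is exactly the error in the logs below.
      val client = new MongoClient("127.0.0.1", 36211)
      try {
        client.getDatabase("test").runCommand(new Document("collStats", "test_view"))
      } catch {
        case e: MongoCommandException =>
          println(e.getErrorCode)     // 166
          println(e.getErrorCodeName) // CommandNotSupportedOnView
      } finally {
        client.close()
      }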

      Detailed steps to reproduce this issue:

      ------------------------------------------------------------------------------

      Reading from a View in the Spark shell, using the MongoDB Spark Connector

      This error is reproducible via the Spark shell with Spark 2.2.1, Spark Connector 2.2.2, and Scala 2.11.8.

      On the MongoDB side, I created a view called test_view:

      MongoDB Enterprise > db.createView("test_view", "coll1", [{$match: {"a" : 1}}])
      { "ok" : 1 }
      MongoDB Enterprise > db.test_view.find()
      { "_id" : ObjectId("5b3fe386b35fb556a4aa839f"), "a" : 1, "b" : 2 }

       

      Then I started spark-shell with the following command; note that spark.mongodb.input.uri points at the view test_view:

      /Users/harshaddhavale/Documents/Apache-Spark/spark-2.2.1-bin-hadoop2.7/bin/spark-shell --conf "spark.mongodb.input.uri=mongodb://127.0.0.1:36211/test.test_view?readPreference=primaryPreferred" \
       --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.2
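
      For completeness, the same read can also be configured in-session rather than on the command line; a sketch using the connector's ReadConfig (same namespace as the URI above):

      import com.mongodb.spark._
      import com.mongodb.spark.config.ReadConfig

      // Equivalent in-session configuration; "collection" names the view,
      // which is what triggers the failure shown below.
      val readConfig = ReadConfig(
        Map("database" -> "test", "collection" -> "test_view"),
        Some(ReadConfig(sc)))
      val rdd = MongoSpark.load(sc, readConfig)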

       

      When I tried to read the view in the Spark shell, it threw this error:

      Welcome to
            ____              __
           / __/__  ___ _____/ /__
          _\ \/ _ \/ _ `/ __/ '_/
         /___/ .__/\_,_/_/ /_/\_\   version 2.2.1
            /_/
      Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_172)
      Type in expressions to have them evaluated.
      Type :help for more information.
      scala>
      scala> import com.mongodb.spark._
      import com.mongodb.spark._
      scala> val rdd = MongoSpark.load(sc)
      18/07/11 11:52:50 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
      rdd: com.mongodb.spark.rdd.MongoRDD[org.bson.Document] = MongoRDD[1] at RDD at MongoRDD.scala:56
      
      scala> println(rdd.first.toJson)
      18/07/11 11:52:50 WARN MongoSamplePartitioner: Could not get collection statistics. Server errmsg: Command failed with error 166: 'Namespace test.test_view is a view, not a collection' on server 127.0.0.1:36211. The full response is { "ok" : 0.0, "errmsg" : "Namespace test.test_view is a view, not a collection", "code" : 166, "codeName" : "CommandNotSupportedOnView" }
      com.mongodb.MongoCommandException: Command failed with error 166: 'Namespace test.test_view is a view, not a collection' on server 127.0.0.1:36211. The full response is { "ok" : 0.0, "errmsg" : "Namespace test.test_view is a view, not a collection", "code" : 166, "codeName" : "CommandNotSupportedOnView" }
       at com.mongodb.connection.ProtocolHelper.getCommandFailureException(ProtocolHelper.java:115)
       at com.mongodb.connection.CommandProtocol.execute(CommandProtocol.java:114)
       at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:168)
       at com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:289)
       at com.mongodb.connection.DefaultServerConnection.command(DefaultServerConnection.java:176)
       at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:216)
       at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:187)
       at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:179)
       at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:92)
       at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:85)
       at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
       at com.mongodb.Mongo.execute(Mongo.java:836)
       at com.mongodb.Mongo$2.execute(Mongo.java:823)
       at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:137)
       at com.mongodb.spark.rdd.partitioner.PartitionerHelper$$anonfun$collStats$1.apply(PartitionerHelper.scala:88)
       at com.mongodb.spark.rdd.partitioner.PartitionerHelper$$anonfun$collStats$1.apply(PartitionerHelper.scala:88)
       at com.mongodb.spark.MongoConnector$$anonfun$withDatabaseDo$1.apply(MongoConnector.scala:171)
       at com.mongodb.spark.MongoConnector$$anonfun$withDatabaseDo$1.apply(MongoConnector.scala:171)
       at com.mongodb.spark.MongoConnector.withMongoClientDo(MongoConnector.scala:154)
       at com.mongodb.spark.MongoConnector.withDatabaseDo(MongoConnector.scala:171)
       at com.mongodb.spark.rdd.partitioner.PartitionerHelper$.collStats(PartitionerHelper.scala:88)
       at com.mongodb.spark.rdd.partitioner.MongoSamplePartitioner$$anonfun$2.apply(MongoSamplePartitioner.scala:76)
       at com.mongodb.spark.rdd.partitioner.MongoSamplePartitioner$$anonfun$2.apply(MongoSamplePartitioner.scala:76)
       at scala.util.Try$.apply(Try.scala:192)
       at com.mongodb.spark.rdd.partitioner.MongoSamplePartitioner.partitions(MongoSamplePartitioner.scala:76)
       at com.mongodb.spark.rdd.partitioner.DefaultMongoPartitioner.partitions(DefaultMongoPartitioner.scala:34)
       at com.mongodb.spark.rdd.MongoRDD.getPartitions(MongoRDD.scala:141)
       at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252)
       at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250)
       at scala.Option.getOrElse(Option.scala:121)
       at org.apache.spark.rdd.RDD.partitions(RDD.scala:250)
       at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1333)
       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
       at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
       at org.apache.spark.rdd.RDD.take(RDD.scala:1327)
       at org.apache.spark.rdd.RDD$$anonfun$first$1.apply(RDD.scala:1368)
       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
       at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
       at org.apache.spark.rdd.RDD.first(RDD.scala:1367)
       ... 52 elided
      
      scala> println(rdd.count)
      18/07/11 11:52:53 WARN MongoSamplePartitioner: Could not get collection statistics. Server errmsg: Command failed with error 166: 'Namespace test.test_view is a view, not a collection' on server 127.0.0.1:36211. The full response is { "ok" : 0.0, "errmsg" : "Namespace test.test_view is a view, not a collection", "code" : 166, "codeName" : "CommandNotSupportedOnView" }
      com.mongodb.MongoCommandException: Command failed with error 166: 'Namespace test.test_view is a view, not a collection' on server 127.0.0.1:36211. The full response is { "ok" : 0.0, "errmsg" : "Namespace test.test_view is a view, not a collection", "code" : 166, "codeName" : "CommandNotSupportedOnView" }
       [... stack frames identical to the previous trace ...]
       at org.apache.spark.SparkContext.runJob(SparkContext.scala:2094)
       at org.apache.spark.rdd.RDD.count(RDD.scala:1158)
       ... 52 elided
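
      Note that both rdd.first and rdd.count fail in the same place: MongoRDD.getPartitions -> MongoSamplePartitioner.partitions -> PartitionerHelper.collStats. The exception is raised while computing partitions, before a single document is read, so any action on the RDD hits it:

      // Sketch: actions materialize partitions first, so even asking for
      // them directly raises the same MongoCommandException.
      val parts = rdd.partitions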
      

      ------------------------------------------------------------------------------

      Reading from the underlying collection (on which the view is based) in the Spark shell, using the MongoDB Spark Connector

      I started spark-shell with the following command; note that spark.mongodb.input.uri points at the collection coll1:

      /Users/harshaddhavale/Documents/Apache-Spark/spark-2.2.1-bin-hadoop2.7/bin/spark-shell --conf "spark.mongodb.input.uri=mongodb://127.0.0.1:36211/test.coll1?readPreference=primaryPreferred" \
       --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.2

      When I tried to read the underlying collection in the Spark shell, it worked just fine:

      Welcome to
            ____              __
           / __/__  ___ _____/ /__
          _\ \/ _ \/ _ `/ __/ '_/
         /___/ .__/\_,_/_/ /_/\_\   version 2.2.1
            /_/
      Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_172)
      Type in expressions to have them evaluated.
      Type :help for more information.
      scala>
      scala> import com.mongodb.spark._
      import com.mongodb.spark._
      scala> val rdd = MongoSpark.load(sc)
      18/07/11 12:04:34 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
      18/07/11 12:04:34 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
      rdd: com.mongodb.spark.rdd.MongoRDD[org.bson.Document] = MongoRDD[0] at RDD at MongoRDD.scala:56
      scala> println(rdd.first.toJson)
      { "_id" : { "$oid" : "5b3fe386b35fb556a4aa839f" }, "a" : 1.0, "b" : 2.0 }
      scala> println(rdd.count)
      1
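
      Since the source collection reads fine, one possible workaround (a sketch, not an officially documented recipe) is to keep input.uri pointed at test.coll1 and re-apply the view's pipeline with MongoRDD.withPipeline, which pushes the $match stage to the server:

      import com.mongodb.spark._
      import org.bson.Document

      // Reproduce the view's results from its source collection; the
      // partitioner then runs collStats against coll1, which succeeds.
      val rdd = MongoSpark.load(sc)
      val viewLike = rdd.withPipeline(Seq(Document.parse("{ $match: { a: 1 } }")))
      println(viewLike.first.toJson) // { "_id" : ..., "a" : 1.0, "b" : 2.0 }
      println(viewLike.count)        // 1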
       
      
      

      ------------------------------------------------------------------------------

            Assignee: Ross Lawley (ross@mongodb.com)
            Reporter: Harshad Dhavale (harshad.dhavale@mongodb.com)
            Votes: 0
            Watchers: 2
