Field values in MongoDB logs are hidden


    • Type: Task
    • Resolution: Works as Designed
    • Priority: Major - P3
    • Affects Version/s: 2.1.6
    • Component/s: Logging, Writes

      The Spark connector is inserting documents into MongoDB with the following configuration. I am trying to capture the error logs when duplicate documents are inserted into MongoDB.

       

      The Spark Connector inserts into a MongoDB sharded cluster, with a unique index on the shard key.

       

      In the mongod logs, the shard key field values are hidden, so I am unable to tell which value is being duplicated.

       

      There is also no visibility from the Spark Connector to capture these logs.
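
      Since mongod redacts the values, one workaround is to capture the duplicate-key errors on the Spark driver side instead. A minimal sketch, assuming the failure from the MongoSpark.save call in the code below surfaces in the driver as an org.apache.spark.SparkException whose cause chain contains a com.mongodb.MongoBulkWriteException (error code 11000 is a duplicate key error):

      import scala.collection.JavaConverters._
      import scala.util.{Failure, Success, Try}

      import com.mongodb.MongoBulkWriteException
      import org.apache.spark.SparkException

      Try(MongoSpark.save(mongoRdd)) match {
        case Success(_) => println("Save completed")
        case Failure(e: SparkException) =>
          // Walk the cause chain looking for the bulk-write exception
          Iterator.iterate[Throwable](e)(_.getCause)
            .takeWhile(_ != null)
            .collectFirst { case bwe: MongoBulkWriteException => bwe }
            .foreach { bwe =>
              bwe.getWriteErrors.asScala
                .filter(_.getCode == 11000) // 11000 = duplicate key error
                .foreach(err => println(s"Duplicate key write error: ${err.getMessage}"))
            }
        case Failure(other) => throw other
      }

      Whether the write-error message itself includes the offending key values depends on the server version and any redaction settings, but this at least surfaces the failures on the Spark side.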

       

      Spark Connector code:

      import java.io.File

      import com.mongodb.spark.MongoSpark
      import com.typesafe.config.ConfigFactory

      object Write2Mongo {
        def main(args: Array[String]): Unit = {
          if (args.length != 2) {
            println("Usage: <CONF FILE> <STG DIR>")
            sys.exit(1)
          }

          val confFile = args(0)
          val stgDir = args(1)

          // Build the Spark session from the supplied config file.
          // MongoContext and convertBytes2Bson are project-specific helpers.
          val config = ConfigFactory.parseFile(new File(confFile))
          val spark = MongoContext.buildSparkContext(config)

          // Read the staged Parquet files and convert each row's raw bytes
          // back into a BSON document.
          val mongoRdd = spark.read.parquet(stgDir).rdd
            .map(x => convertBytes2Bson(x.getAs[Array[Byte]]("document")))

          MongoSpark.save(mongoRdd)
        }
      }
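
      Alternatively, duplicates could be dropped before the save so they never reach mongod. A minimal sketch, assuming a hypothetical shard key field named "orderKey" and that convertBytes2Bson returns org.bson.Document:

      // Keep one arbitrary document per shard key value.
      // ("orderKey" is a hypothetical field name, not from the original report.)
      val dedupedRdd = mongoRdd
        .keyBy(doc => doc.get("orderKey"))
        .reduceByKey((first, _) => first)
        .values

      MongoSpark.save(dedupedRdd)

      This trades an extra shuffle for clean inserts; which document survives per key is arbitrary.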

              Assignee: Ross Lawley
              Reporter: Dheeraj Gunda
              Votes: 0
              Watchers: 2
