
[SPARK-253] Field values in MongoDB logs are hidden

    • Type: Task
    • Resolution: Works as Designed
    • Priority: Major - P3
    • Fix Version/s: None
    • Affects Version/s: 2.1.6
    • Component/s: Logging, Writes
    • Labels:

      The Spark connector is inserting documents into MongoDB with the following configuration. I am trying to capture the error logs when duplicate documents are inserted into MongoDB.

      The Spark Connector inserts into a MongoDB sharded cluster, with a unique index enforced on the shard key.
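
      For reference, such a collection might be set up as follows. This is a minimal sketch using the MongoDB Java driver from Scala; the mongos address, database, collection, and shard key field name ("docId") are all hypothetical:

      import com.mongodb.client.MongoClients
      import org.bson.Document

      val client = MongoClients.create("mongodb://mongos-host:27017") // hypothetical mongos
      val admin  = client.getDatabase("admin")

      // Shard "mydb.mycoll" on a hypothetical "docId" field and enforce
      // uniqueness on the shard key (the only unique constraint a sharded
      // collection can enforce cluster-wide).
      admin.runCommand(
        new Document("shardCollection", "mydb.mycoll")
          .append("key", new Document("docId", 1))
          .append("unique", true))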

      In the mongod logs, the shard key field values are hidden, so I am unable to determine which value is being duplicated.
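
      One thing worth checking here: MongoDB Enterprise can redact field values from log lines (replacing them with "###") when the redactClientLogData server parameter is enabled. A minimal sketch for inspecting that parameter, reusing the hypothetical client above:

      // Query the Enterprise-only redactClientLogData server parameter;
      // if it is true, mongod masks field values in log messages.
      val param = admin.runCommand(
        new Document("getParameter", 1).append("redactClientLogData", 1))
      println(param.toJson)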

      There is also no visibility from the Spark Connector side to capture these errors.

      Spark Connector code:

      import java.io.File

      import com.mongodb.spark.MongoSpark
      import com.typesafe.config.ConfigFactory

      object Write2Mongo {
        def main(args: Array[String]): Unit = {
          if (args.length != 2) {
            println("Usage: <CONF FILE> <STG DIR>")
            sys.exit(1)
          }

          val confFile = args(0)
          val stgDir = args(1)

          /**************************************************
            * Build Spark context
            **************************************************/
          val config = ConfigFactory.parseFile(new File(confFile))
          val spark = MongoContext.buildSparkContext(config) // user-defined helper

          // Convert each staged Parquet row's raw bytes back into a BSON document
          val mongoRdd = spark.read.parquet(stgDir).rdd
            .map(x => convertBytes2Bson(x.getAs[Array[Byte]]("document"))) // user-defined helper

          MongoSpark.save(mongoRdd)
        }
      }
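
      Since the connector surfaces write failures as exceptions rather than log output, one workaround is to catch the failure around the MongoSpark.save call above and inspect the bulk-write errors directly. This is a minimal sketch, assuming the duplicate-key failure (error code 11000) reaches the driver program as a MongoBulkWriteException somewhere in the exception's cause chain:

      import com.mongodb.MongoBulkWriteException
      import org.apache.spark.SparkException
      import scala.collection.JavaConverters._

      try {
        MongoSpark.save(mongoRdd)
      } catch {
        case e: SparkException =>
          // Spark wraps executor-side failures; walk the cause chain
          // looking for the driver's bulk-write exception.
          Iterator.iterate[Throwable](e)(_.getCause)
            .takeWhile(_ != null)
            .collectFirst { case b: MongoBulkWriteException => b } match {
            case Some(bulk) =>
              bulk.getWriteErrors.asScala
                .filter(_.getCode == 11000) // 11000 = duplicate key
                .foreach(err => println(s"Duplicate key: ${err.getMessage}"))
            case None => throw e
          }
      }

      Depending on the Spark version, the original exception may not survive serialization from the executor, in which case only the message text is available on the driver.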

      Assignee: Ross Lawley (ross@mongodb.com)
      Reporter: Dheeraj Gunda (gundadheeraj8@gmail.com)
      Votes: 0
      Watchers: 2
