[KAFKA-145] org.bson.BsonInvalidOperationException: Value expected to be of type DOCUMENT is of unexpected type NULL Created: 21/Aug/20  Updated: 28/Oct/23  Resolved: 07/Sep/20

Status: Closed
Project: Kafka Connector
Component/s: Source
Affects Version/s: 1.2.0
Fix Version/s: 1.3.0

Type: Bug Priority: Major - P3
Reporter: Rajaramesh Yaramati Assignee: Ross Lawley
Resolution: Fixed Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment:

kafka version: 2.6


Attachments: PNG File Screen Shot 2020-08-22 at 11.06.11 AM.png    

 Description   

My source connector is terminating with this error:

[2020-08-20 20:05:52,666] ERROR WorkerSourceTask{id=mongo-source-assets-mongos-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:187)
org.bson.BsonInvalidOperationException: Value expected to be of type DOCUMENT is of unexpected type NULL
at org.bson.BsonValue.throwIfInvalidType(BsonValue.java:419)
at org.bson.BsonValue.asDocument(BsonValue.java:47)
at org.bson.BsonDocument.getDocument(BsonDocument.java:136)
at com.mongodb.kafka.connect.source.MongoSourceTask.poll(MongoSourceTask.java:189)
at org.apache.kafka.connect.runtime.WorkerSourceTask.poll(WorkerSourceTask.java:289)
at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:256)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:185)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:235)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:830)
[2020-08-20 20:05:52,667] ERROR WorkerSourceTask{id=mongo-source-assets-mongos-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:188)
[2020-08-20 20:05:52,667] INFO Stopping MongoDB source task (com.mongodb.kafka.connect.source.MongoSourceTask:223)
[2020-08-20 20:05:52,669] INFO Closed connection [connectionId{localValue:3}] to 10.74.3.104:27017 because the pool has been closed. (org.mongodb.driver.connection:71)
[2020-08-20 20:05:52,669] INFO [Producer clientId=connector-producer-mongo-source-assets-mongos-0] Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer:1189)

Can someone please take a look at this error and tell me whether this is expected behavior?

Thanks,

Rajaramesh
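
For reference, the exception above can be reproduced outside the connector: org.bson.BsonDocument.getDocument throws exactly this error when the stored value is BsonNull rather than a document. A minimal sketch, assuming only the bson library on the classpath (the class name is illustrative, not taken from the connector):

import org.bson.BsonDocument;
import org.bson.BsonNull;

public class FullDocumentNullRepro {
    public static void main(String[] args) {
        // A change event whose fullDocument field is present but explicitly null.
        BsonDocument changeEvent = new BsonDocument("fullDocument", BsonNull.VALUE);
        // Throws org.bson.BsonInvalidOperationException:
        // "Value expected to be of type DOCUMENT is of unexpected type NULL"
        BsonDocument fullDoc = changeEvent.getDocument("fullDocument");
        System.out.println(fullDoc);
    }
}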



 Comments   
Comment by Gil De Grove [ 25/Sep/20 ]

We discovered the same bug.

We think this may be related to what is explained in the connector documentation: if a document receives a lot of updates and is then deleted, the change event is sent with a null full_document.

I think the check you added should prevent this issue entirely.
Thanks
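
To make the scenario concrete: a delete change event carries no document to publish, and an update event's looked-up fullDocument comes back null when the document was already deleted before the lookup ran. A small driver-side sketch that shows this, assuming a locally reachable MongoDB (connection string, database and collection names are placeholders):

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import com.mongodb.client.model.changestream.FullDocument;
import org.bson.Document;

public class ObserveNullFullDocument {
    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> coll =
                    client.getDatabase("oz_next").getCollection("assets");
            // UPDATE_LOOKUP asks the server to attach the current version of the
            // document to update events; if the document has since been deleted,
            // there is nothing to attach and getFullDocument() returns null.
            coll.watch()
                .fullDocument(FullDocument.UPDATE_LOOKUP)
                .forEach((ChangeStreamDocument<Document> event) ->
                        System.out.println(event.getOperationType() + " -> "
                                + event.getFullDocument()));
        }
    }
}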

 

Comment by Githook User [ 07/Sep/20 ]

Author:

{'name': 'Ross Lawley', 'email': 'ross.lawley@gmail.com', 'username': 'rozza'}

Message: Ensure the fullDocument field is a document

KAFKA-145
Branch: master
https://github.com/mongodb/mongo-kafka/commit/0c85b67160c143c0f8df714517ecb7481242e1f1
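
The commit message only describes the fix at a high level; a minimal sketch of what such a guard can look like (illustrative only, not the actual diff from the commit):

import org.bson.BsonDocument;
import org.bson.BsonValue;

final class FullDocumentGuard {
    // Only unwrap fullDocument when it really is a document; a BsonNull value
    // (or a missing field) would otherwise trigger BsonInvalidOperationException.
    static BsonDocument valueDocument(BsonDocument changeEvent) {
        BsonValue fullDocument = changeEvent.get("fullDocument");
        if (fullDocument != null && fullDocument.isDocument()) {
            return fullDocument.asDocument();
        }
        return changeEvent; // fall back to the whole change event
    }
}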

Comment by Ross Lawley [ 07/Sep/20 ]

Hi yaramati@adobe.com,

From that error message, it looks like the source connector is encountering a fullDocument field that is not a document but rather is set as null. I'm not sure in what scenario the field would be published with a null value. What version of MongoDB are you running?

I'll add an extra check in 1.3

Ross

Comment by Ross Lawley [ 07/Sep/20 ]

Hi there, thank you for reaching out. As this sounds like a support issue, I wanted to give you some resources to get this question answered more quickly:

  • our MongoDB support portal, located at support.mongodb.com
  • our MongoDB community portal, located here
  • If you are an Atlas customer, there is free support offered 24/7 in the lower right hand corner of the UI.
    Just in case you have already opened a support case and are not receiving sufficient help, please let me know and I can facilitate escalating your issue.

Thank you!

 

Comment by Rajaramesh Yaramati [ 22/Aug/20 ]

My source connector config:

 

kafka-connector:config# curl localhost:9083/connectors/mongo-source-assets-mongos/tasks|jq
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   738  100   738    0     0   124k      0 --:--:-- --:--:-- --:--:--  144k
[
  {
    "id": {
      "connector": "mongo-source-assets-mongos",
      "task": 0
    },
    "config": {
      "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
      "publish.full.document.only": "true",
      "batch.size": "10000",
      "collection": "assets",
      "copy.existing.queue.size": "64000",
      "key.converter.schemas.enable": "false",
      "database": "oz_next",
      "topic.prefix": "oplog.oz_mongo",
      "task.class": "com.mongodb.kafka.connect.source.MongoSourceTask",
      "poll.await.time.ms": "1000",
      "value.converter.schemas.enable": "false",
      "connection.uri": "mongodb://xxx.xxx.xxx.xxx:27017",
      "name": "mongo-source-assets-mongos",
      "copy.existing": "true",
      "value.converter": "org.apache.kafka.connect.json.JsonConverter",
      "key.converter": "org.apache.kafka.connect.json.JsonConverter",
      "poll.max.batch.size": "5000"
    }
  }
]
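
Since the log above says the failed task "will not recover until manually restarted", once the underlying issue is addressed the task can be restarted through the same Kafka Connect REST API used above (assuming the same port):

kafka-connector:config# curl -s localhost:9083/connectors/mongo-source-assets-mongos/status | jq '.tasks'
kafka-connector:config# curl -X POST localhost:9083/connectors/mongo-source-assets-mongos/tasks/0/restart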
