[KAFKA-365] The data should exceed 1.4mb, causing the producer send to fail Created: 24/Apr/23  Updated: 27/Oct/23  Resolved: 22/May/23

Status: Closed
Project: Kafka Connector
Component/s: None
Affects Version/s: 1.9.1
Fix Version/s: None

Type: Question Priority: Major - P3
Reporter: 晓阳 刘 Assignee: Robert Walters
Resolution: Gone away Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Attachments: PNG File screenshot-1.png     PNG File screenshot-2.png     PNG File screenshot-3.png    

 Comments   
Comment by Jeffrey Yemin [ 22/May/23 ]

OK liuxiaoyang904.61@gmail.com, thanks for following up.

Comment by 晓阳 刘 [ 22/May/23 ]

dbeng-pm-bot jeff.yemin@mongodb.com robert.walters@mongodb.com Hi,
I have other things to deal with recently and don't have the bandwidth to pursue this for the time being, so this Jira can be closed for now. The problem occurred in a test program, so it is not high priority.

Comment by PM Bot [ 18/May/23 ]

Hi liuxiaoyang904.61@gmail.com! KAFKA-365 is awaiting your response.

If this is still an issue for you, please open Jira to review the latest status and provide your feedback. Thanks!

Comment by PM Bot [ 11/May/23 ]

Hey liuxiaoyang904.61@gmail.com, we need additional details to investigate the problem. If this is still an issue for you, please provide the requested information.

Comment by 晓阳 刘 [ 04/May/23 ]

robert.walters@mongodb.com Hi, my Kafka config is 10 MB. I don't know how many bytes of data are causing the error; I'll check whether it exceeds 10 MB, although it's only text data.

Comment by Robert Walters [ 01/May/23 ]

liuxiaoyang904.61@gmail.com Can you ensure the following properties are set in your Kafka broker? This would make 15 MB the maximum (see the sketch after this list):

    • message.max.bytes = 15728640
    • replica.fetch.max.bytes = 15728640
    • max.request.size = 15728640
    • fetch.message.max.bytes = 15728640
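For reference, a rough sketch of where each of these properties typically lives; max.request.size is a producer-side setting, which for a Connect source connector means the Kafka Connect worker's internal producer (set via the producer. prefix in the worker properties). The property names below are standard Kafka / Kafka Connect settings to the best of my knowledge; the 15728640 values are illustrative:

    # broker: server.properties
    message.max.bytes=15728640
    replica.fetch.max.bytes=15728640

    # Kafka Connect worker properties (producer used by source connectors)
    producer.max.request.size=15728640

    # consumer side: fetch.message.max.bytes is the legacy consumer name;
    # newer consumers use max.partition.fetch.bytes / fetch.max.bytes
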
Comment by 晓阳 刘 [ 25/Apr/23 ]

jeff.yemin@mongodb.com log:
org.apache.kafka.connect.errors.ConnectException: Unrecoverable exception from producer send callback
at org.apache.kafka.connect.runtime.WorkerSourceTask.maybeThrowProducerSendException(WorkerSourceTask.java:258)
at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:312)
at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:240)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.common.errors.RecordTooLargeException: The message is 1445052 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.
----------------------------------------------------------------
My Kafka config is 10 MB.
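
Worth noting: the exception above refers to max.request.size, a producer setting whose default is 1048576 bytes (1 MB), so a 1,445,052-byte record is rejected by the Connect worker's producer even when the broker's message.max.bytes is 10 MB. A minimal sketch of raising it in the Connect worker configuration, assuming the connector runs under a standard Kafka Connect worker (the 10485760 value is illustrative):

    # connect-distributed.properties (or connect-standalone.properties)
    # raise the max request size of the producer Connect uses for source connectors
    producer.max.request.size=10485760

Worker-level producer settings are read at startup, so the worker would need a restart to pick this up.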

Comment by 晓阳 刘 [ 25/Apr/23 ]

robert.walters@mongodb.com
Testing shows that the 2 MB of data I sent through a plain Java producer is received normally without restriction, so I wonder whether the max.request.size parameter is simply not defined for the producer used inside the connector.
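
If the limit only needs to be raised for this connector, Kafka Connect 2.3+ also supports per-connector client overrides (KIP-458). A sketch, assuming the worker is configured to permit overrides; the property names are standard Connect settings and the value is illustrative:

    # Kafka Connect worker properties: allow connectors to override client configs
    connector.client.config.override.policy=All

    # MongoDB source connector configuration: override just this connector's producer
    producer.override.max.request.size=10485760

With the override policy set to All, the producer.override.* entries in the connector configuration are applied to the producer that Connect creates for this source task.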

Comment by Robert Walters [ 24/Apr/23 ]

By default, Kafka only handles messages of about 1 MB in size; it might be that your Kafka is not set up to handle more. I found this Stack Overflow question that suggests a few Kafka configurations to change the default size:

https://stackoverflow.com/questions/21020347/how-can-i-send-large-messages-with-kafka-over-15mb

Comment by Jeffrey Yemin [ 24/Apr/23 ]

Hi liuxiaoyang904.61@gmail.com,

It's difficult to say with the information provided. If you give us more context around this error we may be able to assist you:

  1. a reproduction scenario
  2. all relevant Kafka configuration properties
  3. the full stack trace

Comment by 晓阳 刘 [ 24/Apr/23 ]


But my Kafka configuration is 10 MB, and I didn't find any related send-parameter settings in the MongoDB Kafka connector documentation.

Comment by 晓阳 刘 [ 24/Apr/23 ]

Caused by: org.apache.kafka.common.errors.RecordTooLargeException: The message is 1445052 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.
