[KAFKA-149] After applying the update 'immutable' was found to have been altered Created: 28/Aug/20 Updated: 28/Aug/20 Resolved: 28/Aug/20
| Status: | Closed |
| Project: | Kafka Connector |
| Component/s: | Sink |
| Affects Version/s: | 1.2.0 |
| Fix Version/s: | None |
| Type: | Bug | Priority: | Major - P3 |
| Reporter: | Rajaramesh Yaramati | Assignee: | Ross Lawley |
| Resolution: | Duplicate | Votes: | 0 |
| Labels: | None |
| Remaining Estimate: | Not Specified |
| Time Spent: | Not Specified |
| Original Estimate: | Not Specified |
| Environment: | kafka 2.13 |
| Issue Links: | |
| Documentation Changes Summary: | |
| Description |
After the initial sync completes, the connector terminates with the error below. Shouldn't these errors be tolerated according to the dead-letter-queue configuration settings (https://docs.mongodb.com/kafka-connector/master/kafka-sink-properties/#dead-letter-queue-configuration-settings)? My understanding of the DLQ is that any such error should be tolerated and processing should continue with the next message, but I don't see that behavior.
java.lang.StackOverflowError
Source & Sink configuration:
Thanks, Rajaramesh
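(For context: the dead-letter-queue settings referenced above are standard Kafka Connect sink connector properties. A minimal sketch follows; the topic name kafka149-dlq is an illustrative assumption, not taken from this ticket's configuration.)

```properties
# Tolerate record-level errors instead of failing the task
errors.tolerance=all
# Route failed records to a dead letter topic (topic name is an assumption)
errors.deadletterqueue.topic.name=kafka149-dlq
errors.deadletterqueue.topic.replication.factor=1
# Record the failure reason in the dead-lettered message's headers
errors.deadletterqueue.context.headers.enable=true
# Also log failed operations and the failing messages
errors.log.enable=true
errors.log.include.messages=true
```

Note that these settings cover the converter and transform stages of the sink pipeline; errors thrown from the connector's own put() call are not routed to the dead letter queue.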
| Comments |
| Comment by Ross Lawley [ 28/Aug/20 ] |
Please see the excellent blog post Kafka Connect Deep Dive – Error Handling and Dead Letter Queues for more information about the Kafka dead letter queue feature, specifically the "Where is error handling NOT provided by Kafka Connect?" section. As it explains, the put part of a sink connector is not covered by Kafka's dead letter queues. Since this sounds like a question / support issue, for future reference I wanted to give you some resources to get this question answered more quickly:
Just in case you have already opened a support case and are not receiving sufficient help, please let me know and I can facilitate escalating your issue. Thank you! |