Infinite loop if any message is logged by the kafka client library when the kafka server is down. #36
IIRC the process was like this:
When kafka really is unavailable, the […] The basic mechanic is: […] The reasoning behind this stunt is #1 and #11 (logback's bootstrap). I hope this clarifies a bit how it - at least - is intended to work :)
Hi @danielwegener, my reproducible environment is: add kafkaAppender to the root logger and set this logger's level to trace. If I shut down the kafka server before starting this Spring Boot application, the application never finishes initializing and keeps printing the following log messages again and again and again (from another file appender). In my production environment, I set the root logger level to error, and this works fine when kafka is not available. By the way, thank you very much for this project. It really speeds up my development.
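For reference, a minimal logback.xml along these lines should reproduce the setup described above. This is only a sketch: the appender names, file path, topic, and bootstrap.servers address are made up for illustration, and the exact encoder/strategy elements of the KafkaAppender differ between versions of this library.

```xml
<!-- Hypothetical logback.xml illustrating the reported configuration. -->
<configuration>

  <!-- Plain file appender; also acts as the local fallback output. -->
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>app.log</file>
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <!-- KafkaAppender from logback-kafka-appender; broker address is a placeholder. -->
  <appender name="kafkaAppender"
            class="com.github.danielwegener.logback.kafka.KafkaAppender">
    <encoder>
      <pattern>%msg%n</pattern>
    </encoder>
    <topic>application-logs</topic>
    <producerConfig>bootstrap.servers=localhost:9092</producerConfig>
    <!-- Fallback appender used for events that cannot be delivered to kafka. -->
    <appender-ref ref="FILE"/>
  </appender>

  <!-- Root logger at trace with the kafka appender attached: when the broker
       is down, trace/debug output from org.apache.kafka.clients is fed back
       into the kafka appender, which is the loop described in this issue. -->
  <root level="trace">
    <appender-ref ref="kafkaAppender"/>
    <appender-ref ref="FILE"/>
  </root>

</configuration>
```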
@xiangyihong would you be so kind as to check against the code in #39? This kind of thing is very difficult to test, even with the embedded integration tests, for example.
@danielwegener at the very least my previous contribution was missing the metrics logger prefix, which was used at trace level. I was also concerned whether the synchronized block in the lazy producer could interleave with the locking in the Kafka sender. Moving the code further into the […] To test, I used a version of […]
I've also just noticed @yangqiju's PR :)
@danielwegener @aerskine - I read about this issue and it appears it has been resolved. Can you please provide me with the steps to resolve it? Please see my log config:
For any logs from the kafka client library, the strategy is to defer these messages and drain them before the next log.
However, if the kafka server is down, this may lead to an infinite loop, especially when the log level is low (trace, debug, or even info):
production code uses the kafka appender to log --> the kafka appender hands the message to the kafka client library --> the client cannot connect to the kafka server, so the kafka client library logs messages of its own --> the kafka appender picks up these messages and tries to send them through the kafka client library --> ...
Would there be any fix for this issue? How about we swallow the logs from the kafka client library? Users would then have to configure another logger for org.apache.kafka.clients.
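If that suggestion were adopted, the user-side configuration would presumably be a dedicated, non-additive logger that routes the kafka client library's own output to a local appender only (the FILE appender name here is hypothetical), so those events can never be fed back into the kafka appender:

```xml
<!-- Sketch: keep org.apache.kafka.clients output out of the kafka appender. -->
<logger name="org.apache.kafka.clients" level="info" additivity="false">
  <appender-ref ref="FILE"/>
</logger>
```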