Closed godocean closed 6 years ago
Hi @godocean, have you fixed this issue yet? I've run into the same one.
Maybe you should switch to another log collector component such as Logstash or Flume. Give it a try; it worked for me.
Maybe you sent too large a message? I get the same problem too. When I send a short message everything is fine, but when I send a 50 KB message this problem always shows up.
nginx error log:

[error] 30952#0: 30248161 lua tcp socket read timed out, context: ngx.timer, client: xxxx, server: xxxx
[error] 30952#0: 30248161 [lua] producer.lua:258: buffered messages send to kafka err: timeout, retryable: nil, topic: xxxx, partition_id: 1, length: 1, context: ngx.timer, client: xxxx, server: xxxx
It does not happen frequently, though. Kafka works fine and everything else is OK. I have set request_timeout to 10000 ms like this:
local bp = producer:new(broker_list, { producer_type = "async", request_timeout = 10000 })
but it does not seem to work. Could you help me out?
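Not the original poster, but one thing worth checking: the "lua tcp socket read timed out" message comes from the OpenResty cosocket read timeout, which in lua-resty-kafka is controlled by the separate `socket_timeout` option (default 3000 ms), not by `request_timeout`. A minimal sketch of raising both, assuming the producer options documented in the lua-resty-kafka README (`socket_timeout`, `max_retry`, `retry_backoff`); the specific values here are illustrative guesses, not recommendations:

```lua
local producer = require "resty.kafka.producer"

-- Sketch: raise socket_timeout alongside request_timeout, since the
-- "lua tcp socket read timed out" error is raised by the cosocket
-- read timeout, not by request_timeout alone.
local bp = producer:new(broker_list, {
    producer_type   = "async",
    request_timeout = 10000,  -- per-request timeout sent to the broker (ms)
    socket_timeout  = 10000,  -- cosocket connect/send/read timeout (ms); default 3000
    max_retry       = 3,      -- retries before a batch is dropped
    retry_backoff   = 200,    -- wait between retries (ms)
})
```

If the timeouts only occur for large messages, the larger payload may simply take longer than 3 s to round-trip under load, which would explain why short messages are unaffected.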