Closed bloodgang94 closed 1 year ago
@bloodgang94
I ran your exact script (with no modifications) for a minute with 50 VUs and this is the result:
I am aware of the reader errors, as you can see in the logs. The writer produced 147k messages and the reader consumed 132k. The gap between messages produced and consumed correlates with the number of errors and your limit of 300. I tweaked the reader config:
import {
  Writer,
  Reader,
  Connection,
  SchemaRegistry,
  CODEC_SNAPPY,
  SCHEMA_TYPE_JSON,
  SECOND,
} from "k6/x/kafka";
const reader = new Reader({
  brokers: brokers,
  groupID: groupID,
  groupTopics: [topic],
  queueCapacity: 300,
  readBatchTimeout: 20 * SECOND,
  maxBytes: 10000000, // 10 MB
  readLagInterval: 10 * SECOND,
});
And this is the best result I can get:
I'll close this due to inactivity. Feel free to re-open it if you still have the issue.
@mostafa After making the changes, I am still getting the "Unable to read messages" error. What does it mean, and how can I avoid it?
Even in your result, the writer's message count and the reader's message count differ.
Hello! Thanks for your work. I have a question. I am using your example https://github.com/mostafa/xk6-kafka/blob/main/scripts/test_consumer_group.js and ran into a problem: if I try to consume all 300 messages, the consumer waits indefinitely. This happens in roughly 3 out of 5 attempts. Full code
logs
I have tried the rebalanceTimeout and sessionTimeout settings, but without success.
Could you suggest what I am doing wrong?
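A likely explanation for the hang, sketched outside of k6: a blocking consume call that asks for exactly N messages only returns once all N have arrived, so if even a few messages are lost to reader errors, the call waits until its timeout. The snippet below is a generic bounded-buffer simulation in plain Node.js, not the xk6-kafka internals; the names QueueConsumer and consume are hypothetical and chosen just for illustration.

```javascript
// Hypothetical simulation: why asking a consumer for more messages than
// were actually delivered blocks until a deadline instead of returning.
class QueueConsumer {
  constructor() {
    this.buffer = []; // messages already fetched from the broker
  }

  push(msg) {
    this.buffer.push(msg);
  }

  // Return up to `limit` messages, waiting at most `timeoutMs` for them.
  async consume(limit, timeoutMs) {
    const deadline = Date.now() + timeoutMs;
    const out = [];
    while (out.length < limit && Date.now() < deadline) {
      if (this.buffer.length > 0) {
        out.push(this.buffer.shift());
      } else {
        // Nothing buffered yet: poll until more arrives or the deadline hits.
        await new Promise((resolve) => setTimeout(resolve, 5));
      }
    }
    return out;
  }
}

async function main() {
  const consumer = new QueueConsumer();
  // 300 messages were written, but 10 were lost to reader errors.
  for (let i = 0; i < 290; i++) consumer.push(`msg-${i}`);

  // Asking for all 300 in one call stalls for the full deadline,
  // because the last 10 messages never arrive.
  const all = await consumer.consume(300, 200);
  console.log(all.length); // 290, but only after waiting out the 200 ms

  // Smaller batches return promptly, letting the caller decide when to
  // stop instead of hanging on messages that will never come.
  consumer.push("late");
  const batch = await consumer.consume(50, 200);
  console.log(batch.length); // 1
}

main();
```

In other words, if the script's consume loop demands the exact produced count while some messages fail to be read, the loop can never satisfy its target, which would match the "3 out of 5 attempts" hangs: they occur whenever at least one read error happens during the run.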