vasilypodguzov opened 3 years ago
Update: To load data into the topic, I added the following.

In docker-compose.yml:

```yaml
environment:
  KAFKA_MESSAGE_MAX_BYTES: 10000000
  KAFKA_SOCKET_REQUEST_MAX_BYTES: 100001200
```

And mounted a producer.properties file with these additional values:

```
max.request.size=10000000
message.max.bytes=10000000
```
But for now, I can only import the JSON from inside the Docker container with the following command, which succeeds:
kafka-console-producer.sh --bootstrap-server localhost:9092 --topic test_topic < /test.json --producer.config /opt/kafka/config/producer.properties
Is there a way to upload JSON into the topic without the console producer? Without the "--producer.config /opt/kafka/config/producer.properties" parameter, the import fails with:
The message is 6846307 bytes when serialized which is larger than 1048576, which is the value of the max.request.size configuration.
Thanks!
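Since the update above already raises max.request.size to 10000000, one console-free option is a small programmatic producer. A minimal sketch, assuming the kafka-python client (my choice, not something from this thread) and the broker address and topic name used above:

```python
import json

# Mirrors the thread's producer.properties value; kafka-python spells
# max.request.size as max_request_size.
MAX_REQUEST_SIZE = 10_000_000

def fits_in_request(payload: bytes, limit: int = MAX_REQUEST_SIZE) -> bool:
    """Pre-check the serialized payload against the producer's size limit."""
    return len(payload) <= limit

def send_json_file(path: str, topic: str = "test_topic") -> None:
    # Requires `pip install kafka-python` and a reachable broker.
    from kafka import KafkaProducer  # lazy import so fits_in_request works offline
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        max_request_size=MAX_REQUEST_SIZE,  # producer-side limit from the thread
        compression_type="gzip",            # optional, shrinks the payload on the wire
    )
    with open(path) as f:
        payload = json.dumps(json.load(f)).encode("utf-8")
    if not fits_in_request(payload):
        raise ValueError(f"payload is {len(payload)} bytes, over {MAX_REQUEST_SIZE}")
    producer.send(topic, payload)
    producer.flush()
```

Note the broker still has to accept the record, so the KAFKA_MESSAGE_MAX_BYTES setting above remains necessary.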
I had the same problem and solved it by adding the gzip compression flag to the producer:
./kafka-console-producer.sh --compression-codec gzip
Dear developers, I've tried to push a 6 MB JSON file into a Kafka topic:
kafka-console-producer.sh --bootstrap-server localhost:9092 --topic test_topic < /test.json
But I see the error:

```
Error when sending message to topic test_topic with key: null, value: 6846219 bytes with error: (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
org.apache.kafka.common.errors.RecordTooLargeException: The message is 6846307 bytes when serialized which is larger than 1048576, which is the value of the max.request.size configuration.
```
I've also added in docker-compose.yml:
KAFKA_MESSAGE_MAX_BYTES: 2000000
Could you clarify how to configure the server in the right way? Thanks!
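For reference, a record this large has to clear limits on both sides, so raising only one setting is not enough: the producer's max.request.size (the limit named in the error) and the broker's message.max.bytes, plus replica.fetch.max.bytes if the topic is replicated. A sketch of the broker side in docker-compose.yml, assuming an image that maps KAFKA_* environment variables to broker properties (as the snippets in this thread do); the replica setting is my addition, not shown anywhere in the thread:

```yaml
environment:
  KAFKA_MESSAGE_MAX_BYTES: 10000000        # broker: message.max.bytes
  KAFKA_REPLICA_FETCH_MAX_BYTES: 10000000  # broker: replica.fetch.max.bytes
```

With KAFKA_MESSAGE_MAX_BYTES left at 2000000 as above, even a producer configured for 10 MB requests would still be rejected by the broker for a 6.8 MB record.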