Closed OopsOutOfMemory closed 5 years ago
@felipemmelo could you please have a look at this?
Hi there, sure thing.
It usually happens (to the best of my experience) in one of two situations.
How is your message being produced? Is there any chance your schema does not match the data?
Hi @felipemmelo , thanks for your advice.
I checked the schema and the payload and found that they match.
Message is produced by go client https://github.com/Shopify/sarama.
Consuming manually from the topic:
bin/kafka-avro-console-consumer --zookeeper localhost:2181 --topic VIP_E3AVAIcYWzLHS5YM_0000000000 --from-beginning
{
"uri": {
"string": "/js/vendor/modernizr-2.8.3.min.js"
},
"protocol": {
"string": "HTTP/1.1"
},
"code": {
"long": 200
},
"ip": {
"string": "10.128.2.1"
},
"timestamp": {
"string": "29/Jan/2018:20:29:23"
},
"method": {
"string": "GET"
},
"raw_text": {
"string": "\"10.128.2.1\",\"[29/Jan/2018:20:29:23\",\"GET /js/vendor/modernizr-2.8.3.min.js HTTP/1.1\",\"200\""
},
"_appId": "1380542074",
"_repo": "weblog",
"weblog_1380542074_timestamp": "2018-08-16T10:08:07.469125479+08:00"
}
Schema:
{
"type": "record",
"name": "VIP_E3AVAIcYWzLHS5YM_0000000000",
"fields": [{
"name": "uri",
"type": ["null", "string"]
}, {
"name": "protocol",
"type": ["null", "string"]
}, {
"name": "code",
"type": ["null", "long"]
}, {
"name": "ip",
"type": ["null", "string"]
}, {
"name": "timestamp",
"type": ["null", "string"]
}, {
"name": "method",
"type": ["null", "string"]
}, {
"name": "raw_text",
"type": ["null", "string"]
}, {
"name": "_appId",
"type": "string"
}, {
"name": "_repo",
"type": "string"
}, {
"name": "weblog_1380542074_timestamp",
"type": "string"
}]
}
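For what it's worth, the payload above is consistent with the schema: in Avro's JSON encoding, a value of a union type such as ["null", "string"] is wrapped in an object keyed by the branch name, which is why the console consumer prints "uri": {"string": ...} while plain string fields like "_appId" appear unwrapped. A minimal sketch of unwrapping those union branches (standard library only; the record is abbreviated from the payload above, and the helper is illustrative, not part of any library):

```python
import json

# Abbreviated form of the record printed by kafka-avro-console-consumer.
# Union-typed fields are wrapped as {"<branch>": value}; non-union fields are not.
record_json = """
{
  "uri": {"string": "/js/vendor/modernizr-2.8.3.min.js"},
  "code": {"long": 200},
  "_appId": "1380542074"
}
"""

def unwrap_unions(record):
    """Replace {"string": v} / {"long": v} union wrappers with the bare value."""
    branches = {"null", "boolean", "int", "long", "float", "double", "bytes", "string"}
    out = {}
    for name, value in record.items():
        if isinstance(value, dict) and len(value) == 1 and next(iter(value)) in branches:
            out[name] = next(iter(value.values()))
        else:
            out[name] = value
    return out

flat = unwrap_unions(json.loads(record_json))
print(flat)  # {'uri': '/js/vendor/modernizr-2.8.3.min.js', 'code': 200, '_appId': '1380542074'}
```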
Is there anything wrong with my usage?
Hi @OopsOutOfMemory , thanks for the reply.
The usage seems to be alright. It may be related to how Sarama is generating the Avro record. I'll look into that library and try to replicate the error. Would there be any open/public topic I could connect to in order to test it?
Also, could you let me know which naming strategy you're using for Schema Registry? Currently ABRiS only supports topic subject strategy.
Finally, is bin/kafka-avro-console-consumer also provided by Sarama?
Thanks in advance.
@felipemmelo
It usually happens (to the best of my experience) in one of two situations. ...
- The payload is being produced by a Confluent-compliant producer - which includes the id of the schema to the message - and consumed by a standard Avro parser.
In this case, where we are using a Confluent-compliant producer, can you confirm how to get around this issue?
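Some background on the situation quoted above: a Confluent-compliant producer prepends a 5-byte header to every message, a magic byte (0x00) followed by the 4-byte big-endian Schema Registry id, and only then the Avro-encoded body. A standard Avro parser that is unaware of this framing tries to read those header bytes as Avro data, which typically surfaces as an index-out-of-bounds style error. A sketch of the framing (the body bytes here are hypothetical; only the header handling is the point):

```python
import struct

MAGIC_BYTE = 0

def frame_confluent(schema_id, avro_body):
    """Prepend the Confluent wire-format header: 0x00 magic byte + 4-byte big-endian schema id."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + avro_body

def unframe_confluent(message):
    """Strip the 5-byte header, returning (schema_id, avro_body)."""
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != MAGIC_BYTE:
        raise ValueError("not a Confluent-framed message")
    return schema_id, message[5:]

# Hypothetical Avro-encoded body; a real one would match the registered schema.
body = b"\x02\x0chello"
msg = frame_confluent(42, body)
assert unframe_confluent(msg) == (42, body)
```

So the workaround is to consume with something that strips this header and resolves the id against Schema Registry, rather than feeding the raw bytes to a plain Avro reader.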
Hi ... did you get a resolution for this issue? I am facing the same issue in my code; my schema is being generated with the 'id' field attached, which is causing the out-of-bounds error.
Can you please let me know the resolution, if any?
Hi @NitinRamola ,
Which API are you using, fromAvro or fromConfluentAvro? If you have the id attached to the top of the payload, the latter should be used.
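A quick way to tell the two cases apart is to peek at the first byte of the raw Kafka value: Confluent-framed payloads always start with the 0x00 magic byte, while a bare Avro body usually will not. This is only a heuristic (a legitimate bare Avro record can also begin with 0x00), and the sample bytes below are hypothetical; this is not the ABRiS API, just a sanity check you can run on a captured message:

```python
def looks_confluent_framed(raw: bytes) -> bool:
    """Heuristic: Confluent wire format = 0x00 magic byte + 4-byte schema id + Avro body."""
    return len(raw) > 5 and raw[0] == 0x00

# A Confluent-framed message (schema id 7) vs. a bare Avro body.
framed = b"\x00\x00\x00\x00\x07" + b"\x02\x0chello"
bare = b"\x02\x0chello"
print(looks_confluent_framed(framed), looks_confluent_framed(bare))  # True False
```

If the check is True, fromConfluentAvro is the right entry point; if False, fromAvro should parse the message as-is.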
Hi Felipemmelo, thanks for your suggestion. Yes, I'm using fromAvro right now; let me check with fromConfluentAvro and I'll reply back to you.
Hi Felipemmelo, I can't use third-party packages currently, although your solution looks good. Is there any official API suggested to resolve this error?
Hi @NitinRamola , not that I know of; this is why we've developed this one. You can try Kafka Streams, which is also provided by Confluent, or, if you don't want to use Confluent-compliant payloads (i.e. the schema id on top of the payload), there is a built-in Avro module in Spark 2.4.
Hi ... did you get a resolution for this issue? I am facing the same issue in my code; my schema is being generated with the 'id' field attached, which is causing the out-of-bounds error.
Can you please let me know the resolution, if any?
Sorry, genuinely can't remember.
Hey, guys.
I've encountered a problem like the one below; could anyone help me, please?
Code:
Avro Schema:
Exception: