confluentinc / confluent-kafka-dotnet

Confluent's Apache Kafka .NET client
https://github.com/confluentinc/confluent-kafka-dotnet/wiki
Apache License 2.0

Avro.AvroException: Unable to find type 'IDictionary<string,System.String>' in all loaded assemblies #2019

Open marcels2204 opened 1 year ago

marcels2204 commented 1 year ago

Description

Using Confluent.SchemaRegistry.Serdes.Avro v2.0.2 with Confluent Cloud, .NET 7 and C# 11.

I have the following schema. Producing works fine, but deserializing fails with: `Avro.AvroException: Unable to find type 'IDictionary<string,System.String>' in all loaded assemblies`

Please advise.

```json
{
  "fields": [
    {
      "default": null,
      "name": "Rows",
      "type": [
        "null",
        {
          "items": { "type": "map", "values": "string" },
          "type": "array"
        }
      ]
    },
    {
      "default": null,
      "name": "TableName",
      "type": [ "null", "string" ]
    },
    {
      "default": null,
      "name": "sku",
      "type": [ "null", "int" ]
    }
  ],
  "name": "CPQModelTables_value",
  "namespace": "nord.pim",
  "type": "record"
}
```
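For reference, the `Rows` field logically holds a list of string-to-string maps. A minimal sketch of that shape (Python, stdlib only; the sample values are hypothetical, not from the original report) shows the data itself is unremarkable, which is consistent with the failure being in the .NET specific reader's runtime type lookup rather than in the payload:

```python
import json

# Hypothetical sample record matching the schema: "Rows" is an
# array of maps (string -> string) inside a nullable union.
record = {
    "Rows": [
        {"Color": "Red", "Size": "M"},
        {"Color": "Blue", "Size": "L"},
    ],
    "TableName": "CPQModelTables",
    "sku": 42,
}

# A plain round-trip of the same shape succeeds; the exception above
# is raised only when the C# SpecificReader tries to resolve the
# runtime type name 'IDictionary<string,System.String>'.
round_tripped = json.loads(json.dumps(record))
assert round_tripped == record
```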


marcels2204 commented 1 year ago

```
[08:16:44 WRN] Consuming message failed: Confluent.Kafka.ConsumeException: Local: Value deserialization error
 ---> Avro.AvroException: Unable to find type 'IDictionary<string,System.String>' in all loaded assemblies in field Rows
 ---> Avro.AvroException: Unable to find type 'IDictionary<string,System.String>' in all loaded assemblies
   at Avro.Specific.ObjectCreator.<>c__DisplayClass14_0.<FindType>b__0(String _)
   at System.Collections.Concurrent.ConcurrentDictionary`2.GetOrAdd(TKey key, Func`2 valueFactory)
   at Avro.Specific.ObjectCreator.FindType(String name)
   at Avro.Specific.ObjectCreator.GetType(String name, Type schemaType)
   at Avro.Specific.ObjectCreator.New(String name, Type schemaType)
   at Avro.Specific.SpecificDefaultReader.ReadArray(Object reuse, ArraySchema writerSchema, Schema readerSchema, Decoder dec)
   at Avro.Generic.DefaultReader.Read(Object reuse, Schema writerSchema, Schema readerSchema, Decoder d)
   at Avro.Generic.DefaultReader.ReadUnion(Object reuse, UnionSchema writerSchema, Schema readerSchema, Decoder d)
   at Avro.Generic.DefaultReader.Read(Object reuse, Schema writerSchema, Schema readerSchema, Decoder d)
   at Avro.Specific.SpecificDefaultReader.ReadRecord(Object reuse, RecordSchema writerSchema, Schema readerSchema, Decoder dec)
   --- End of inner exception stack trace ---
   at Avro.Specific.SpecificDefaultReader.ReadRecord(Object reuse, RecordSchema writerSchema, Schema readerSchema, Decoder dec)
   at Avro.Generic.DefaultReader.Read(Object reuse, Schema writerSchema, Schema readerSchema, Decoder d)
   at Avro.Generic.DefaultReader.Read[T](T reuse, Decoder decoder)
   at Avro.Specific.SpecificReader`1.Read(T reuse, Decoder dec)
   at Confluent.SchemaRegistry.Serdes.SpecificDeserializerImpl`1.Deserialize(String topic, Byte[] array)
   at Confluent.SchemaRegistry.Serdes.AvroDeserializer`1.DeserializeAsync(ReadOnlyMemory`1 data, Boolean isNull, SerializationContext context)
   at Confluent.Kafka.SyncOverAsync.SyncOverAsyncDeserializer`1.Deserialize(ReadOnlySpan`1 data, Boolean isNull, SerializationContext context)
   at Confluent.Kafka.Consumer`2.Consume(Int32 millisecondsTimeout)
   --- End of inner exception stack trace ---
   at Confluent.Kafka.Consumer`2.Consume(Int32 millisecondsTimeout)
   at Confluent.Kafka.Consumer`2.Consume(CancellationToken cancellationToken)
   at Nord.Shared.Kafka.BaseConsumer`2.<>c__DisplayClass9_0.<<ExecuteAsync>b__0>d.MoveNext() in C:\_Dev\be\nord.shared.kafka\Nord.Shared.Kafka\BaseConsumer.cs:line 107
   --- End of stack trace from previous location ---
   at Polly.AsyncPolicy.<>c__DisplayClass40_0.<<ImplementationAsync>b__0>d.MoveNext()
   --- End of stack trace from previous location ---
   at Polly.Retry.AsyncRetryEngine.ImplementationAsync[TResult](Func`3 action, Context context, CancellationToken cancellationToken, ExceptionPredicates shouldRetryExceptionPredicates, ResultPredicates`1 shouldRetryResultPredicates, Func`5 onRetryAsync, Int32 permittedRetryCount, IEnumerable`1 sleepDurationsEnumerable, Func`4 sleepDurationProvider, Boolean continueOnCapturedContext)
   at Polly.AsyncPolicy.ExecuteAsync(Func`3 action, Context context, CancellationToken cancellationToken, Boolean continueOnCapturedContext)
   at Nord.Shared.Kafka.BaseConsumer`2.ExecuteAsync(CancellationToken stoppingToken) in C:\_Dev\be\nord.shared.kafka\Nord.Shared.Kafka\BaseConsumer.cs:line 78
```

Ortub commented 10 months ago

Having the same issue: an array with a map inside is not working as expected.

schema:

```json
{
  "name": "ListOfMap",
  "type": {
    "type": "array",
    "items": {
      "type": "map",
      "values": "string"
    }
  }
}
```

Exception: `Avro.AvroException: Unable to find type 'IDictionary<string,System.String>' in all loaded assemblies in field ListOfMap`

Rades98 commented 3 months ago

I think the problem is in `Avro.Specific.ObjectCreator`, which has methods like `TryGetIListItemTypeName` and `TryGetNullableItemTypeName` but nothing like a `TryGetIDictionaryItemTypeName`. So this may not be a problem with this package after all, but with the base Apache Avro library.

For now we worked around it like this:

```json
{
  "name": "DictWithPseudoDict",
  "type": {
    "type": "map",
    "values": {
      "type": "array",
      "items": {
        "type": "record",
        "name": "KeyValuePair",
        "namespace": "MyCompany.SharedTypes",
        "fields": [
          { "name": "Key", "type": "string" },
          { "name": "Value", "type": "string" }
        ]
      }
    }
  }
}
```

We know this isn't quite what everyone needed, but it's the best working solution we've found so far.
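Since the workaround replaces each map with a list of `Key`/`Value` records, consumer and producer code has to convert between the two shapes. A language-agnostic sketch of that mapping (Python, stdlib only; the helper names are hypothetical, not part of any library):

```python
# Convert between a plain map and the list-of-KeyValuePair-records
# shape used by the workaround schema above.

def dict_to_kv_records(d):
    """{'a': '1'} -> [{'Key': 'a', 'Value': '1'}]"""
    return [{"Key": k, "Value": v} for k, v in d.items()]

def kv_records_to_dict(records):
    """[{'Key': 'a', 'Value': '1'}] -> {'a': '1'}"""
    return {r["Key"]: r["Value"] for r in records}

original = {"Color": "Red", "Size": "M"}
encoded = dict_to_kv_records(original)
assert kv_records_to_dict(encoded) == original
```

The round-trip is lossless as long as keys are unique, which an Avro map already guarantees.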