Closed mhowlett closed 6 years ago
Some outstanding issues:
The avrogen.exe tool build fails on macOS and Linux with the following error (this is only a build problem; the built executable works fine on all platforms):
/usr/local/share/dotnet/sdk/2.1.101/Microsoft.Common.CurrentVersion.targets(2991,5): error MSB3552: Resource file "**/*.resx" cannot be found. [/git/avro/lang/csharp/src/apache/codegen/Avro.codegen.csproj]
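MSB3552 is typically raised when an `EmbeddedResource` glob in the project file matches no files on the build machine. One possible workaround (an assumption, not verified against Avro.codegen.csproj) is to disable the SDK's default resource globbing:

```xml
<!-- Hypothetical workaround, not verified against this repo:
     stop the SDK from globbing **/*.resx, which MSBuild on
     macOS/Linux reports as missing (MSB3552). -->
<PropertyGroup>
  <EnableDefaultEmbeddedResourceItems>false</EnableDefaultEmbeddedResourceItems>
</PropertyGroup>
```

If the project does embed resources, they would then need to be listed with explicit `<EmbeddedResource Include="..." />` items instead.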
Some of the Confluent.Kafka.Avro.IntegrationTests tests are failing with errors like the one below (a type loading error). However, the AvroSpecific and AvroGeneric examples work as expected (and exercise the same functionality).
Failed Confluent.Kafka.Avro.IntegrationTests.Tests.ProduceConsume(bootstrapServers: "10.200.7.144:9092", schemaRegistryServers: "10.200.7.144:8081")
Error Message:
System.Reflection.ReflectionTypeLoadException: Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.
at System.Reflection.RuntimeModule.GetTypes(RuntimeModule module)
at System.Reflection.Assembly.GetTypes()
at Avro.Specific.ObjectCreator.FindType(String name, Boolean throwError)
at Avro.Specific.ObjectCreator.GetType(String name, Type schemaType)
at Avro.Specific.ObjectCreator.New(String name, Type schemaType)
at Avro.Specific.SpecificDefaultReader.ReadRecord(Object reuse, RecordSchema writerSchema, Schema readerSchema, Decoder dec)
at Avro.Generic.DefaultReader.Read(Object reuse, Schema writerSchema, Schema readerSchema, Decoder d)
at Avro.Generic.DefaultReader.Read[T](T reuse, Decoder decoder)
at Avro.Specific.SpecificReader`1.Read(T reuse, Decoder dec)
at Confluent.Kafka.Serialization.SpecificDeserializerImpl`1.Deserialize(String topic, Byte[] array) in /git/confluent-kafka-dotnet/src/Confluent.Kafka.Avro/SpecificDeserializerImpl.cs:line 128
at Confluent.Kafka.Serialization.AvroDeserializer`1.Deserialize(String topic, Byte[] data) in /git/confluent-kafka-dotnet/src/Confluent.Kafka.Avro/AvroDeserializer.cs:line 106
at Confluent.Kafka.Serialization.MessageExtensions.Deserialize[TKey,TValue](Message message, IDeserializer`1 keyDeserializer, IDeserializer`1 valueDeserializer) in /git/confluent-kafka-dotnet/src/Confluent.Kafka/Serialization/Extensions/Message.cs:line 58
Expected: True
Actual: False
Stack Trace:
at Confluent.Kafka.Avro.IntegrationTests.Tests.<>c.<ProduceConsume>b__3_2(Object o, Message e) in /git/confluent-kafka-dotnet/test/Confluent.Kafka.Avro.IntegrationTests/Tests/ProduceConsume.cs:line 88
at Confluent.Kafka.Consumer`2.Consume(Message`2& message, Int32 millisecondsTimeout) in /git/confluent-kafka-dotnet/src/Confluent.Kafka/Consumer.cs:line 167
at Confluent.Kafka.Consumer`2.Consume(Message`2& message, TimeSpan timeout) in /git/confluent-kafka-dotnet/src/Confluent.Kafka/Consumer.cs:line 179
at Confluent.Kafka.Consumer`2.Poll(TimeSpan timeout) in /git/confluent-kafka-dotnet/src/Confluent.Kafka/Consumer.cs:line 219
at Confluent.Kafka.Avro.IntegrationTests.Tests.ProduceConsume(String bootstrapServers, String schemaRegistryServers) in /git/confluent-kafka-dotnet/test/Confluent.Kafka.Avro.IntegrationTests/Tests/ProduceConsume.cs:line 98
[xUnit.net 00:00:48.0957330] Confluent.Kafka.Avro.IntegrationTests.Tests.SharedSchemaRegistryClient(bootstrapServers: "10.200.7.144:9092", schemaRegistryServers: "10.200.7.144:8081") [FAIL]
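The exception message itself points at the next diagnostic step: `ReflectionTypeLoadException.LoaderExceptions` carries the underlying reason each type failed to load. A minimal sketch of how this could be surfaced around the failing `Assembly.GetTypes()` call (this is an illustrative snippet, not code from the library):

```csharp
using System;
using System.Reflection;

// Sketch: enumerate types in an assembly, and when GetTypes() fails
// (as it does inside Avro.Specific.ObjectCreator.FindType), dump the
// per-type loader errors instead of the generic message.
try
{
    foreach (var type in Assembly.GetExecutingAssembly().GetTypes())
    {
        Console.WriteLine(type.FullName);
    }
}
catch (ReflectionTypeLoadException ex)
{
    // Each entry explains why one type could not be loaded,
    // e.g. a missing dependency assembly or a version mismatch.
    foreach (var loaderEx in ex.LoaderExceptions)
    {
        Console.WriteLine(loaderEx?.Message);
    }
}
```

Running something like this in the failing test environment should reveal whether the problem is a missing assembly, a binding/version conflict, or something else.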
System.CodeDom is what we needed to be able to target netstandard2.0 (for .NET Core compatibility). It's been available for 7 months, but I only just realized it's there.

The tests use NUnit 2.6.4, not the latest, since NUnit 3.x is a complete rewrite and converting all the tests over would be quite an effort. For future reference, to run the tests I used this: https://github.com/nunit/nunit-console/releases/tag/3.8
Unfortunately, there are 11 test failures, so we shouldn't merge until we get to the bottom of them. This is unexpected: no code changes were required to get the library compiling, so it should have just worked.