This looks great @lazyval, huge thanks. Have you tested this output in a realistic example? The tests in avrohugger serve mostly as regression tests. The integration tests live in sbt-avrohugger (a change in sbt compelled me to move them there; now they might be able to be moved back, but that's another story), where you can run `scripted`, or e.g. `scripted avrohugger/SpecificSerializationTests`, to test under real serialization. You'll have to bump the avrohugger version to the next SNAPSHOT and `publishLocal`, and update the sbt-avrohugger dependencies to the SNAPSHOT; after that `scripted` will look for the latest and see the snapshot with your changes.
Note: I just released a bugfix, so the current version is now 1.0.0-RC15, meaning the next snapshot should now be set as 1.0.0-RC16-SNAPSHOT.
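Concretely, the loop might look something like this (a sketch; the dependency coordinates below are my assumption, so check sbt-avrohugger's build for the exact module names):

```scala
// In avrohugger's build: bump to the next snapshot, then run `publishLocal`.
version := "1.0.0-RC16-SNAPSHOT"

// In sbt-avrohugger's build: point the dependency at that snapshot,
// then run `scripted` (or `scripted avrohugger/SpecificSerializationTests`).
libraryDependencies += "com.julianpeeters" %% "avrohugger-core" % "1.0.0-RC16-SNAPSHOT"
```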
@julianpeeters cool!

> Have you tested this output in a realistic example?

Not quite, but I finally completed a `scripted` run for sbt-avrohugger and it succeeded! Thanks for the instructions on how to set it up.

```
[success] Total time: 2002 s, completed Dec 10, 2018 12:05:01 AM
```
Fantastic. Hehe, yeah, those tests take forever. I bet there's a way to speed them up, probably with scalacheck, but that's yet another story :)
On the `scripted` run, was that with the same schema that you used in the avrohugger test? There's currently no schema with an optional big decimal (else it woulda been failing this whole time, of course). Hmm, actually I found this one: https://github.com/julianpeeters/sbt-avrohugger/blob/master/src/sbt-test/avrohugger/SpecificSerializationTests/src/main/avro/logical.avdl#L7

Looks like we used the same schema as for the `standard` tests, but punted on implementing nested big decimals for `specific`. We could uncomment that for a minimal but sufficient test.
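For reference, the shape under discussion (an "optional big decimal") is a union of null and a bytes schema carrying the decimal logical type; a minimal sketch of it with the Avro Java API (the precision and scale here are arbitrary, not taken from the avdl):

```scala
import org.apache.avro.{LogicalTypes, Schema}

// bytes schema annotated with the decimal logical type (precision 9, scale 2)
val decimalSchema: Schema =
  LogicalTypes.decimal(9, 2).addToSchema(Schema.create(Schema.Type.BYTES))

// the nullable union that Option[BigDecimal] maps to
val optionalDecimal: Schema =
  Schema.createUnion(Schema.create(Schema.Type.NULL), decimalSchema)
```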
I expect to see an error due to the `get` and `put` methods calling methods on the wrong schema type, from e.g. this line https://github.com/julianpeeters/avrohugger/pull/108/files#diff-edbc9f6bf9a8d8ffbc5c25f3c20777dbR13 where `schema` ends up still being the schema of the union, and not the schema of the bytes logical decimal. I suspect we'll have to pass the rhs of that line (`getSchema.getFields().get(field$).schema()`) as a `schemaAccessor` or something, so we can build up the accessor call as we descend and build up the type trees, e.g. `getSchema.getFields().get(field$).schema().getTypes ...`.
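To make that concrete, here's a rough sketch of the descent being described, against the Avro Java API (the helper name is hypothetical, not anything in avrohugger):

```scala
import org.apache.avro.Schema
import scala.collection.JavaConverters._

// Hypothetical helper: given a field's schema, which for an optional decimal
// is the union itself, find the member that carries the decimal logical type.
def decimalMemberSchema(fieldSchema: Schema): Option[Schema] = {
  val candidates =
    if (fieldSchema.getType == Schema.Type.UNION) fieldSchema.getTypes.asScala.toList
    else List(fieldSchema)
  candidates.find(s => Option(s.getLogicalType).exists(_.getName == "decimal"))
}

// i.e. in a generated method, rather than using the field schema directly:
// decimalMemberSchema(getSchema.getFields().get(field$).schema())
```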
I started in on that for the `get`, so feel free to ask any questions if you'd prefer to implement that yourself; or, if you'd prefer to tag-team, I can push my stuff to a branch that we can collaborate on. No preference here.
Haven't looked at the `get` and `put` part and that test just yet, but I will give it a try.
So the test indeed fails, though for a slightly different reason:

```
[info] [error] /private/var/folders/g2/r4vzb8rj1_34c04skj9_y9dm0000gn/T/sbt_bc0669c9/SpecificSerializationTests/src/test/scala/specific/SpecificPrimitivesSpec.scala:82:44: type mismatch;
[info] [error]  found   : java.time.Instant
[info] [error]  required: Option[BigDecimal]
[info] [error]   val record1 = LogicalIdl(bigDecimal, topMillisInstant, LocalDate.now(clock))
[info] [error]                                        ^
[info] [error] /private/var/folders/g2/r4vzb8rj1_34c04skj9_y9dm0000gn/T/sbt_bc0669c9/SpecificSerializationTests/src/test/scala/specific/SpecificPrimitivesSpec.scala:82:75: type mismatch;
[info] [error]  found   : java.time.LocalDate
[info] [error]  required: java.time.Instant
[info] [error]   val record1 = LogicalIdl(bigDecimal, topMillisInstant, LocalDate.now(clock))
[info] [error]                                                                   ^
[info] [error] /private/var/folders/g2/r4vzb8rj1_34c04skj9_y9dm0000gn/T/sbt_bc0669c9/SpecificSerializationTests/src/test/scala/specific/SpecificPrimitivesSpec.scala:83:44: type mismatch;
[info] [error]  found   : java.time.Instant
[info] [error]  required: Option[BigDecimal]
[info] [error]   val record2 = LogicalIdl(bigDecimal, topMillisInstant, LocalDate.now(clock))
[info] [error]                                        ^
[info] [error] /private/var/folders/g2/r4vzb8rj1_34c04skj9_y9dm0000gn/T/sbt_bc0669c9/SpecificSerializationTests/src/test/scala/specific/SpecificPrimitivesSpec.scala:83:75: type mismatch;
[info] [error]  found   : java.time.LocalDate
[info] [error]  required: java.time.Instant
[info] [error]   val record2 = LogicalIdl(bigDecimal, topMillisInstant, LocalDate.now(clock))
[info] [error]                                                                   ^
[info] [error] four errors found
[info] [error] (Test / compileIncremental) Compilation failed
```
I guess I should also fix the test to match the new schema with the uncommented field.
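Judging from the error positions, the newly uncommented field lands in the second constructor slot, so the fix to the spec would presumably look something like this (field order inferred from the errors above; the value is illustrative):

```scala
val record1 = LogicalIdl(bigDecimal, Some(bigDecimal), topMillisInstant, LocalDate.now(clock))
```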
Ah I see, thanks. Yeah, fixing that spec would be a great check, I think.
I'm also running into the problem with decimals in unions and would die for this fix ;) Anything I can do to help?
This is expected to now be fixed in sbt-avrohugger 2.0.0-RC16. Thanks again to @lazyval for getting this started.
When the generator decides whether there are decimal fields in the schema, it traverses all top-level fields but does not descend into UNION types. So if there is a decimal inside a union, it's ignored.
Fixes #107
Maybe there is a more generic way to approach this problem when there are deeply nested UNIONs, but I couldn't come up with a recursive traversal at first glance.
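A minimal sketch of the check described above, against the Avro Java API (the helper names are hypothetical; like the fix itself, it looks one level into unions rather than recursing):

```scala
import org.apache.avro.Schema
import scala.collection.JavaConverters._

// True if a schema carries the "decimal" logical type.
def isDecimal(s: Schema): Boolean =
  Option(s.getLogicalType).exists(_.getName == "decimal")

// Traverse the top-level fields, also looking inside union members.
def hasDecimalField(record: Schema): Boolean =
  record.getFields.asScala.exists { field =>
    val s = field.schema()
    isDecimal(s) ||
      (s.getType == Schema.Type.UNION && s.getTypes.asScala.exists(isDecimal))
  }
```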