What happened?

The Python SDK's avro_type_to_beam_type function maps all Union types to:

```
type {
  nullable: true
  logical_type {
    urn: "beam:logical:pythonsdk_any:v1"
  }
}
```
which results in this exception:

```
java.lang.IllegalArgumentException: Unexpected type_info: TYPEINFO_NOT_SET
	at org.apache.beam.sdk.schemas.SchemaTranslation.fieldTypeFromProtoWithoutNullable(SchemaTranslation.java:479)
...
```
when that schema is loaded into Beam by a SqlTransform. The code should be smart enough to properly encode nullable atomic Avro types such as:

```
'type': ['null', 'string']
```

into the corresponding Beam type and back:
```
type {
  nullable: true
  atomic_type: STRING
}
```
If no such nullable type conversion is possible, we can fall back to the Any type until a proper union coder is added to the Beam Python SDK.
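The proposed mapping can be sketched in plain Python. This is only an illustration of the idea, not the SDK's actual avro_type_to_beam_type implementation; the function and dict names here are hypothetical, and a real fix would emit Beam's schema protos rather than plain dicts:

```python
# Hypothetical sketch of the fix proposed above; names are illustrative,
# not the Beam Python SDK's actual avro_type_to_beam_type API.
AVRO_TO_BEAM_ATOMIC = {
    "boolean": "BOOLEAN",
    "int": "INT32",
    "long": "INT64",
    "float": "FLOAT",
    "double": "DOUBLE",
    "bytes": "BYTES",
    "string": "STRING",
}

def union_to_beam_type(union):
    """Collapse an Avro union into a Beam field type description.

    A union of 'null' plus exactly one atomic type becomes the matching
    nullable atomic type; anything else falls back to the Any logical
    type, mirroring the current catch-all behavior.
    """
    non_null = [t for t in union if t != "null"]
    nullable = len(non_null) < len(union)
    if len(non_null) == 1 and non_null[0] in AVRO_TO_BEAM_ATOMIC:
        return {"atomic_type": AVRO_TO_BEAM_ATOMIC[non_null[0]],
                "nullable": nullable}
    # No single atomic branch: keep defaulting to Any until a proper
    # union coder exists.
    return {"logical_type": "beam:logical:pythonsdk_any:v1",
            "nullable": True}

print(union_to_beam_type(["null", "string"]))
# -> {'atomic_type': 'STRING', 'nullable': True}
```

With this shape, `['null', 'string']` round-trips as a nullable STRING, while a multi-branch union like `['string', 'long']` still gets the Any fallback.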
Issue Priority
Priority: 2 (default / most bugs should be filed as P2)
Issue Components