AbsaOSS / ABRiS

Avro SerDe for Apache Spark structured APIs.

Question: Is it possible to change the schema to handle precision error? #207

Closed: moyphilip closed this issue 3 years ago

moyphilip commented 3 years ago

I have a field with the following schema:

    {
        "name": "amount",
        "type": [
            "null",
            {
                "type": "bytes",
                "scale": 2,
                "precision": 64,
                "connect.version": 1,
                "connect.parameters": {
                    "scale": "2"
                },
                "connect.name": "org.apache.kafka.connect.data.Decimal",
                "logicalType": "decimal"
            }
        ],
        "default": null
    }

When I try to deserialize the field I get the following error:

Caused by: org.apache.spark.sql.AnalysisException: decimal can only support precision up to 38;

Is there a workaround for this? I am unable to convert this value to a string.
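
For reference, the 38-digit ceiling is enforced by Spark's DecimalType itself, independent of ABRiS. A minimal spark-shell check (no ABRiS involved) reproduces the same error:

    // Spark rejects any decimal precision above DecimalType.MAX_PRECISION (38),
    // which is why a reader schema declaring precision 64 fails during analysis.
    import org.apache.spark.sql.types.DecimalType

    DecimalType(64, 2)
    // org.apache.spark.sql.AnalysisException: decimal can only support precision up to 38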

kevinwallimann commented 3 years ago

Hi @moyphilip. This issue is not related to ABRiS, but to Spark itself. I suggest checking the Spark documentation or Stack Overflow for workarounds. There seems to be an open Spark issue tracking this: https://issues.apache.org/jira/browse/SPARK-28318
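
If the value must be read anyway, one possible workaround (a sketch of a general technique, not a documented ABRiS feature) is to supply a reader schema in which the `logicalType` is removed from the `amount` field, so the column reaches Spark as plain binary, and then decode it manually. Avro's decimal logical type stores the unscaled value as big-endian two's-complement bytes, so it can be rebuilt with `java.math.BigDecimal`. The DataFrame name `decoded` and the final string representation below are illustrative assumptions:

    import java.math.{BigDecimal => JBigDecimal, BigInteger}
    import org.apache.spark.sql.functions.{col, udf}

    // "scale": 2 is taken from the Avro schema in the question.
    val scale = 2

    // Avro decimals encode the unscaled value as big-endian
    // two's-complement bytes; rebuilding the BigDecimal and rendering
    // it as a string sidesteps Spark's 38-digit DecimalType limit.
    val decodeDecimal = udf { bytes: Array[Byte] =>
      if (bytes == null) null
      else new JBigDecimal(new BigInteger(bytes), scale).toPlainString
    }

    // `decoded` (hypothetical) is the DataFrame produced by from_avro
    // with the modified reader schema; `amount` arrives as raw bytes.
    val result = decoded.withColumn("amount_str", decodeDecimal(col("amount")))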