Open andygrove opened 2 weeks ago
We need to implement CAST from timestamp to numeric to match Spark's behavior. We can break this into multiple issues if needed.

The following example casts the timestamp column `c` to different numeric types:

```
scala> spark.sql("select c, cast(c as boolean), cast(c as byte), cast(c as short), cast(c as int), cast(c as long), cast(c as decimal) from t3").show
+-------------------+----+---+-----+----------+----------+----------+
|                  c|   c|  c|    c|         c|         c|         c|
+-------------------+----+---+-----+----------+----------+----------+
|2024-01-01 00:00:00|true|-16|25328|1704092400|1704092400|1704092400|
|2023-12-12 00:00:00|true|-16| 1264|1702364400|1702364400|1702364400|
+-------------------+----+---+-----+----------+----------+----------+
```
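The values in the table are consistent with Spark first converting its internal microsecond timestamp to whole seconds and then applying ordinary JVM narrowing casts, which is where the `-16` and `25328` values come from. A minimal sketch of that semantics (plain Java, not Comet code; the microsecond value is an assumed encoding of the first row in the table above):

```java
public class TimestampCastSketch {
    public static void main(String[] args) {
        // Spark stores timestamps as microseconds since the Unix epoch.
        // Assumed value for 2024-01-01 00:00:00 in the session time zone.
        long micros = 1_704_092_400_000_000L;

        // Cast to long: floor-divide microseconds down to whole seconds.
        long seconds = Math.floorDiv(micros, 1_000_000L);
        System.out.println(seconds);         // 1704092400

        // Narrower integral types truncate like ordinary JVM casts,
        // keeping only the low-order bits.
        System.out.println((int) seconds);   // 1704092400 (fits in int)
        System.out.println((short) seconds); // 25328
        System.out.println((byte) seconds);  // -16

        // Cast to boolean: true for any non-zero value.
        System.out.println(seconds != 0);    // true
    }
}
```

Reproducing the truncated values (`25328`, `-16`) this way suggests the Comet implementation can convert to epoch seconds once and reuse the existing integral-narrowing paths for the smaller types.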
Also, Spark returns the timestamp in seconds, while Comet currently returns it in microseconds:

```
// input: 2023-12-31 10:00:00.0, expected: 1.7040456E9, actual: 1.7040456E15
```
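The expected and actual values above differ by exactly a factor of 10^6, which is consistent with the microsecond value being handed back without the division down to seconds. A hedged sketch of the two behaviors (plain Java; the microsecond value is assumed from the expected output, i.e. the input interpreted in the session time zone):

```java
public class TimestampToDoubleSketch {
    public static void main(String[] args) {
        // Assumed microsecond encoding of 2023-12-31 10:00:00.0,
        // derived from the expected value 1.7040456E9 seconds.
        long micros = 1_704_045_600_000_000L;

        // Spark's behavior: scale microseconds down to (fractional) seconds.
        double expected = micros / 1_000_000.0;
        System.out.println(expected); // 1.7040456E9

        // The reported Comet behavior: raw microseconds as a double.
        double actual = (double) micros;
        System.out.println(actual);   // 1.7040456E15
    }
}
```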
What is the problem the feature request solves?
We need to implement CAST from timestamp to numeric to match Spark's behavior; see the example above. We can break this into multiple issues if needed.

Describe the potential solution

No response
Additional context
No response