Closed v0y4g3r closed 1 year ago
CSV has a similar issue:
table struct:
MySQL [(none)]> desc table host_memory_used_bytes;
+-----------+----------------------+------+---------+---------------+
| Field | Type | Null | Default | Semantic Type |
+-----------+----------------------+------+---------+---------------+
| ts | TimestampMillisecond | NO | | TIME INDEX |
| collector | String | YES | | PRIMARY KEY |
| host | String | YES | | PRIMARY KEY |
| val | Float64 | YES | | FIELD |
+-----------+----------------------+------+---------+---------------+
4 rows in set (0.001 sec)
CSV head:
ts,collector, host, val
1692270325901,"memory","f52aa51213c3",2015292000.0
1692270329712,"memory","25a3ccdaf3c6",4079696000.0
1692270332056,"memory","4147c672f2d4",767628000.0
1692270340901,"memory","f52aa51213c3",2018216000.0
1692270344712,"memory","25a3ccdaf3c6",4076592000.0
Error:
2023-08-17T12:49:09.487401Z ERROR frontend::instance: Failed to execute query: copy host_memory_used_bytes from 'data.csv' with (format = 'csv') trace_id=7097919323603341306 err.msg=Failed to read record batch, source: Arrow error: Parser error: Error parsing column 0 at line 2: Parser error: Error parsing timestamp from '1692270329712': error parsing date err.code=Unexpected err.source=Arrow error: Parser error: Error parsing column 0 at line 2: Parser error: Error parsing timestamp from '1692270329712': error parsing date err.source.sources=[Parser error: Error parsing column 0 at line 2: Parser error: Error parsing timestamp from '1692270329712': error parsing date]
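The `ts` values in the CSV are Unix epoch milliseconds, but the parser tries to read them as date strings and fails. A quick stdlib check (illustrative, not part of the original report) confirms the first value decodes to the expected 2023-08-17 date, matching the log timestamp:

```python
from datetime import datetime, timezone

# First ts value from the CSV head, in Unix epoch milliseconds.
ts_ms = 1692270325901
dt = datetime.fromtimestamp(ts_ms / 1000, tz=timezone.utc)
print(dt.isoformat())  # 2023-08-17T11:05:25.901000+00:00
```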
The arrow-csv reader doesn't support casting Unix timestamps (integers) to a timestamp type.
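Until integer-to-timestamp casting is supported, one possible workaround (a sketch, not from the report) is to rewrite the `ts` column as an RFC 3339 string before running COPY FROM, a form CSV timestamp parsers generally do accept:

```python
import csv
import io
from datetime import datetime, timezone

def epoch_ms_to_rfc3339(row):
    """Replace the epoch-millisecond ts in column 0 with an RFC 3339 string."""
    ms = int(row[0])
    dt = datetime.fromtimestamp(ms / 1000, tz=timezone.utc)
    row[0] = dt.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z"  # trim to millis
    return row

# One data row from the CSV head in the comment above.
src = io.StringIO('ts,collector,host,val\n1692270325901,memory,f52aa51213c3,2015292000.0\n')
out = io.StringIO()
reader = csv.reader(src)
writer = csv.writer(out)
writer.writerow(next(reader))  # copy the header unchanged
for row in reader:
    writer.writerow(epoch_ms_to_rfc3339(row))
print(out.getvalue())
```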
What type of bug is this?
Unexpected error
What subsystems are affected?
Datanode
What happened?
When importing a parquet file into an existing table, if the parquet file's timestamp column is a bigint while the table's timestamp column is a timestamp (of any time unit), the COPY FROM statement will fail.
What operating system did you use?
NA
Relevant log output and stack trace
How can we reproduce the bug?
1. Create a cpu table with a timestamp column.
2. Create a sample file with a bigint timestamp column.
3. Import the file into the table with COPY FROM.
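For the CSV variant reported in the comment above, a sample file can be regenerated with a short stdlib script (a sketch; the filename data.csv matches the one in the error log):

```python
import csv

# Rows from the CSV head in the comment: ts is Unix epoch milliseconds.
rows = [
    (1692270325901, "memory", "f52aa51213c3", 2015292000.0),
    (1692270329712, "memory", "25a3ccdaf3c6", 4079696000.0),
    (1692270332056, "memory", "4147c672f2d4", 767628000.0),
]

with open("data.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["ts", "collector", "host", "val"])
    writer.writerows(rows)

# Running: COPY host_memory_used_bytes FROM 'data.csv' WITH (format = 'csv')
# then reproduces the "Error parsing timestamp" failure, because the integer
# ts column cannot be cast to the table's TimestampMillisecond time index.
```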