Budibase / budibase

Low code platform for building business apps and workflows in minutes. Supports PostgreSQL, MySQL, MariaDB, MSSQL, MongoDB, Rest API, Docker, K8s, and more 🚀
https://budibase.com
22.86k stars · 1.59k forks

Time data type returns "Invalid time value" on Postgres #9050

Closed: windowsdeveloperwannabe closed this issue 1 year ago

windowsdeveloperwannabe commented 1 year ago

**Hosting**

**Describe the bug**
Viewing a table that has a time data type column, with at least one row populated in that column, returns the error "Invalid time value".

**To Reproduce**

```sql
CREATE TABLE time_test_table ( id INT PRIMARY KEY, daily_at TIME );

INSERT INTO time_test_table (id, daily_at) VALUES (1, '13:55');

CREATE TABLE timetz_test_table ( id INT PRIMARY KEY, daily_at TIMETZ );

INSERT INTO timetz_test_table (id, daily_at) VALUES (1, '13:55');
```

- In Budibase, add a data provider for Postgres
- Try to view the table
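The error in the steps above is consistent with a time-only value reaching the client: Postgres returns a `TIME` column as a bare `"HH:MM:SS"` string, which is not a full timestamp. A minimal sketch of handling such values safely (`parseTimeOnly` is an invented helper for illustration, not Budibase code):

```javascript
// A Postgres TIME column arrives in JavaScript as a plain "HH:MM:SS" string,
// not an ISO timestamp, so code that feeds it straight into a date-formatting
// path can end up with an invalid Date and throw "Invalid time value".
function parseTimeOnly(value) {
  // Match HH:MM or HH:MM:SS with no date part and no timezone offset.
  const m = /^(\d{2}):(\d{2})(?::(\d{2}))?$/.exec(value);
  if (!m) return null;
  const [, h, min, s] = m;
  // Anchor the time to a fixed date (1970-01-01 UTC) so it can be represented
  // as a Date without inventing a timezone.
  return new Date(Date.UTC(1970, 0, 1, Number(h), Number(min), Number(s || 0)));
}

console.log(parseTimeOnly("13:55").toISOString()); // "1970-01-01T13:55:00.000Z"
```

Anchoring to an arbitrary fixed date is one common workaround; the "Ignore time zones" option mentioned below addresses the same mismatch from the UI side.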

**Expected behavior**
Displaying the table without error.

**Desktop (please complete the following information):**
 - OS: Windows
 - Browser Chrome
 - Version 108

melohagan commented 1 year ago

Hey @windowsdeveloperwannabe

Thanks for raising this. If you edit the columns and tick Ignore time zones and refresh the page, then that will sort it.

With that said, it would be nice if, on table fetch, we could detect values that have no timezone present and enable this option by default.
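That detection could be a simple shape check on a sample value. A sketch of one possible heuristic (`shouldIgnoreTimezones` is an invented name, not a Budibase API):

```javascript
// Hypothetical heuristic: on table fetch, inspect a sample value from the
// column and default "Ignore time zones" to on when the value carries no
// timezone information.
function shouldIgnoreTimezones(sample) {
  if (typeof sample !== "string") return false;
  // Bare time such as "13:55" or "13:55:00" (Postgres TIME).
  if (/^\d{2}:\d{2}(:\d{2})?$/.test(sample)) return true;
  // Datetime with no trailing "Z" and no +HH:MM / -HH:MM offset.
  if (/^\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}(:\d{2}(\.\d+)?)?$/.test(sample)) {
    return true;
  }
  return false;
}

console.log(shouldIgnoreTimezones("13:55:00"));             // true
console.log(shouldIgnoreTimezones("2023-10-19T00:00:00.000Z")); // false
```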

[Screenshot 2022-12-15 at 16:18:52]

[Screenshot 2022-12-15 at 16:19:08]

xtrojak commented 1 year ago

@melohagan I'm having the same problem. I have basically the same setup, except I'm running version 2.13.10 on macOS.

However, when I try to display a table containing a column of type time [ without time zone ] (as described here), it does not even show me the column names, so I can't apply the approach you proposed. Any other workarounds?

EDIT: Interestingly, it works when displaying views!


Shadowfita commented 1 year ago

I am getting this error message when trying to import a CSV file with ISO timestamps into a Budibase DB table.

melohagan commented 1 year ago

Hey @Shadowfita

Could you provide the CSV with some sample data that doesn't work, and I can try to reproduce it.

Shadowfita commented 1 year ago

Hi @melohagan ! Thank you, much appreciated.

The issue occurs with the "Date" column. If I delete the column, it imports fine. I've tried various formats: the ISO standard, DD-MM-YYYY, etc.

```
Date,Link or File,Topic,Source,Status,Related theme,By when,Outcome,LPST
2023-10-19T00:00:00.000Z,,Recruitment,Email,Resolved,People,WEEK 7,Completed ,Leon
2023-10-19T00:00:00.000Z,,Recruitment,Email,Resolved,People,WEEK 7,Completed ,Leon
2023-10-19T00:00:00.000Z,,Recruitment,Email,On-hold,People,WEEK 7,,Leon
2023-10-19T00:00:00.000Z,,Travel / accommodation,Email,Resolved,People,WEEK 8,,Leon
2023-10-20T00:00:00.000Z,,Travel / accommodation,Email,Resolved,People,WEEK 4,Completed ,Leon
2023-10-20T00:00:00.000Z,,Admin support,TEAMS Meeting ,Resolved,Process,ASAP,Completed ,Leon
```
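The timestamps in that "Date" column are plain ISO 8601 strings, which JavaScript's `Date` parses reliably. A minimal validity check of the kind an importer could apply (`isValidDatetime` is an invented helper, not the actual Budibase importer API):

```javascript
// Hypothetical sketch: accept any string that JavaScript's Date can parse to
// a real timestamp; an unparseable input yields an "Invalid Date" whose epoch
// value is NaN.
function isValidDatetime(value) {
  return !Number.isNaN(new Date(value).getTime());
}

console.log(isValidDatetime("2023-10-19T00:00:00.000Z")); // true
console.log(isValidDatetime("not a date"));               // false
```

If a check like this rejects the rows above, the problem is more likely in how the CSV cells are split or quoted than in the timestamp format itself.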

Shadowfita commented 1 year ago

@melohagan I've just realised I was running a fairly old version of Budibase. I think it's safe to assume this isn't an issue anymore. I will confirm later today.

Shadowfita commented 1 year ago

@melohagan Scratch that, I've tried importing a CSV with datetimes into a table on the latest Budibase version and it's unable to validate the column.

Importing the following rows into table:

```
2023-10-30T00:00:00.000Z
2023-11-01T00:00:00.000Z
2023-11-01T00:00:00.000Z
2023-11-02T00:00:00.000Z
2023-11-02T00:00:00.000Z
2023-11-03T00:00:00.000Z
```


melohagan commented 1 year ago

Hey @Shadowfita

Are those newlines in the CSV at all? What if you tried:

```
2023-10-30T00:00:00.000Z
2023-11-01T00:00:00.000Z
2023-11-01T00:00:00.000Z
2023-11-02T00:00:00.000Z
2023-11-02T00:00:00.000Z
2023-11-03T00:00:00.000Z
```

Shadowfita commented 1 year ago

Hey @melohagan, apologies, there are no line breaks; that was just a formatting issue I had with the code block.

melohagan commented 1 year ago

Hey @Shadowfita

Not seeing this issue with the data provided:

[Screenshot 2023-12-01 at 09:14:02]

[Screenshot 2023-12-01 at 09:17:17]

melohagan commented 1 year ago

Feel free to provide your SQL schema and I can try to reproduce this more accurately.