duckdb / duckdb-java

DuckDB JDBC Driver
https://duckdb.org/docs/api/java.html
MIT License

DuckDB in DBeaver (JDBC) doesn't show information_schema tables/views #74

Open · Alex-Monahan opened this issue 2 years ago

Alex-Monahan commented 2 years ago

Hi Folks!

As pathfinding for a future blog post, I got DBeaver set up with DuckDB. It is super slick - it took like 2 minutes. I think it will make a great little "speed run" tutorial video.

One small gap is that while DBeaver shows the information_schema, it does not have the tables, views, or functions populated. I think those could come in handy for folks when they want to explore their DB's metadata and could make for a nice initial/demo experience as well.

This screenshot shows that the tables and views sections are empty (with no drill-down arrow next to them) for the metadata schemas: [screenshot]
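For anyone who wants to reproduce this outside of DBeaver, here is a minimal JDBC sketch (illustrative only; the class name is made up, and it assumes the DuckDB JDBC driver is on the classpath) that prints what DatabaseMetaData.getTables reports for information_schema, which is roughly the metadata call a client like DBeaver relies on to populate its tree:

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class InformationSchemaCheck {
    public static void main(String[] args) throws Exception {
        // In-memory database; use "jdbc:duckdb:/path/to/file.db" for a file-based one.
        try (Connection conn = DriverManager.getConnection("jdbc:duckdb:")) {
            DatabaseMetaData md = conn.getMetaData();
            // Ask for both tables and views inside the information_schema schema.
            try (ResultSet rs = md.getTables(null, "information_schema", "%",
                                             new String[] {"TABLE", "VIEW"})) {
                while (rs.next()) {
                    System.out.printf("%s.%s (%s)%n",
                            rs.getString("TABLE_SCHEM"),
                            rs.getString("TABLE_NAME"),
                            rs.getString("TABLE_TYPE"));
                }
            }
        }
    }
}
```

If this prints no rows, the gap is in what the driver reports; if it does print rows, the gap is more likely on the DBeaver side.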

Thanks! -Alex

Mytherin commented 2 years ago

Are views shown in general? The information_schema/pg_catalog schemas only contain views, so perhaps the problem is with that.
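(A quick way to answer that question from code, as an illustrative sketch rather than anything from the driver's test suite: create a throwaway view over an in-memory database and see whether the JDBC metadata API reports it at all.)

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ViewVisibilityCheck {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:duckdb:");
             Statement stmt = conn.createStatement()) {
            // A throwaway view, just to see whether views are reported at all.
            stmt.execute("CREATE VIEW v1 AS SELECT 42 AS answer");
            DatabaseMetaData md = conn.getMetaData();
            // Request only VIEW entries across all schemas.
            try (ResultSet rs = md.getTables(null, null, "%", new String[] {"VIEW"})) {
                while (rs.next()) {
                    System.out.println(rs.getString("TABLE_SCHEM") + "."
                            + rs.getString("TABLE_NAME"));
                }
            }
        }
    }
}
```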

Is this using the JDBC connector, or another connector?

Alex-Monahan commented 2 years ago

Yes, views can be shown; I just didn't create one in my example, sorry! It's using the JDBC connector.

Mause commented 2 years ago

@Alex-Monahan can you check if this is still the case?

Alex-Monahan commented 2 years ago

> @Alex-Monahan can you check if this is still the case?

I'll have a look!

Alex-Monahan commented 2 years ago

Unfortunately I still see the same behavior. I updated to the latest DBeaver and to the 0.5.1 driver. I tested with an in-memory DB, a file-based one that I created in DBeaver, and a file-based one that I created on the CLI (0.5.1). Do you want me to test with a dev driver? Where can I find one? I didn't see it in GitHub Actions after poking around for a little while...

A random thought - is it possible that DBeaver should be showing a "Functions" category in addition to tables, views, procedures, etc.? Could that be where all our information_schema content is hiding?

Also, when I tested the connection in DBeaver, it would hold the connection open and prevent me from using the database until I closed and reopened DBeaver. Do you think that's an issue on our side, or should I file a bug on the DBeaver side for them to close the connection correctly/differently after the test connection?
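On the held-open connection: DuckDB generally allows only one process to have a database file open for writing, so a test connection that is never closed keeps the file unavailable to other processes. A minimal sketch of the pattern a client is expected to follow (illustrative only; the file path and query are made up), using try-with-resources so the connection is closed even if the test query fails:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class TestConnectionAndClose {
    public static void main(String[] args) throws Exception {
        // try-with-resources guarantees close() even if the test query throws,
        // which releases the database file for other processes.
        try (Connection conn = DriverManager.getConnection("jdbc:duckdb:/tmp/example.db");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 42")) {
            rs.next();
            System.out.println("connection ok, got: " + rs.getInt(1));
        }
        // At this point the file is no longer held open by this process.
    }
}
```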

digitalghost-dev commented 1 year ago

I think I'm having the same issue. I installed the DuckDB CLI and created a table with a couple of rows. I then installed DBeaver and was able to connect to DuckDB very easily but the table I made in the CLI is not showing up anywhere in DBeaver.

Starlord22 commented 1 year ago

Looks like I am having the same issue. Does anyone know how to resolve it?

franemar commented 1 year ago

With DBeaver version 23.04 and DuckDB driver v0.7.1, custom tables now show up in the UI: [screenshot]

information_schema is still empty in the UI tree, but its views are available from a SQL script: [screenshot] [screenshot]

So this driver version seems to be working fine, and what remains is a pending implementation in DBeaver's UI.
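To make that check reproducible from code, here is a small JDBC sketch (illustrative only; the table name t1 is made up) that queries one of the information_schema views directly; it returns rows regardless of what the UI tree shows:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class QueryInformationSchemaDirectly {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:duckdb:");
             Statement stmt = conn.createStatement()) {
            // A throwaway table so information_schema has something to report.
            stmt.execute("CREATE TABLE t1 (id INTEGER, name VARCHAR)");
            // Querying the metadata view directly works even when a UI tree shows it as empty.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT table_name, column_name, data_type "
                    + "FROM information_schema.columns WHERE table_name = 't1'")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + "." + rs.getString(2)
                            + " : " + rs.getString(3));
                }
            }
        }
    }
}
```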

jalvarezcabada commented 1 year ago

Hi everyone, I was testing DuckDB and got the same issue, but in DataGrip version 2023.2: I can't see the tables in the main schema. Sorry if this use case doesn't belong in this thread :smile: [screenshot]

szarnyasg commented 1 year ago

@jalvarezcabada please open a separate issue about DataGrip-related problems.

As shown in https://github.com/duckdb/duckdb-java/issues/74, DBeaver now displays DuckDB tables correctly.

LonwoLonwo commented 8 months ago

> As shown in https://github.com/duckdb/duckdb-java/issues/74, DBeaver now displays DuckDB tables correctly.

Hello @szarnyasg

The initial issue was about the information_schema tables/views, not about custom tables.

[Screenshot: DBeaver Ultimate 24.0.0, duck_010.db]

The issue is still here, and I want to know how we can fix it.

> I think those could come in handy for folks when they want to explore their DB's metadata

Because I am one of these people :D

Please reopen this issue.