psycopg / psycopg2

PostgreSQL database adapter for the Python programming language
https://www.psycopg.org/

Issues using [None, None] #1702

Closed: bisoldi closed this issue 3 months ago

bisoldi commented 3 months ago

I'm trying to use the example SQL and dictionary parameters below to insert in bulk.

INSERT INTO table_name (col1, col2) SELECT p.col1, p.col2 FROM UNNEST(%(name)s, %(reg)s) AS p(col1, col2)

params: dict = {'name': ['mike', 'joe'], 'reg': ['012345', '3948273']}

The problem comes in when the reg value for the people being inserted is None. My code handles it, creating the following dictionary:

params: dict = {'name': ['mike', 'joe'], 'reg': [None, None]}

However, when it gets serialized / mogrify'd, it interpolates into:

INSERT INTO table_name (col1, col2) SELECT p.col1, p.col2 FROM UNNEST(ARRAY['mike', 'joe'], '{NULL, NULL}') AS p(col1, col2)

My actual query has quite a few more columns, each of which could be empty, and I don't want to write separate SQL or dynamically generate the SQL based on which fields are available.

Is there a way to force the [None, None] into ARRAY[NULL, NULL] instead of the stringified set?
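
For reference, a minimal sketch that reproduces the serialization above (the connection DSN and table are placeholders):

import psycopg2

# Placeholder connection; any reachable database will do for mogrify().
conn = psycopg2.connect("dbname=test")
cur = conn.cursor()

sql = """
INSERT INTO table_name (col1, col2)
SELECT p.col1, p.col2
FROM UNNEST(%(name)s, %(reg)s) AS p(col1, col2)
"""

params = {'name': ['mike', 'joe'], 'reg': [None, None]}

# mogrify() shows the query psycopg2 would send: the all-None list is
# rendered as the string literal '{NULL,NULL}', not as ARRAY[NULL, NULL].
print(cur.mogrify(sql, params).decode())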

dvarrazzo commented 3 months ago

Hello,

Nulls in arrays are a tricky matter, see #706 and #325. The ARRAY[] construct fails in several situations if it contains only nulls, or arrays of only nulls.

You can solve your problem by adding an explicit cast, such as %(reg)s::text[].
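
For instance, applied to the query above (the text[] element type is an assumption; use the real column types):

sql = """
INSERT INTO table_name (col1, col2)
SELECT p.col1, p.col2
FROM UNNEST(%(name)s::text[], %(reg)s::text[]) AS p(col1, col2)
"""
# With the cast attached, the interpolated form '{NULL,NULL}'::text[]
# is a valid PostgreSQL array literal, so the server can parse it even
# though it is not an ARRAY[...] construct.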

Psycopg 3 might work out of the box.
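
For comparison, a minimal sketch of the same statement under Psycopg 3, whose server-side parameter binding may handle all-None lists without a cast (the DSN is a placeholder):

import psycopg  # Psycopg 3

with psycopg.connect("dbname=test") as conn:
    # Psycopg 3 sends parameters separately from the SQL text instead of
    # interpolating them client-side, so the all-None list is not turned
    # into a string literal before it reaches the server.
    conn.execute(
        """
        INSERT INTO table_name (col1, col2)
        SELECT p.col1, p.col2
        FROM UNNEST(%(name)s, %(reg)s) AS p(col1, col2)
        """,
        {'name': ['mike', 'joe'], 'reg': [None, None]},
    )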

bisoldi commented 3 months ago

I appreciate the response. I had also tried providing an explicit cast %(reg)s::TEXT[], but it still serialized the list as the stringified set.

I'll give it another try...

Thanks!