porsager / postgres

Postgres.js - The Fastest full featured PostgreSQL client for Node.js, Deno, Bun and CloudFlare
The Unlicense

Interpolating Values into DB Logs #473

Closed harrisoncramer closed 1 year ago

harrisoncramer commented 1 year ago

First off, thank you for the awesome package!

Is it possible to configure the SQL connection to enable more human readable logging? Right now, my PostgreSQL logs show a long list of parameters and a series of $ signs where they will be injected into the query.

However, this is hard to read; it ends up looking like this:

2022-09-01 17:30:19.226 UTC [99] LOG:  execute b3njhn10dzn5:
              WITH cip_codes AS (
                SELECT DISTINCT cip_4,
                job_category_label
                FROM c2c.crosswalk
                WHERE job_category_label IN ($1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12,$13,$14,$15,$16,$17,$18,$19,$20,$21,$22,$23)
              ), grads_data AS (
                SELECT
                  SUM(ctotalt_2019) AS grads_2019,
                  SUM(ctotalt_2014) AS grads_2014,
                  field.unitid,
                  field.region,
                  field.cip_4
                FROM c2c.field
                JOIN c2c.institution ON institution.unitid = field.unitid
                WHERE field.cip_4 IN (SELECT cip_4 FROM cip_codes)
                AND pctpell >= $24 AND pctpell <= $25
                AND ugds_women >= $26 AND ugds_women <= $27
                AND par_ed_pct_1stgen >= $28 AND par_ed_pct_1stgen <= $29
                AND ugds_white <= $30 AND ugds_white >= $31
                AND kq5_cond_parq1 >= $32 AND mr_kq5_pq1 <= $33
                GROUP BY cip_4, field.region, field.unitid
              )
              SELECT
                SUM(grads_data.grads_2019) AS graduates,
                SUM(grads_data.grads_2019 - grads_data.grads_2014) / SUM(NULLIF(grads_2014, 0)) AS growth_rate,
                region
              FROM grads_data
              JOIN cip_codes ON cip_codes.cip_4 = grads_data.cip_4
              GROUP BY region

2022-09-01 17:30:19.226 UTC [99] DETAIL:  parameters: $1 = 'Management', $2 = 'Computer and Mathematical', $3 = 'Arts, Design, Entertainment, Sports, and Media', $4 = 'Food Preparation and Serving Related', $5 = 'Military Specific', $6 = 'Protective Service', $7 = 'Production', $8 = 'Sales and Related', $9 = 'Healthcare Practitioners and Technical', $10 = 'Life, Physical, and Social Science', $11 = 'Transportation and Material Moving', $12 = 'Personal Care and Service', $13 = 'Education Instruction and Library', $14 = 'Legal', $15 = 'Installation, Maintenance, and Repair', $16 = 'Farming, Fishing, and Forestry', $17 = 'Architecture and Engineering', $18 = 'Construction and Extraction', $19 = 'Building and Grounds Cleaning and Maintenance', $20 = 'Healthcare Support', $21 = 'Business and Financial Operations', $22 = 'Community and Social Service', $23 = 'Office and Administrative Support', $24 = '0', $25 = '1', $26 = '0', $27 = '1', $28 = '0', $29 = '1', $30 = '1', $31 = '0', $32 = '0', $33 = '100'

I've asked the same question on Reddit here, and one user suggested that I stop using parameterized queries. Is that possible with this library, or is there another way to achieve this?

porsager commented 1 year ago

You're welcome and thanks :)

Not using parameterized queries wouldn't make any sense, and would only bring downsides in my opinion. You can do it using sql.unsafe, but I wouldn't recommend it. If you absolutely need readable output, I think it's better to transform the log itself and replace the placeholders with the parameter values.
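As a rough illustration of the log-transform approach, here is a minimal sketch (not part of Postgres.js) that takes a query string as it appears in the `LOG: execute` line and an array of values from the `DETAIL: parameters` line, and substitutes them back in for display. The function name and escaping rules are my own assumptions; the result is for human reading only and should never be sent back to the database.

```javascript
// Hypothetical helper: interpolate parameter values back into a logged
// query for display purposes only -- never execute the resulting string.
// `params` is assumed to be ordered $1..$n, as in the DETAIL log line.
function interpolateForLog(query, params) {
  return query.replace(/\$(\d+)/g, (match, n) => {
    const value = params[Number(n) - 1];
    if (value === undefined) return match; // leave unknown placeholders as-is
    // Quote strings the way the Postgres log does; naive escaping, humans only.
    return typeof value === 'number'
      ? String(value)
      : `'${String(value).replace(/'/g, "''")}'`;
  });
}

console.log(interpolateForLog(
  'SELECT * FROM c2c.crosswalk WHERE job_category_label IN ($1,$2) AND pctpell >= $3',
  ['Management', 'Legal', 0]
));
// SELECT * FROM c2c.crosswalk WHERE job_category_label IN ('Management','Legal') AND pctpell >= 0
```

You could wire something like this into whatever ships or tails your Postgres logs, pairing each `execute` line with its following `DETAIL: parameters` line, so the parameterized queries stay intact on the wire while the readable form only exists in your log viewer.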

harrisoncramer commented 1 year ago

Sounds good, thank you!