brianc / node-postgres

PostgreSQL client for node.js.
https://node-postgres.com
MIT License

pg-protocol can throw an uncatchable error "Cannot create a string longer than 0x1fffffe8 characters" when reading large data #2653

Status: Open · opened by laurent22 2 years ago

laurent22 commented 2 years ago

Using the "pg" package 8.5.1

To replicate the issue, save some large data as a blob into the database (in my case, 340 MB was enough to trigger the bug), then try to read the data back.
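A minimal repro sketch (the table name, column, and fill byte are assumptions for illustration, not from the original report):

```ts
// Hypothetical repro, assuming: CREATE TABLE blobs (id serial PRIMARY KEY, data bytea)
import { Pool } from 'pg'

const pool = new Pool()

async function main(): Promise<void> {
  // ~340 MB of raw bytes. In the wire protocol's text mode, bytea is
  // hex-encoded ("\x" plus 2 hex chars per byte), so this comes back as a
  // ~680 MB string, past Node's 0x1fffffe8-character (~512 MiB) limit.
  const big = Buffer.alloc(340 * 1024 * 1024, 0xab)
  await pool.query('INSERT INTO blobs (data) VALUES ($1)', [big])

  try {
    await pool.query('SELECT data FROM blobs LIMIT 1')
  } catch (err) {
    // Never reached: the error is thrown inside the socket's 'data'
    // handler and crashes the process instead of rejecting this promise.
    console.error(err)
  }
}

main()
```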

It will throw this uncatchable error that crashes the application:

Error: Cannot create a string longer than 0x1fffffe8 characters
    at Buffer.utf8Slice (<anonymous>)
    at Object.slice (node:buffer:593:37)
    at Buffer.toString (node:buffer:811:14)
    at BufferReader.string (/home/joplin/packages/server/node_modules/pg-protocol/src/buffer-reader.ts:35:32)
    at Parser.parseDataRowMessage (/home/joplin/packages/server/node_modules/pg-protocol/src/parser.ts:274:51)
    at Parser.handlePacket (/home/joplin/packages/server/node_modules/pg-protocol/src/parser.ts:172:21)
    at Parser.parse (/home/joplin/packages/server/node_modules/pg-protocol/src/parser.ts:101:30)
    at Socket.<anonymous> (/home/joplin/packages/server/node_modules/pg-protocol/src/index.ts:7:48)
    at Socket.emit (node:events:390:28)
    at addChunk (node:internal/streams/readable:315:12)

As a test, I tried to make it reject the promise when parser.parse(buffer, callback) throws an error in this function. I can indeed catch the error at that point, but rejecting the promise doesn't help for some reason and the error is still uncatchable:

https://github.com/brianc/node-postgres/blob/947ccee346f0d598e135548e1e4936a9a008fc6f/packages/pg-protocol/src/index.ts#L5
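For reference, this is roughly what that experiment looks like: a paraphrase of pg-protocol's exported parse() helper with the try/catch added (names are approximate, not the verbatim source):

```ts
import { Parser } from './parser'
import { BackendMessage } from './messages'

export function parse(
  stream: NodeJS.ReadableStream,
  callback: (msg: BackendMessage) => void
): Promise<void> {
  const parser = new Parser()
  return new Promise((resolve, reject) => {
    stream.on('data', (buffer: Buffer) => {
      try {
        parser.parse(buffer, callback)
      } catch (err) {
        // The throw is caught here, but this promise is not the one the
        // query caller is awaiting, so rejecting it doesn't surface the
        // error and the process still crashes.
        reject(err)
      }
    })
    stream.on('end', () => resolve())
  })
}
```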

Any idea what might be the issue and how to fix it?

laurent22 commented 2 years ago

I think it may be related to this TODO here, as it tries to read a binary blob as UTF-8:

https://github.com/brianc/node-postgres/blob/947ccee346f0d598e135548e1e4936a9a008fc6f/packages/pg-protocol/src/buffer-reader.ts#L6

And somewhat related: why does it try to read all fields as strings, even when the field is a binary blob?

https://github.com/brianc/node-postgres/blob/947ccee346f0d598e135548e1e4936a9a008fc6f/packages/pg-protocol/src/parser.ts#L288
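For context, every non-null DataRow field goes through the same string path. Here is a standalone sketch of that parsing logic (paraphrased from the linked source and simplified, not verbatim):

```ts
// Sketch of text-mode DataRow parsing: an int16 field count, then for each
// field an int32 byte length followed by the field bytes.
function parseDataRow(bytes: Buffer): (string | null)[] {
  let offset = 0
  const fieldCount = bytes.readInt16BE(offset)
  offset += 2
  const fields: (string | null)[] = new Array(fieldCount)
  for (let i = 0; i < fieldCount; i++) {
    const len = bytes.readInt32BE(offset)
    offset += 4
    if (len === -1) {
      fields[i] = null
    } else {
      // Every non-null field, bytea included, is decoded as UTF-8. This is
      // the call that throws once a field exceeds 0x1fffffe8 characters.
      fields[i] = bytes.toString('utf-8', offset, offset + len)
      offset += len
    }
  }
  return fields
}
```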

sehrope commented 2 years ago

And somewhat related: why does it try to read all fields as strings, even when the field is a binary blob?

The PostgreSQL wire protocol has two modes for transferring fields: text and binary.

The text mode is a string representation of each data type, such as "1234" to represent the number 1234 or "t" to represent the value true. The binary mode is more compact but is not documented anywhere outside of the server source.
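To make the two formats concrete, a small illustration (not library code) of how an int4 value and a bytea value travel in each mode:

```ts
// Text format: the ASCII representation of the value 1234.
const textInt = Buffer.from('1234', 'ascii') // bytes 0x31 0x32 0x33 0x34

// Binary format: type-specific; for int4, a 4-byte big-endian integer.
const binaryInt = Buffer.alloc(4)
binaryInt.writeInt32BE(1234) // bytes 0x00 0x00 0x04 0xd2

// For bytea, the text format is hex: "\x" plus 2 hex characters per byte,
// so a 340 MB blob becomes a ~680 MB string, which is what overflows
// Node's string limit in this issue.
const textBytea = Buffer.from('\\xdeadbeef', 'ascii') // 10 characters
const binaryBytea = Buffer.from([0xde, 0xad, 0xbe, 0xef]) // 4 bytes

console.log(textInt, binaryInt, textBytea, binaryBytea)
```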

This driver only supports reading text mode, and that class only handles text-format responses: https://github.com/brianc/node-postgres/blob/947ccee346f0d598e135548e1e4936a9a008fc6f/packages/pg-protocol/src/parser.ts#L87

Any idea what might be the issue and how to fix it?

This is still a bug though, as throwing uncatchable internal errors is never acceptable. If those long values cannot be deserialized as a string, there should be a length check and a proper error bubbled up rather than a crash.
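A sketch of that kind of guard (an assumption about the shape of a fix, not the library's actual code), using Node's documented buffer.constants.MAX_STRING_LENGTH:

```ts
import { constants } from 'node:buffer'

// Validate the requested length before decoding, so the failure is a
// descriptive Error that the parser can route to the pending query's
// promise instead of an internal crash.
function checkedString(buf: Buffer, offset: number, length: number): string {
  if (length > constants.MAX_STRING_LENGTH) {
    throw new Error(
      `cannot decode a ${length}-byte field as a string; Node's maximum ` +
        `string length is ${constants.MAX_STRING_LENGTH} characters`
    )
  }
  return buf.toString('utf-8', offset, offset + length)
}
```

Note the guard alone is only half the fix: the error still has to be delivered to the caller's promise rather than thrown from the socket's 'data' handler.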

Even when this is fixed, you likely don't want to ship that much data back and forth in one piece. There are many other options, including reading the data in slices (splitting it into smaller bytea chunks, as sketched below) or using the large object API (https://www.postgresql.org/docs/current/lo-funcs.html).
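A sketch of the slicing approach (table and column names are hypothetical), using PostgreSQL's substring(), which is 1-indexed and works on bytea values:

```ts
import { Pool } from 'pg'

const CHUNK = 16 * 1024 * 1024 // 16 MB of raw bytes per round trip

async function readBlobInChunks(pool: Pool, id: number): Promise<Buffer> {
  const lenResult = await pool.query(
    'SELECT octet_length(data) AS len FROM blobs WHERE id = $1',
    [id]
  )
  const len: number = lenResult.rows[0].len
  const parts: Buffer[] = []
  for (let offset = 0; offset < len; offset += CHUNK) {
    const chunkResult = await pool.query(
      'SELECT substring(data FROM $1 FOR $2) AS part FROM blobs WHERE id = $3',
      [offset + 1, CHUNK, id]
    )
    parts.push(chunkResult.rows[0].part) // node-postgres returns bytea as a Buffer
  }
  return Buffer.concat(parts)
}
```

Each 16 MB chunk hex-encodes to roughly 32 MB of text on the wire, comfortably below the string limit.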

brianc commented 2 years ago

yah definitely a bug, uncatchable errors are always no bueno. I'll try to get this fixed. Everything @sehrope said is :+1: as always as well. :heart:


Joyjk commented 2 years ago

npm run serve

> daytide-webapi@0.1.0 serve
> vue-cli-service serve

INFO  Starting development server...
(webpack build progress elided; the warning "When setting useBuiltIns: 'usage', polyfills are automatically imported when needed. Please remove the direct import of @babel/polyfill or use useBuiltIns: 'entry' instead." repeats several times during the build)

node:buffer:593
    slice: (buf, start, end) => buf.utf8Slice(start, end),
                                    ^

Error: Cannot create a string longer than 0x1fffffe8 characters
    at Object.slice (node:buffer:593:37)
    at Buffer.toString (node:buffer:811:14)
    at asString (H:\Laravel\Vue Nuxt Js\daytide-webapi\node_modules\webpack\lib\NormalModule.js:31:14)
    at H:\Laravel\Vue Nuxt Js\daytide-webapi\node_modules\webpack\lib\NormalModule.js:347:39
    at H:\Laravel\Vue Nuxt Js\daytide-webapi\node_modules\loader-runner\lib\LoaderRunner.js:373:3
    at iterateNormalLoaders (H:\Laravel\Vue Nuxt Js\daytide-webapi\node_modules\loader-runner\lib\LoaderRunner.js:214:10)
    at Array.<anonymous> (H:\Laravel\Vue Nuxt Js\daytide-webapi\node_modules\loader-runner\lib\LoaderRunner.js:205:4)
    at Storage.finished (H:\Laravel\Vue Nuxt Js\daytide-webapi\node_modules\enhanced-resolve\lib\CachedInputFileSystem.js:55:16)
    at H:\Laravel\Vue Nuxt Js\daytide-webapi\node_modules\enhanced-resolve\lib\CachedInputFileSystem.js:91:9
    at H:\Laravel\Vue Nuxt Js\daytide-webapi\node_modules\graceful-fs\graceful-fs.js:123:16
    at FSReqCallback.readFileAfterClose [as oncomplete] (node:internal/fs/read_file_context:68:3) {
  code: 'ERR_STRING_TOO_LONG'
}


What is the problem with my old Vue.js project? It is not running.

laurent22 commented 2 years ago

What is the problem

Is it not clear from the thread above that you, presumably, just read?

xqin1 commented 11 months ago

Any updates on this bug fix? With the latest version, there is still an uncatchable error with large query results. Thanks.

alxndrsn commented 4 days ago

Just ran into this with v8.8.0. Hopefully there's a way to avoid converting big (> 250 MB) binary fields into JavaScript strings.