JaneJeon / blink

Modern, lightweight, planet-scale link shortener for teams 🎉
https://docs.blink.rest
GNU Affero General Public License v3.0

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory #793

Closed NicoloLazzaroni closed 1 year ago

NicoloLazzaroni commented 1 year ago

Hello,

When running dockerized Blink I get this error when shortening certain URLs that have no metadata (in this case, a URL that points to a download page):

{"level":30,"time":1682532313255,"pid":7,"hostname":"CENSORED","msg":"Scraping https://openvpn.net/downloads/openvpn-connect-v3-macos.dmg for metadata..."}

<--- Last few GCs --->

[7:0x7ffa1fe3c2e0]   308077 ms: Scavenge 4034.8 (4117.4) -> 4032.7 (4127.9) MB, 30.2 / 0.0 ms  (average mu = 0.785, current mu = 0.725) allocation failure; 
[7:0x7ffa1fe3c2e0]   308176 ms: Scavenge 4044.6 (4127.9) -> 4041.4 (4130.7) MB, 52.1 / 0.0 ms  (average mu = 0.785, current mu = 0.725) allocation failure; 
[7:0x7ffa1fe3c2e0]   312267 ms: Mark-sweep 4048.0 (4130.7) -> 4041.1 (4144.9) MB, 4062.7 / 0.0 ms  (average mu = 0.565, current mu = 0.135) allocation failure; scavenge might not succeed

<--- JS stacktrace --->

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory

I also found that if you insert an already existing custom name when creating a short URL, you get a mysql error instead of a user-friendly error message.

Thank you

Best Regards

JaneJeon commented 1 year ago

Acknowledged both, will look into it.

Though my first hunch is that the former error has nothing to do with not having metadata but rather with the metadata scraping process actually downloading the .dmg file... o.O

JaneJeon commented 1 year ago

As for the former, something like this: https://github.com/sindresorhus/got/blob/main/documentation/examples/advanced-creation.js#L43 should do the trick.
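The linked got example enforces a cap on how much of a response body gets downloaded. A minimal self-contained sketch of the same idea follows — the function name, the 1 MiB limit, and the wiring comment are illustrative assumptions, not Blink's actual code:

```javascript
// Sketch: cap how many bytes a metadata scrape may buffer, so that
// shortening a link to a large binary (e.g. a .dmg) aborts early
// instead of exhausting the JS heap. Names and limits are illustrative.

const MAX_SCRAPE_BYTES = 1024 * 1024; // 1 MiB cap (arbitrary choice)

// Returns a stateful guard; call it with each received chunk.
// Throws once the cumulative size exceeds the cap.
function makeSizeGuard(maxBytes = MAX_SCRAPE_BYTES) {
  let received = 0;
  return chunk => {
    received += chunk.length;
    if (received > maxBytes) {
      throw new Error(`Scrape aborted: response exceeded ${maxBytes} bytes`);
    }
    return received;
  };
}

// With got, the equivalent check would hang off the response stream's
// 'data' (or 'downloadProgress') events, destroying the request once
// the guard trips, as the linked advanced-creation example does.
```

The key point is that the check runs per chunk while streaming, so the process never holds the full file in memory.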

As for the latter issue, I'm gonna look into it, though a stack trace/logs would help as well, if you still have them/can reproduce it reliably.

NicoloLazzaroni commented 1 year ago

Sure,

This is the exact error I get when adding a different URL with an already existing brand link (screenshot attached: Screenshot 2023-04-27 105644).

This is the console log: {"level":40,"time":1682585786525,"pid":7,"hostname":"6a3ee1f4eafe","req":{"method":"POST","url":"/api/links","query":{},"params":{},"headers":{"host":"CENSORED","user-agent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:109.0) Gecko/20100101 Firefox/112.0","accept":"*/*","accept-language":"it-IT,it;q=0.8,en-US;q=0.5,en;q=0.3","accept-encoding":"gzip, deflate, br","content-type":"application/json","origin":"CENSORED","cookie":"connect.sid=s%3ApGKvLJTM8nHxgnA1eiGIefHVGg4PEQlX.ukuLu%2FGyREDjuiTlveUi4jtoZIHyec40N%2FmY227LDyc","sec-fetch-dest":"empty","sec-fetch-mode":"cors","sec-fetch-site":"same-origin","dnt":"1","sec-gpc":"1","x-forwarded-proto":"https","x-forwarded-for":"CENSORED, CENSORED","x-forwarded-host":"CENSORED","x-forwarded-server":"CENSORED","content-length":"49","connection":"Keep-Alive"},"remoteAddress":"::ffff:172.24.0.1","remotePort":46820},"err":{"type":"UniqueViolationError","message":"insert into \"links\" (\"created_at\", \"creator_id\", \"hash\", \"meta\", \"original_url\", \"updated_at\") values ($1, $2, $3, $4, $5, $6) returning * - duplicate key value violates unique constraint \"links_hash_unique\"","stack":"UniqueViolationError: insert into \"links\" (\"created_at\", \"creator_id\", \"hash\", \"meta\", \"original_url\", \"updated_at\") values ($1, $2, $3, $4, $5, $6) returning * - duplicate key value violates unique constraint \"links_hash_unique\"\n at wrapError (/home/node/node_modules/db-errors/lib/dbErrors.js:19:14)\n at handleExecuteError (/home/node/node_modules/objection/lib/queryBuilder/QueryBuilder.js:1123:32)\n at AuthZQueryBuilder.execute (/home/node/node_modules/objection/lib/queryBuilder/QueryBuilder.js:449:20)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)","name":"UniqueViolationError","nativeError":{"type":"DatabaseError","message":"insert into \"links\" (\"created_at\", \"creator_id\", \"hash\", \"meta\", \"original_url\", \"updated_at\") values ($1, $2, $3, $4, $5, 
$6) returning * - duplicate key value violates unique constraint \"links_hash_unique\"","stack":"error: insert into \"links\" (\"created_at\", \"creator_id\", \"hash\", \"meta\", \"original_url\", \"updated_at\") values ($1, $2, $3, $4, $5, $6) returning * - duplicate key value violates unique constraint \"links_hash_unique\"\n at Parser.parseErrorMessage (/home/node/node_modules/pg-protocol/dist/parser.js:287:98)\n at Parser.handlePacket (/home/node/node_modules/pg-protocol/dist/parser.js:126:29)\n at Parser.parse (/home/node/node_modules/pg-protocol/dist/parser.js:39:38)\n at Socket.<anonymous> (/home/node/node_modules/pg-protocol/dist/index.js:11:42)\n at Socket.emit (node:events:513:28)\n at addChunk (node:internal/streams/readable:324:12)\n at readableAddChunk (node:internal/streams/readable:297:9)\n at Readable.push (node:internal/streams/readable:234:10)\n at TCP.onStreamRead (node:internal/stream_base_commons:190:23)","length":199,"name":"error","severity":"ERROR","code":"23505","detail":"Key (hash)=(test) already exists.","schema":"public","table":"links","constraint":"links_hash_unique","file":"nbtinsert.c","line":"663","routine":"_bt_check_unique"},"client":"postgres","table":"links","columns":["hash"],"constraint":"links_hash_unique","statusCode":409},"msg":"insert into \"links\" (\"created_at\", \"creator_id\", \"hash\", \"meta\", \"original_url\", \"updated_at\") values ($1, $2, $3, $4, $5, $6) returning * - duplicate key value violates unique constraint \"links_hash_unique\""}

I also found out that when you try to create a short URL for an already existing URL (e.g. a second short URL pointing to https://google.com when one already exists), the new short link does not get created; instead, the brand link field in the creation prompt gets filled with the brand link of the existing URL, and the QR code shown is that of the already existing URL too.
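For context on the duplicate-name report: the log above shows the insert already failing as a classified UniqueViolationError carrying statusCode 409 and the offending columns, so only the translation into a friendly message is missing. A hypothetical sketch of that translation (the handler shape is assumed, not Blink's actual code):

```javascript
// Sketch: turn a unique-constraint violation into a user-friendly
// message instead of surfacing the raw SQL. The error shape mirrors
// the UniqueViolationError fields visible in the log above
// (name, columns, statusCode); everything else is illustrative.
function friendlyDbError(err) {
  if (err && err.name === 'UniqueViolationError') {
    const fields = (err.columns || []).join(', ') || 'value';
    return {
      status: err.statusCode || 409,
      message: `That ${fields} is already taken - please choose another.`,
    };
  }
  // Fall back to a generic 500 for anything unrecognized,
  // so no raw SQL ever reaches the user.
  return { status: 500, message: 'Something went wrong.' };
}
```

Dropped into an Express-style error-handling middleware, this would return "That hash is already taken" with a 409 instead of the duplicate-key SQL text.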

JaneJeon commented 1 year ago

Hey uhh, I just realized that you were using mysql as your database of choice. I only support postgres currently. I'll make that clearer in the documentation (i.e. the second issue is a "wontfix"), though the first error is still within scope.