brunoamaral opened 2 months ago
Curious if you have rebooted the container since making the changes? Are there any logs that show anything related to when it was deleted?
@dkornahrens, yes, the container was restarted a while back and the notification is still being sent. (At first I thought I hadn't actually deleted it, so I didn't check immediately.)
The only logs I found were these:
2024-08-19T07:01:41+01:00 [MONITOR] WARN: Monitor #3 'Daniel Traça': Failing: timeout of 48000ms exceeded | Interval: 60 seconds | Type: http | Down Count: 0 | Resend Interval: 0
Trace: KnexTimeoutError: Knex: Timeout acquiring a connection. The pool is probably full. Are you missing a .transacting(trx) call?
at Client_SQLite3.acquireConnection (/app/node_modules/knex/lib/client.js:312:26)
at runNextTicks (node:internal/process/task_queues:60:5)
at listOnTimeout (node:internal/timers:538:9)
at process.processTimers (node:internal/timers:512:7)
at async Runner.ensureConnection (/app/node_modules/knex/lib/execution/runner.js:287:28)
at async Runner.run (/app/node_modules/knex/lib/execution/runner.js:30:19)
at async RedBeanNode.findOne (/app/node_modules/redbean-node/dist/redbean-node.js:499:19)
at async Proxy.updateTlsInfo (/app/server/model/monitor.js:1144:27)
at async Proxy.handleTlsInfo (/app/server/model/monitor.js:1692:9)
at async TLSSocket.<anonymous> (/app/server/model/monitor.js:523:29) {
sql: undefined,
bindings: undefined
}
at process.unexpectedErrorHandler (/app/server/server.js:1905:13)
at process.emit (node:events:517:28)
at emit (node:internal/process/promises:149:20)
at processPromiseRejections (node:internal/process/promises:283:27)
at processTicksAndRejections (node:internal/process/task_queues:96:32)
at runNextTicks (node:internal/process/task_queues:64:3)
at listOnTimeout (node:internal/timers:538:9)
at process.processTimers (node:internal/timers:512:7)
If you keep encountering errors, please report to https://github.com/louislam/uptime-kuma/issues
Trace: KnexTimeoutError: Knex: Timeout acquiring a connection. The pool is probably full. Are you missing a .transacting(trx) call?
at Client_SQLite3.acquireConnection (/app/node_modules/knex/lib/client.js:312:26)
at runNextTicks (node:internal/process/task_queues:60:5)
at process.processTimers (node:internal/timers:509:9)
at async Runner.ensureConnection (/app/node_modules/knex/lib/execution/runner.js:287:28)
at async Runner.run (/app/node_modules/knex/lib/execution/runner.js:30:19)
at async RedBeanNode.normalizeRaw (/app/node_modules/redbean-node/dist/redbean-node.js:572:22)
at async Monitor.getNotificationList (/app/server/model/monitor.js:1442:32)
at async Monitor.sendNotification (/app/server/model/monitor.js:1402:38)
at async beat (/app/server/model/monitor.js:964:21)
at async Timeout.safeBeat [as _onTimeout] (/app/server/model/monitor.js:1032:17) {
sql: 'SELECT notification.* FROM notification, monitor_notification WHERE monitor_id = ? AND monitor_notification.notification_id = notification.id ',
bindings: [ 1 ]
}
at Timeout.safeBeat [as _onTimeout] (/app/server/model/monitor.js:1034:25)
at runNextTicks (node:internal/process/task_queues:60:5)
at process.processTimers (node:internal/timers:509:9)
This might be related to the performance bugs resolved by #4500.
I think I am using SQLite, because this was installed with Docker. As for retention, I checked and it's set to 180 days.
There's an option to auto-vacuum, but my DB was created after version 1.10. Should I still trigger it?
Please do so. But please also look at the DB size on disk before and after doing this.
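If it's easier to do outside the UI, something along these lines works against the mounted data volume. This is only a sketch: the container name, the host-side data path, and the sqlite3 CLI being installed on the host are all assumptions that depend on your setup.

```bash
# Assumed container name "uptime-kuma" and host-side data dir "./uptime-kuma-data";
# adjust both to match your docker run / compose configuration.

# DB size before
ls -lh ./uptime-kuma-data/kuma.db

# Stop the container so nothing writes to the DB while it's being vacuumed
docker stop uptime-kuma
sqlite3 ./uptime-kuma-data/kuma.db "VACUUM;"
docker start uptime-kuma

# DB size after
ls -lh ./uptime-kuma-data/kuma.db
```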
I have found these related issues/pull requests
Didn't find any similar issues
Security Policy
Description
I have also checked the kuma.db file and found the removed site is still listed in the monitor table.
Reproduction steps
Add a site to monitor, delete it, check the kuma.db file to make sure it's gone.
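For the database check, something like this can be used. It's a sketch: the host-side path to kuma.db is an assumption, the monitor id 3 is just an example taken from the log above, and the table and column names match the query visible in that log output.

```bash
# List all monitors still present in the DB; the deleted one should not appear
sqlite3 ./uptime-kuma-data/kuma.db "SELECT id, name FROM monitor;"

# Check whether notification links for the deleted monitor are still there
# (replace 3 with the id of the monitor you deleted)
sqlite3 ./uptime-kuma-data/kuma.db \
  "SELECT * FROM monitor_notification WHERE monitor_id = 3;"
```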
Expected behavior
Site deleted and no more notifications sent
Actual Behavior
Site isn't deleted.
Uptime-Kuma Version
1.23.12
Operating System and Arch
Ubuntu
Browser
Chrome latest
Deployment Environment
Relevant log output
No response