lightninglabs / lightning-terminal

Lightning Terminal: Your Home for Lightning Liquidity

Umbrel Lightning Terminal #593

Closed: WrogbeNepe closed this issue 7 months ago

WrogbeNepe commented 1 year ago

Hello, I'm reaching out because the Lightning Terminal app in Umbrel keeps crashing every 1-2 weeks with the error message 'ECONNREFUSED'. The app only starts working again after a re-download. Is there a way to fix this issue?
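(For anyone triaging before the next crash: ECONNREFUSED in the UI just means nothing accepted a TCP connection on the port the app proxy points at, so a quick port probe can tell you whether litd itself died or only its backend connection did, without re-downloading anything. A minimal sketch; the host/port in the comment are placeholders, substitute your node's actual addresses.)

```python
import socket

def tcp_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds, False otherwise."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder address, not necessarily your Umbrel's):
# tcp_reachable("10.21.21.9", 10009)  # lnd's gRPC port
```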

levmi commented 1 year ago

Do you know what version of Lightning Terminal you’re running? Is it the most up-to-date release?

WrogbeNepe commented 1 year ago

I'm using version 0.10.1-alpha

levmi commented 1 year ago

Can you provide any logs from either Umbrel or litd itself?

WrogbeNepe commented 1 year ago

=====================
= Umbrel debug info =
=====================

Umbrel version

0.5.4

Flashed OS version

v0.4.11

Raspberry Pi Model

Revision : c03114
Serial   : 100000003972ee85
Model    : Raspberry Pi 4 Model B Rev 1.4

Firmware

Dec 1 2021 15:01:54 Copyright (c) 2012 Broadcom version 71bd3109023a0c8575585ba87cbb374d2eeb038f (clean) (release) (start)

Temperature

temp=56.9'C

Throttling

throttled=0x0

Memory usage

          total        used        free      shared  buff/cache   available

Mem:           3.8G        2.9G         80M        3.0M        793M        795M
Swap:          4.1G        2.0G        2.0G

total: 77.0%
mempool: 23.6%
bitcoin: 19%
lightning: 17.7%
electrs: 7.4%
sphinx-relay: 4.7%
snowflake: 1.1%
ride-the-lightning: 1.1%
system: 0.9%
lightning-terminal: 0.9%
robosats: 0.6%

Memory monitor logs

2023-07-08 09:06:45 Warning memory usage at 91%
2023-07-08 11:53:50 Warning memory usage at 91%
2023-07-09 05:17:13 Warning memory usage at 91%
2023-07-09 06:20:14 Warning memory usage at 91%
2023-07-09 11:19:20 Warning memory usage at 91%
2023-07-09 11:28:20 Warning memory usage at 91%
2023-07-09 12:18:21 Warning memory usage at 91%
2023-07-09 13:30:22 Warning memory usage at 91%
2023-07-10 14:58:47 Warning memory usage at 91%
2023-07-10 21:18:53 Warning memory usage at 91%
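(The memory-monitor lines above are uniform enough to parse and correlate against the times litd went down; here is a throwaway sketch, with the format copied from the entries above and a shortened sample inlined.)

```python
from datetime import datetime

# Shortened sample in the memory-monitor format shown above.
log = """\
2023-07-08 09:06:45 Warning memory usage at 91%
2023-07-09 05:17:13 Warning memory usage at 91%
2023-07-10 21:18:53 Warning memory usage at 91%"""

def parse_warnings(text):
    """Yield (timestamp, percent) from 'YYYY-MM-DD HH:MM:SS Warning memory usage at NN%' lines."""
    for line in text.splitlines():
        date, time, *_, pct = line.split()
        yield datetime.fromisoformat(f"{date} {time}"), int(pct.rstrip("%"))

for ts, pct in parse_warnings(log):
    print(ts.isoformat(sep=" "), pct)
```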

Filesystem information

Filesystem      Size  Used Avail Use% Mounted on
/dev/root        30G  3.6G   25G  13% /
/dev/sda1       916G  632G  238G  73% /home/umbrel/umbrel

Startup service logs

Jun 07 13:18:22 umbrel umbrel startup[2691]: Creating electrs_electrs_1 ... done
Jun 07 13:18:22 umbrel umbrel startup[2691]: Creating electrs_app_1 ...
Jun 07 13:18:22 umbrel umbrel startup[2691]: Creating mempool_mariadb_1 ... done
Jun 07 13:18:23 umbrel umbrel startup[2691]: Creating bitcoin_i2pd_daemon_1 ... done
Jun 07 13:18:24 umbrel umbrel startup[2691]: Creating electrs_app_proxy_1 ... done
Jun 07 13:18:24 umbrel umbrel startup[2691]: Creating ride-the-lightning_web_1 ... done
Jun 07 13:18:24 umbrel umbrel startup[2691]: Creating ride-the-lightning_tor_server_1 ... done
Jun 07 13:18:25 umbrel umbrel startup[2691]: Creating bitcoin_tor_1 ... done
Jun 07 13:18:25 umbrel umbrel startup[2691]: Creating mempool_tor_server_1 ... done
Jun 07 13:18:25 umbrel umbrel startup[2691]: Creating lightning-terminal_app_proxy_1 ... done
Jun 07 13:18:26 umbrel umbrel startup[2691]: Creating ride-the-lightning_boltz_1 ... done
Jun 07 13:18:26 umbrel umbrel startup[2691]: Creating lightning_tor_1 ... done
Jun 07 13:18:26 umbrel umbrel startup[2691]: Creating lightning_app_1 ... done
Jun 07 13:18:26 umbrel umbrel startup[2691]: Creating ride-the-lightning_app_proxy_1 ... done
Jun 07 13:18:26 umbrel umbrel startup[2691]: Creating bitcoin_app_proxy_1 ... done
Jun 07 13:18:26 umbrel umbrel startup[2691]: Creating mempool_app_proxy_1 ... done
Jun 07 13:18:26 umbrel umbrel startup[2691]: Creating lightning_tor_server_1 ... done
Jun 07 13:18:27 umbrel umbrel startup[2691]: Creating lightning_app_proxy_1 ... done
Jun 07 13:18:27 umbrel umbrel startup[2691]: Creating bitcoin_bitcoind_1 ... done
Jun 07 13:18:27 umbrel umbrel startup[2691]: Creating bitcoin_server_1 ...
Jun 07 13:18:29 umbrel umbrel startup[2691]: Creating lightning_lnd_1 ... done
Jun 07 13:18:30 umbrel umbrel startup[2691]: Creating electrs_app_1 ... done
Jun 07 13:18:31 umbrel umbrel startup[2691]: Creating bitcoin_server_1 ... done
Jun 07 13:18:32 umbrel umbrel startup[2691]: Umbrel is now accessible at
Jun 07 13:18:32 umbrel umbrel startup[2691]: http://umbrel.local
Jun 07 13:18:32 umbrel umbrel startup[2691]: http://192.168.1.185
Jun 07 13:18:32 umbrel systemd[1]: Started Umbrel Startup Service.
Jun 20 01:23:40 umbrel passwd[1834]: pam_unix(passwd:chauthtok): password changed for umbrel
Jun 22 02:09:24 umbrel passwd[22852]: pam_unix(passwd:chauthtok): password changed for umbrel

External storage service logs

Jun 07 13:17:08 umbrel external storage mounter[495]: Waiting for USB devices...
Jun 07 13:17:09 umbrel external storage mounter[495]: Checking if the device is ext4...
Jun 07 13:17:09 umbrel external storage mounter[495]: Yes, it is ext4
Jun 07 13:17:09 umbrel external storage mounter[495]: Checking filesystem for corruption...
Jun 07 13:17:09 umbrel external storage mounter[495]: e2fsck 1.44.5 (15-Dec-2018)
Jun 07 13:17:10 umbrel external storage mounter[495]: umbrel: recovering journal
Jun 07 13:17:13 umbrel external storage mounter[495]: Clearing orphaned inode 15731536 (uid=1000, gid=1000, mode=0100600, size=0)
Jun 07 13:17:13 umbrel external storage mounter[495]: Clearing orphaned inode 15731404 (uid=1000, gid=1000, mode=0100600, size=0)
Jun 07 13:17:13 umbrel external storage mounter[495]: Clearing orphaned inode 15731424 (uid=1000, gid=1000, mode=0100600, size=0)
Jun 07 13:17:13 umbrel external storage mounter[495]: Clearing orphaned inode 15731423 (uid=1000, gid=1000, mode=0100600, size=0)
Jun 07 13:17:13 umbrel external storage mounter[495]: Setting free inodes count to 60681806 (was 60667827)
Jun 07 13:17:13 umbrel external storage mounter[495]: Setting free blocks count to 77261519 (was 80712223)
Jun 07 13:17:13 umbrel external storage mounter[495]: umbrel: clean, 373170/61054976 files, 166928689/244190208 blocks
Jun 07 13:17:13 umbrel external storage mounter[495]: Mounting partition...
Jun 07 13:17:13 umbrel external storage mounter[495]: Checking if device contains an Umbrel install...
Jun 07 13:17:13 umbrel external storage mounter[495]: Yes, it contains an Umbrel install
Jun 07 13:17:13 umbrel external storage mounter[495]: Bind mounting external storage over local Umbrel installation...
Jun 07 13:17:13 umbrel external storage mounter[495]: Bind mounting external storage over local Docker data dir...
Jun 07 13:17:13 umbrel external storage mounter[495]: Bind mounting external storage to /swap
Jun 07 13:17:13 umbrel external storage mounter[495]: Bind mounting SD card root at /sd-card...
Jun 07 13:17:13 umbrel external storage mounter[495]: Checking Umbrel root is now on external storage...
Jun 07 13:17:14 umbrel external storage mounter[495]: Checking /var/lib/docker is now on external storage...
Jun 07 13:17:14 umbrel external storage mounter[495]: Checking /swap is now on external storage...
Jun 07 13:17:14 umbrel external storage mounter[495]: Setting up swapfile
Jun 07 13:17:15 umbrel external storage mounter[495]: Setting up swapspace version 1, size = 4 GiB (4294963200 bytes)
Jun 07 13:17:15 umbrel external storage mounter[495]: no label, UUID=c827d793-5324-4a6b-8672-db07e75a1256
Jun 07 13:17:15 umbrel external storage mounter[495]: Checking SD Card root is bind mounted at /sd-root...
Jun 07 13:17:15 umbrel external storage mounter[495]: Starting external drive mount monitor...
Jun 07 13:17:15 umbrel external storage mounter[495]: Mount script completed successfully!
Jun 07 13:17:15 umbrel systemd[1]: Started External Storage Mounter.

External storage SD card update service logs

-- Logs begin at Wed 2023-06-07 13:17:01 UTC, end at Tue 2023-07-11 04:43:08 UTC. --
Jun 07 13:17:40 umbrel systemd[1]: Starting External Storage SDcard Updater...
Jun 07 13:17:40 umbrel external storage updater[2613]: Checking if SD card Umbrel is newer than external storage...
Jun 07 13:17:40 umbrel external storage updater[2613]: No, SD version is not newer, exiting.
Jun 07 13:17:40 umbrel systemd[1]: Started External Storage SDcard Updater.

Karen logs

Pulling lnd ... extracting (34.5%)
Pulling lnd ... extracting (41.6%)
Pulling lnd ... extracting (46.4%)
Pulling lnd ... extracting (52.3%)
Pulling lnd ... extracting (57.1%)
Pulling lnd ... extracting (65.4%)
Pulling lnd ... extracting (69.0%)
Pulling lnd ... extracting (70.2%)
Pulling lnd ... extracting (76.1%)
Pulling lnd ... extracting (83.2%)
Pulling lnd ... extracting (91.6%)
Pulling lnd ... extracting (96.3%)
Pulling lnd ... extracting (100.0%)
Pulling lnd ... pull complete
Pulling lnd ... extracting (100.0%)
Pulling lnd ... extracting (100.0%)
Pulling lnd ... pull complete
Pulling lnd ... extracting (72.7%)
Pulling lnd ... extracting (100.0%)
Pulling lnd ... pull complete
Pulling lnd ... extracting (100.0%)
Pulling lnd ... extracting (100.0%)
Pulling lnd ... pull complete
Pulling lnd ... digest: sha256:a6e99f1a2a790ea630...
Pulling lnd ... status: downloaded newer image fo...
Pulling lnd ... done
Starting app lightning...
Executing hook: /home/umbrel/umbrel/app-data/lightning/hooks/pre-start
Creating lightning_tor_server_1 ...
Creating lightning_app_proxy_1 ...
Creating lightning_app_1 ...
Creating lightning_tor_1 ...
Creating lightning_lnd_1 ...
Creating lightning_lnd_1 ... done
Creating lightning_tor_server_1 ... done
Creating lightning_tor_1 ... done
Creating lightning_app_proxy_1 ... done
Creating lightning_app_1 ... done
Untagged: lightninglabs/lnd:v0.16.3-beta@sha256:e0bc0b0e62ec722e66ba835a2871c3c4c0655621d8b441b692d736bc50e6150a
Deleted: sha256:0f41da161c242697fb2c38e5215d828633ba723a8764be4150dbd50a429286b5
Deleted: sha256:8ef6e00ac0c52579772c1df0a330ebf5752413b6fed378c1079b966c6dbbd349
Deleted: sha256:d8e2699b57d00fc70d3c6b28ab80e5c4dbfd313834ad8ef76664f1bc256a9da4
Deleted: sha256:8e8866c3d0b7ec36a0fa4e676067425ad45fdd945426d97e40a0052f4744a253
Deleted: sha256:029daedcb5b7315188d17390abc765f14a4a6635fd89e2195dc19f6c46acc9a7
Deleted: sha256:a72c225ef1fb1bdfa6f2129d9c705e6bcae21c14bfde5288f03259e03a99d1ee
Deleted: sha256:30a72f56bc376a9c359450fe6524b8c5157cda4c922c16fcb44cf7a5ccbbcb20
Error response from daemon: conflict: unable to remove repository reference "getumbrel/umbrel-lightning:v1.1.2@sha256:3ab819f6335abebb160d2c7b4bd979ad706e25208d219beb9f82d4c1dadd3eff" (must force) - container 3c5c81db7e50 is using its referenced image 1213221bf138
Error response from daemon: conflict: unable to remove repository reference "getumbrel/tor:0.4.7.8@sha256:2ace83f22501f58857fa9b403009f595137fa2e7986c4fda79d82a8119072b6a" (must force) - container 4d90c0cc430f is using its referenced image 105438dd043f
Got signal: debug
karen is getting triggered!

Docker containers

NAMES                               STATUS
lightning_lnd_1                     Up About a minute
lightning_app_1                     Up About a minute
lightning_app_proxy_1               Up About a minute
lightning_tor_1                     Up About a minute
lightning_tor_server_1              Up About a minute
lightning-terminal_web_1            Up 2 days
lightning-terminal_tor_server_1     Up 2 days
lightning-terminal_app_proxy_1      Up 2 days
electrs_app_1                       Up 2 days
electrs_tor_1                       Up 2 days
electrs_electrs_1                   Up 2 days
electrs_app_proxy_1                 Up 2 days
electrs_tor_server_1                Up 2 days
dashboard                           Up 2 weeks
manager                             Up 2 weeks
sphinx-relay_app_proxy_1            Up 3 weeks
sphinx-relay_tor_server_1           Up 3 weeks
sphinx-relay_server_1               Up 54 seconds
bitcoin_server_1                    Up 4 weeks
bitcoin_bitcoind_1                  Up 4 weeks
bitcoin_i2pd_daemon_1               Up 5 days
bitcoin_app_proxy_1                 Up 4 weeks
bitcoin_tor_server_1                Up 4 weeks
bitcoin_tor_1                       Up 4 weeks
robosats_tor_server_1               Up 4 weeks
robosats_app_proxy_1                Up 4 weeks
robosats_web_1                      Up 4 weeks (unhealthy)
ride-the-lightning_boltz_1          Up 4 weeks
ride-the-lightning_tor_server_1     Up 4 weeks
ride-the-lightning_web_1            Up 4 weeks
ride-the-lightning_app_proxy_1      Up 4 weeks
mempool_web_1                       Up 4 weeks
mempool_api_1                       Up 4 weeks
mempool_tor_server_1                Up 4 weeks
mempool_mariadb_1                   Up 4 weeks
mempool_app_proxy_1                 Up 4 weeks
snowflake_app_proxy_1               Up 4 weeks
snowflake_proxy_1                   Up 4 weeks
snowflake_web_1                     Up 4 weeks
snowflake_tor_server_1              Up 4 weeks
nginx                               Up 4 weeks
tor_server                          Up 4 weeks
auth                                Up 4 weeks
tor_proxy                           Up 4 weeks

Umbrel logs

Attaching to manager
manager | ::ffff:10.21.21.2 - - [Tue, 11 Jul 2023 04:43:55 GMT] "GET /v1/system/memory HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 11 Jul 2023 04:43:55 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 11 Jul 2023 04:43:55 GMT] "GET /v1/system/storage HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 11 Jul 2023 04:43:55 GMT] "GET /v1/system/get-update HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 11 Jul 2023 04:43:56 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 11 Jul 2023 04:43:57 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 11 Jul 2023 04:43:58 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 11 Jul 2023 04:43:59 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 11 Jul 2023 04:44:01 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 11 Jul 2023 04:44:03 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager

Tor Proxy logs

Attaching to tor_proxy
tor_proxy | Jul 10 21:16:47.000 [notice] Heartbeat: Tor's uptime is 33 days 5:55 hours, with 12 circuits open. I've sent 1.13 GB and received 5.97 GB. I've received 1260 connections on IPv4 and 0 on IPv6. I've made 1111 connections with IPv4 and 0 with IPv6.
tor_proxy | Jul 10 21:16:47.000 [notice] While bootstrapping, fetched this many bytes: 634165 (consensus network-status fetch); 11092 (authority cert fetch); 8452798 (microdescriptor fetch)
tor_proxy | Jul 10 21:16:47.000 [notice] While not bootstrapping, fetched this many bytes: 21448850 (consensus network-status fetch); 64939 (authority cert fetch); 30335888 (microdescriptor fetch)
tor_proxy | Jul 10 21:16:47.000 [notice] Average packaged cell fullness: 43.544%. TLS write overhead: 3%
tor_proxy | Jul 11 03:16:47.000 [notice] Heartbeat: Tor's uptime is 33 days 11:55 hours, with 12 circuits open. I've sent 1.13 GB and received 6.02 GB. I've received 1261 connections on IPv4 and 0 on IPv6. I've made 1113 connections with IPv4 and 0 with IPv6.
tor_proxy | Jul 11 03:16:47.000 [notice] While bootstrapping, fetched this many bytes: 634165 (consensus network-status fetch); 11092 (authority cert fetch); 8452798 (microdescriptor fetch)
tor_proxy | Jul 11 03:16:47.000 [notice] While not bootstrapping, fetched this many bytes: 21580678 (consensus network-status fetch); 64939 (authority cert fetch); 30411199 (microdescriptor fetch)
tor_proxy | Jul 11 03:16:47.000 [notice] Average packaged cell fullness: 43.562%. TLS write overhead: 3%

App logs

bitcoin

Attaching to bitcoin_server_1, bitcoin_bitcoind_1, bitcoin_i2pd_daemon_1, bitcoin_app_proxy_1, bitcoin_tor_server_1, bitcoin_tor_1
app_proxy_1 | yarn run v1.22.19
app_proxy_1 | $ node ./bin/www
app_proxy_1 | [HPM] Proxy created: / -> http://10.21.22.2:3005
app_proxy_1 | Waiting for 10.21.22.2:3005 to open...
app_proxy_1 | Bitcoin Node is now ready...
app_proxy_1 | Listening on port: 2100
server_1 | yarn run v1.22.18
server_1 | $ node ./bin/www
server_1 | Sat, 10 Jun 2023 22:42:57 GMT morgan deprecated morgan(options): use morgan("default", options) instead at app.js:33:9
server_1 | Sat, 10 Jun 2023 22:42:57 GMT morgan deprecated default format: use combined format at app.js:33:9
server_1 | Listening on port 3005
bitcoind_1 | 2023-07-11T04:21:29Z UpdateTip: new best=0000000000000000000018d0570ded5f2832bc95ee59c57c25aaf4854bb53d3c height=798211 version=0x2cacc000 log2_work=94.293697 tx=863145223 date='2023-07-11T04:21:06Z' progress=1.000000 cache=63.8MiB(310598txo)
bitcoind_1 | 2023-07-11T04:27:12Z Saw new header hash=00000000000000000001757bca401076cda1c8bcf3c1cc7917af958bb394f583 height=798212
bitcoind_1 | 2023-07-11T04:27:12Z [net] Saw new cmpctblock header hash=00000000000000000001757bca401076cda1c8bcf3c1cc7917af958bb394f583 peer=8659
bitcoind_1 | 2023-07-11T04:27:13Z UpdateTip: new best=00000000000000000001757bca401076cda1c8bcf3c1cc7917af958bb394f583 height=798212 version=0x20fd4000 log2_work=94.293710 tx=863149273 date='2023-07-11T04:26:58Z' progress=1.000000 cache=64.5MiB(315128txo)
bitcoind_1 | 2023-07-11T04:28:46Z New outbound peer connected: version: 70016, blocks=798212, peer=9242 (block-relay-only)
bitcoind_1 | 2023-07-11T04:32:03Z Socks5() connect to 185.82.200.131:8333 failed: general failure
bitcoind_1 | 2023-07-11T04:38:06Z Socks5() connect to 170.75.170.252:39388 failed: connection refused
bitcoind_1 | 2023-07-11T04:40:41Z New outbound peer connected: version: 70016, blocks=798212, peer=9249 (block-relay-only)
i2pd_daemon_1 | 04:25:04@976/error - Tunnel: Tunnel with id 2972803422 already exists
i2pd_daemon_1 | 04:25:24@802/error - SSU2: RelayIntro unknown router to introduce
i2pd_daemon_1 | 04:25:38@259/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
i2pd_daemon_1 | 04:25:43@728/error - Destination: Can't send LeaseSet request, no outbound tunnels found
i2pd_daemon_1 | 04:25:43@728/error - Destination: Can't publish LeaseSet. No outbound tunnels
i2pd_daemon_1 | 04:27:14@259/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
i2pd_daemon_1 | 04:27:43@728/error - Destination: Can't publish LeaseSet. Destination is not ready
i2pd_daemon_1 | 04:37:23@728/error - Garlic: Failed to decrypt message
i2pd_daemon_1 | 04:38:13@728/error - Destination: Can't publish LeaseSet. No outbound tunnels
i2pd_daemon_1 | 04:38:55@728/error - Garlic: Failed to decrypt message
tor_1 | Jul 11 04:02:42.000 [notice] Have tried resolving or connecting to address '[scrubbed]' at 3 different places. Giving up.
tor_1 | Jul 11 04:32:03.000 [notice] Have tried resolving or connecting to address '[scrubbed]' at 3 different places. Giving up.
tor_1 | Jul 11 04:42:44.000 [notice] Heartbeat: Tor's uptime is 30 days 5:54 hours, with 42 circuits open. I've sent 12.07 GB and received 24.45 GB. I've received 30464 connections on IPv4 and 0 on IPv6. I've made 214 connections with IPv4 and 0 with IPv6.
tor_1 | Jul 11 04:42:44.000 [notice] While bootstrapping, fetched this many bytes: 632034 (consensus network-status fetch); 14357 (authority cert fetch); 8476580 (microdescriptor fetch)
tor_1 | Jul 11 04:42:44.000 [notice] While not bootstrapping, fetched this many bytes: 20133545 (consensus network-status fetch); 74551 (authority cert fetch); 28443353 (microdescriptor fetch)
tor_1 | Jul 11 04:42:44.000 [notice] Average packaged cell fullness: 47.690%. TLS write overhead: 3%

electrs

Attaching to electrs_app_1, electrs_tor_1, electrs_electrs_1, electrs_app_proxy_1, electrs_tor_server_1
app_1 | > umbrel-electrs@1.0.1 dev:backend
app_1 | > npm run start -w umbrel-electrs-backend
app_1 |
app_1 |
app_1 | > umbrel-electrs-backend@0.1.12 start
app_1 | > node ./bin/www
app_1 |
app_1 | Sun, 09 Jul 2023 04:22:51 GMT morgan deprecated morgan(options): use morgan("default", options) instead at app.js:28:9
app_1 | Sun, 09 Jul 2023 04:22:51 GMT morgan deprecated default format: use combined format at app.js:28:9
app_1 | Listening on port 3006
app_proxy_1 | yarn run v1.22.19
app_proxy_1 | $ node ./bin/www
app_proxy_1 | [HPM] Proxy created: / -> http://10.21.22.4:3006
app_proxy_1 | Waiting for 10.21.22.4:3006 to open...
app_proxy_1 | Electrs is now ready...
app_proxy_1 | Listening on port: 2102
electrs_1 | [2023-07-11T04:15:43.298Z INFO electrs::index] indexing 1 blocks: [798208..798208]
electrs_1 | [2023-07-11T04:15:45.022Z INFO electrs::chain] chain updated: tip=00000000000000000001f69af4ecaf4a9ac22631ccfaa7cda0fd52fdce873c91, height=798208
electrs_1 | [2023-07-11T04:18:53.552Z INFO electrs::index] indexing 1 blocks: [798209..798209]
electrs_1 | [2023-07-11T04:18:53.869Z INFO electrs::chain] chain updated: tip=000000000000000000002461b6330af6269e52fad8143a9872afe2bae302c427, height=798209
electrs_1 | [2023-07-11T04:19:43.279Z INFO electrs::index] indexing 1 blocks: [798210..798210]
electrs_1 | [2023-07-11T04:19:43.711Z INFO electrs::chain] chain updated: tip=0000000000000000000332bba47bcdba3890c7903b6b8a652c8dbee008763427, height=798210
electrs_1 | [2023-07-11T04:21:30.366Z INFO electrs::index] indexing 1 blocks: [798211..798211]
electrs_1 | [2023-07-11T04:21:30.808Z INFO electrs::chain] chain updated: tip=0000000000000000000018d0570ded5f2832bc95ee59c57c25aaf4854bb53d3c, height=798211
electrs_1 | [2023-07-11T04:27:24.108Z INFO electrs::index] indexing 1 blocks: [798212..798212]
electrs_1 | [2023-07-11T04:27:24.285Z INFO electrs::chain] chain updated: tip=00000000000000000001757bca401076cda1c8bcf3c1cc7917af958bb394f583, height=798212
tor_1 | Jul 10 22:22:37.000 [notice] Heartbeat: Tor's uptime is 1 day 18:00 hours, with 9 circuits open. I've sent 22.11 MB and received 34.84 MB. I've received 0 connections on IPv4 and 0 on IPv6. I've made 24 connections with IPv4 and 0 with IPv6.
tor_1 | Jul 10 22:22:37.000 [notice] While bootstrapping, fetched this many bytes: 638800 (consensus network-status fetch); 14103 (authority cert fetch); 8739384 (microdescriptor fetch)
tor_1 | Jul 10 22:22:37.000 [notice] While not bootstrapping, fetched this many bytes: 1096479 (consensus network-status fetch); 46150 (authority cert fetch); 2405711 (microdescriptor fetch)
tor_1 | Jul 10 23:15:27.000 [notice] No circuits are opened. Relaxed timeout for circuit 1533 (a Hidden service: Uploading HS descriptor 4-hop circuit in state doing handshakes with channel state open) to 60000ms. However, it appears the circuit has timed out anyway. [1 similar message(s) suppressed in last 26100 seconds]
tor_1 | Jul 11 00:50:51.000 [notice] No circuits are opened. Relaxed timeout for circuit 1576 (a Measuring circuit timeout 4-hop circuit in state doing handshakes with channel state open) to 60000ms. However, it appears the circuit has timed out anyway. [1 similar message(s) suppressed in last 5760 seconds]
tor_1 | Jul 11 04:22:37.000 [notice] Heartbeat: Tor's uptime is 2 days 0:00 hours, with 9 circuits open. I've sent 25.21 MB and received 38.19 MB. I've received 0 connections on IPv4 and 0 on IPv6. I've made 25 connections with IPv4 and 0 with IPv6.
tor_1 | Jul 11 04:22:37.000 [notice] While bootstrapping, fetched this many bytes: 638800 (consensus network-status fetch); 14103 (authority cert fetch); 8739384 (microdescriptor fetch)
tor_1 | Jul 11 04:22:37.000 [notice] While not bootstrapping, fetched this many bytes: 1252460 (consensus network-status fetch); 53250 (authority cert fetch); 2474976 (microdescriptor fetch)

lightning

Attaching to lightning_lnd_1, lightning_app_1, lightning_app_proxy_1, lightning_tor_1, lightning_tor_server_1
app_1 | Checking LND status...
app_1 | Waiting for LND...
app_1 | Checking LND status...
app_1 | Waiting for LND...
app_1 | Checking LND status...
app_1 | Waiting for LND...
app_1 | Checking LND status...
app_1 | LND ready!
app_1 | Attempting to unlock wallet...
app_1 | Wallet unlocked!
tor_1 | Jul 11 04:43:12.000 [notice] Bootstrapped 55% (loading_descriptors): Loading relay descriptors
tor_1 | Jul 11 04:43:12.000 [notice] Bootstrapped 60% (loading_descriptors): Loading relay descriptors
tor_1 | Jul 11 04:43:13.000 [notice] Bootstrapped 69% (loading_descriptors): Loading relay descriptors
tor_1 | Jul 11 04:43:13.000 [notice] Bootstrapped 75% (enough_dirinfo): Loaded enough directory info to build circuits
tor_1 | Jul 11 04:43:13.000 [notice] Bootstrapped 80% (ap_conn): Connecting to a relay to build circuits
tor_1 | Jul 11 04:43:13.000 [notice] Bootstrapped 85% (ap_conn_done): Connected to a relay to build circuits
tor_1 | Jul 11 04:43:14.000 [notice] Bootstrapped 89% (ap_handshake): Finishing handshake with a relay to build circuits
lnd_1 | 2023-07-11 04:44:12.247 [INF] HSWC: Payment circuits loaded: num_pending=1, num_open=1
lnd_1 | 2023-07-11 04:44:12.266 [INF] HSWC: Trimming open circuits for chan_id=728773:7:1, start_htlc_id=7827
lnd_1 | 2023-07-11 04:44:12.266 [INF] HSWC: Trimming open circuits for chan_id=726641:1556:1, start_htlc_id=13796
lnd_1 | 2023-07-11 04:44:12.266 [INF] HSWC: Trimming open circuits for chan_id=721975:1154:1, start_htlc_id=7852
lnd_1 | 2023-07-11 04:44:12.618 [WRN] CRTR: Routing failure for local channel 799225106662817793 occurred
lnd_1 | 2023-07-11 04:44:12.618 [WRN] CRTR: Routing failure for local channel 799225106662817793 occurred
lnd_1 | 2023-07-11 04:44:12.642 [INF] LTND: Channel backup proxy channel notifier starting
lnd_1 | 2023-07-11 04:44:12.644 [INF] ATPL: Instantiating autopilot with active=false, max_channels=5, allocation=0.600000, min_chan_size=20000, max_chan_size=16777215, private=false, min_confs=1, conf_target=3
lnd_1 | 2023-07-11 04:44:12.655 [INF] LTND: We're not running within systemd or the service type is not 'notify'
tor_1 | Jul 11 04:43:14.000 [notice] Bootstrapped 90% (ap_handshake_done): Handshake finished with a relay to build circuits
tor_1 | Jul 11 04:43:14.000 [notice] Bootstrapped 95% (circuit_create): Establishing a Tor circuit
tor_1 | Jul 11 04:43:15.000 [notice] Bootstrapped 100% (done): Done
app_proxy_1 | yarn run v1.22.19
app_proxy_1 | $ node ./bin/www
app_proxy_1 | [HPM] Proxy created: / -> http://10.21.22.3:3006
app_proxy_1 | Waiting for 10.21.22.3:3006 to open...
app_proxy_1 | Lightning Node is now ready...
app_proxy_1 | Listening on port: 2101
lnd_1 | 2023-07-11 04:44:12.671 [INF] LTND: Waiting for chain backend to finish sync, start_height=798212

lightning-terminal

Attaching to lightning-terminal_web_1, lightning-terminal_tor_server_1, lightning-terminal_app_proxy_1
web_1 | 2023-07-09 13:43:03.499 [ERR] LITD: Error while stopping litd: RPC middleware receive failed: rpc error: code = Unavailable desc = error reading from server: EOF
web_1 | 2023-07-09 13:43:03.501 [ERR] LITD: Error shutting down: RPC middleware receive failed: rpc error: code = Unavailable desc = error reading from server: EOF
web_1 | 2023-07-09 13:43:04.441 [WRN] GRPC: [core] grpc: addrConn.createTransport failed to connect to {10.21.21.9:10009 10.21.21.9:10009 0 }. Err: connection error: desc = "transport: Error while dialing dial tcp 10.21.21.9:10009: connect: connection refused". Reconnecting...
web_1 | 2023-07-09 13:43:04.441 [WRN] GRPC: [core] grpc: addrConn.createTransport failed to connect to {10.21.21.9:10009 10.21.21.9:10009 0 }. Err: connection error: desc = "transport: Error while dialing dial tcp 10.21.21.9:10009: connect: connection refused". Reconnecting...
web_1 | 2023-07-11 04:42:30.864 [WRN] GRPC: [core] grpc: addrConn.createTransport failed to connect to {10.21.21.9:10009 10.21.21.9:10009 0 }. Err: connection error: desc = "transport: Error while dialing dial tcp 10.21.21.9:10009: connect: connection refused". Reconnecting...
web_1 | 2023-07-11 04:42:30.863 [WRN] GRPC: [core] grpc: addrConn.createTransport failed to connect to {10.21.21.9:10009 10.21.21.9:10009 0 }. Err: connection error: desc = "transport: Error while dialing dial tcp 10.21.21.9:10009: connect: connection refused". Reconnecting...
web_1 | 2023-07-11 04:42:31.878 [WRN] GRPC: [core] grpc: addrConn.createTransport failed to connect to {10.21.21.9:10009 10.21.21.9:10009 0 }. Err: connection error: desc = "transport: Error while dialing dial tcp 10.21.21.9:10009: connect: connection refused". Reconnecting...
web_1 | 2023-07-11 04:42:31.882 [WRN] GRPC: [core] grpc: addrConn.createTransport failed to connect to {10.21.21.9:10009 10.21.21.9:10009 0 }. Err: connection error: desc = "transport: Error while dialing dial tcp 10.21.21.9:10009: connect: connection refused". Reconnecting...
web_1 | 2023-07-11 04:42:53.336 [WRN] GRPC: [core] grpc: addrConn.createTransport failed to connect to {10.21.21.9:10009 10.21.21.9:10009 0 }. Err: connection error: desc = "transport: Error while dialing dial tcp 10.21.21.9:10009: i/o timeout". Reconnecting...
web_1 | 2023-07-11 04:42:53.789 [WRN] GRPC: [core] grpc: addrConn.createTransport failed to connect to {10.21.21.9:10009 10.21.21.9:10009 0 }. Err: connection error: desc = "transport: Error while dialing dial tcp 10.21.21.9:10009: i/o timeout". Reconnecting...
app_proxy_1 | Validating token: 0dd5f160c71a ...
app_proxy_1 | Validating token: 0dd5f160c71a ...
app_proxy_1 | Validating token: 0dd5f160c71a ...
app_proxy_1 | Validating token: 0dd5f160c71a ...
app_proxy_1 | Validating token: 0dd5f160c71a ...
app_proxy_1 | Validating token: 0dd5f160c71a ...
app_proxy_1 | Validating token: 0dd5f160c71a ...
app_proxy_1 | [HPM] Client disconnected
app_proxy_1 | [HPM] Client disconnected
app_proxy_1 | [HPM] Client disconnected
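(The web_1 lines above contain two distinct failure modes when litd dials lnd at 10.21.21.9:10009: `connection refused`, meaning nothing is listening because lnd is down or restarting, and `i/o timeout`, meaning the host isn't answering at all, which would fit memory pressure. A quick way to tally the two kinds when skimming a longer log; this is purely an illustrative sketch with sample fragments inlined, not part of litd.)

```python
import re
from collections import Counter

# Dial-failure fragments in the form they appear in the web_1 log lines above.
lines = [
    'transport: Error while dialing dial tcp 10.21.21.9:10009: connect: connection refused',
    'transport: Error while dialing dial tcp 10.21.21.9:10009: connect: connection refused',
    'transport: Error while dialing dial tcp 10.21.21.9:10009: i/o timeout',
]

# "connection refused": port closed (lnd not running); "i/o timeout": host unresponsive.
pattern = re.compile(r'dial tcp [\d.:]+ (?:connect: )?(connection refused|i/o timeout)')

kinds = Counter(m.group(1) for line in lines if (m := pattern.search(line)))
print(kinds)  # Counter({'connection refused': 2, 'i/o timeout': 1})
```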

mempool

Attaching to mempool_web_1, mempool_api_1, mempool_tor_server_1, mempool_mariadb_1, mempool_app_proxy_1
api_1 | Jul 9 04:22:33 [110] INFO: Disconnected from Electrum Server at 10.21.21.10:50001
api_1 | Jul 9 04:22:36 [110] INFO: Disconnected from Electrum Server at 10.21.21.10:50001
api_1 | Jul 9 04:22:36 [110] ERR: Electrum error: {"errno":-113,"code":"EHOSTUNREACH","syscall":"connect","address":"10.21.21.10","port":50001}
api_1 | Jul 9 04:22:36 [110] ERR: Electrum error: {"errno":-113,"code":"EHOSTUNREACH","syscall":"connect","address":"10.21.21.10","port":50001}
api_1 | Jul 9 04:22:56 [110] INFO: Connected to Electrum Server at 10.21.21.10:50001 (["electrs/0.9.14","1.4"])
api_1 | Jul 9 04:32:44 [110] INFO: Running forensics scans
api_1 | Jul 9 16:32:44 [110] INFO: Running forensics scans
api_1 | Jul 10 04:32:44 [110] INFO: Running forensics scans
api_1 | Jul 10 16:32:44 [110] INFO: Running forensics scans
api_1 | Jul 11 04:32:44 [110] INFO: Running forensics scans
app_proxy_1 | [HPM] Proxy created: / -> http://10.21.21.26:3006
app_proxy_1 | Waiting for 10.21.21.26:3006 to open...
app_proxy_1 | mempool is now ready...
app_proxy_1 | Listening on port: 3006
app_proxy_1 | [HPM] Upgrading to WebSocket
app_proxy_1 | [HPM] Client disconnected
app_proxy_1 | [HPM] Upgrading to WebSocket
app_proxy_1 | [HPM] Client disconnected
app_proxy_1 | [HPM] Upgrading to WebSocket
app_proxy_1 | [HPM] Client disconnected
mariadb_1 | 2023-06-07 15:19:49 9 [Warning] Aborted connection 9 to db: 'mempool' user: 'mempool' host: '10.21.21.27' (Got an error reading communication packets)
mariadb_1 | 2023-06-07 15:19:53 10 [Warning] Aborted connection 10 to db: 'unconnected' user: 'unauthenticated' host: '10.21.21.27' (This connection closed normally without authentication)
mariadb_1 | 2023-06-07 15:20:18 11 [Warning] Aborted connection 11 to db: 'mempool' user: 'mempool' host: '10.21.21.27' (Got an error reading communication packets)
mariadb_1 | 2023-06-07 15:20:22 12 [Warning] Aborted connection 12 to db: 'unconnected' user: 'unauthenticated' host: '10.21.21.27' (This connection closed normally without authentication)
mariadb_1 | 2023-06-07 15:20:42 13 [Warning] Aborted connection 13 to db: 'mempool' user: 'mempool' host: '10.21.21.27' (Got an error reading communication packets)
mariadb_1 | 2023-06-07 15:20:46 14 [Warning] Aborted connection 14 to db: 'unconnected' user: 'unauthenticated' host: '10.21.21.27' (This connection closed normally without authentication)
mariadb_1 | 2023-06-07 15:21:12 15 [Warning] Aborted connection 15 to db: 'mempool' user: 'mempool' host: '10.21.21.27' (Got an error reading communication packets)
mariadb_1 | 2023-06-07 15:21:15 16 [Warning] Aborted connection 16 to db: 'unconnected' user: 'unauthenticated' host: '10.21.21.27' (This connection closed normally without authentication)
mariadb_1 | 2023-06-07 15:21:42 17 [Warning] Aborted connection 17 to db: 'mempool' user: 'mempool' host: '10.21.21.27' (Got an error reading communication packets)
mariadb_1 | 2023-06-07 15:21:45 18 [Warning] Aborted connection 18 to db: 'unconnected' user: 'unauthenticated' host: '10.21.21.27' (This connection closed normally without authentication)
web_1 | /var/www/mempool/browser/resources

ride-the-lightning

Attaching to ride-the-lightning_boltz_1, ride-the-lightning_tor_server_1, ride-the-lightning_web_1, ride-the-lightning_app_proxy_1
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | [HPM] Client disconnected
boltz_1 | INFO : 2023/07/11 04:43:50 Connecting to LND block epoch stream
boltz_1 | INFO : 2023/07/11 04:43:50 Connected to LND block epoch stream
boltz_1 | INFO : 2023/07/11 04:43:50 Retrying LND connection in 15 seconds
boltz_1 | ERROR: 2023/07/11 04:43:50 Lost connection to LND block epoch stream: rpc error: code = Unknown desc = waiting to start, RPC services not available
boltz_1 | INFO : 2023/07/11 04:44:05 Connecting to LND block epoch stream
boltz_1 | INFO : 2023/07/11 04:44:05 Connected to LND block epoch stream
boltz_1 | ERROR: 2023/07/11 04:44:05 Lost connection to LND block epoch stream: rpc error: code = Unknown desc = wallet locked, unlock it to enable full RPC access
boltz_1 | INFO : 2023/07/11 04:44:05 Retrying LND connection in 15 seconds
boltz_1 | INFO : 2023/07/11 04:44:20 Connecting to LND block epoch stream
boltz_1 | INFO : 2023/07/11 04:44:20 Connected to LND block epoch stream
web_1 | [6/20/2023, 1:51:35 AM] INFO: Invoice => Sorted Invoices List Received.
web_1 |
web_1 | [6/20/2023, 1:51:35 AM] INFO: Channels => Closed Channels List Received.
web_1 |
web_1 | [6/20/2023, 1:51:39 AM] INFO: Graph => Graph Information Received.
web_1 |
web_1 | [6/20/2023, 1:52:13 AM] INFO: CLWebSocket => Disconnecting from the LND's Websocket Server...
web_1 |
web_1 | [6/20/2023, 1:52:13 AM] INFO: WebSocketServer => Disconnected due to 1001 : 1687225893794, Total WS clients: 0.
web_1 |

robosats

Attaching to robosats_tor_server_1, robosats_app_proxy_1, robosats_web_1
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | Validating token: b92f4a0e365b ...
app_proxy_1 | [HPM] Client disconnected
web_1 | 10.21.0.9 - - [20/Jun/2023:01:41:42 +0000] "GET /static/assets/avatars/SpoiledDanger184.small.webp HTTP/1.1" 200 1512 "http://umbrel.local:12596/offers/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36" "::ffff:10.21.0.1"
web_1 | 10.21.0.9 - - [20/Jun/2023:01:42:18 +0000] "GET /static/assets/avatars/GutsyAircraft233.small.webp HTTP/1.1" 200 1442 "http://umbrel.local:12596/offers/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36" "::ffff:10.21.0.1"
web_1 | 10.21.0.9 - - [20/Jun/2023:01:42:18 +0000] "GET /static/assets/avatars/SlightLotion927.small.webp HTTP/1.1" 200 1898 "http://umbrel.local:12596/offers/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36" "::ffff:10.21.0.1"
web_1 | 10.21.0.9 - - [20/Jun/2023:01:42:18 +0000] "GET /static/assets/avatars/PerceptiveCombo622.small.webp HTTP/1.1" 200 1126 "http://umbrel.local:12596/offers/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36" "::ffff:10.21.0.1"
web_1 | 10.21.0.9 - - [20/Jun/2023:01:42:18 +0000] "GET /static/assets/avatars/PettishFinance721.small.webp HTTP/1.1" 200 1804 "http://umbrel.local:12596/offers/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36" "::ffff:10.21.0.1"
web_1 | 10.21.0.9 - - [20/Jun/2023:01:42:18 +0000] "GET /static/assets/avatars/PerturbingHouse233.small.webp HTTP/1.1" 200 1236 "http://umbrel.local:12596/offers/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36" "::ffff:10.21.0.1"
web_1 | 10.21.0.9 - - [20/Jun/2023:01:42:18 +0000] "GET /static/assets/avatars/LogicalAngle234.small.webp HTTP/1.1" 200 1806 "http://umbrel.local:12596/offers/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36" "::ffff:10.21.0.1"
web_1 | 10.21.0.9 - - [20/Jun/2023:01:42:19 +0000] "GET /static/assets/avatars/SpasmodicZeugma871.small.webp HTTP/1.1" 200 1304 "http://umbrel.local:12596/offers/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36" "::ffff:10.21.0.1"
web_1 | 10.21.0.9 - - [20/Jun/2023:01:42:19 +0000] "GET /static/assets/avatars/ObtuseLiar854.small.webp HTTP/1.1" 200 1154 "http://umbrel.local:12596/offers/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36" "::ffff:10.21.0.1"
web_1 | 10.21.0.9 - - [20/Jun/2023:01:42:27 +0000] "GET /ws/chat/59845/?token_sha256_hex=80cdb804e8378d3a1ddc9e7eb121783bc8fb60af9248adac21194c446fbf5fdf HTTP/1.1" 101 8337 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36" "-"

snowflake

Attaching to snowflake_app_proxy_1, snowflake_proxy_1, snowflake_web_1, snowflake_tor_server_1
app_proxy_1 | yarn run v1.22.19
app_proxy_1 | $ node ./bin/www
app_proxy_1 | [HPM] Proxy created: / -> http://snowflake_web_1:3800
app_proxy_1 | Waiting for snowflake_web_1:3800 to open...
app_proxy_1 | Tor Snowflake Proxy is now ready...
app_proxy_1 | Listening on port: 3800
proxy_1 | 2023/07/11 04:38:01 Timed out waiting for client to open data channel.
proxy_1 | 2023/07/11 04:41:32 sdp offer successfully received.
proxy_1 | 2023/07/11 04:41:32 Generating answer...
proxy_1 | 2023/07/11 04:41:53 Timed out waiting for client to open data channel.
proxy_1 | 2023/07/11 04:42:03 sdp offer successfully received.
proxy_1 | 2023/07/11 04:42:03 Generating answer...
proxy_1 | 2023/07/11 04:42:23 Timed out waiting for client to open data channel.
proxy_1 | 2023/07/11 04:43:44 sdp offer successfully received.
proxy_1 | 2023/07/11 04:43:44 Generating answer...
proxy_1 | 2023/07/11 04:44:04 Timed out waiting for client to open data channel.
web_1 | 2023/06/07 13:18:13 Using index file at /snowflake/index.html
web_1 | 2023/06/07 13:18:13 Server is starting with command: bash -c tail -n 10000 -f /snowflake/snowflake.log | grep "Traffic Relayed"
web_1 | 2023/06/07 13:18:13 URL: http://127.0.0.1:3800/
web_1 | 2023/06/07 13:18:13 URL: http://10.21.0.13:3800/

sphinx-relay

Attaching to sphinx-relay_app_proxy_1, sphinx-relay_tor_server_1, sphinx-relay_server_1
app_proxy_1 | yarn run v1.22.19
app_proxy_1 | $ node ./bin/www
app_proxy_1 | [HPM] Proxy created: / -> http://sphinx-relay_server_1:3300
app_proxy_1 | Waiting for sphinx-relay_server_1:3300 to open...
app_proxy_1 | Sphinx Relay is now ready...
app_proxy_1 | Listening on port: 3300
app_proxy_1 | [HPM] Error occurred while proxying request umbrel.local:3300/connect to http://sphinx-relay_server_1:3300/ [ECONNRESET] (https://nodejs.org/api/errors.html#errors_common_system_errors)
server_1 | internalRepr: Map(1) { 'content-type' => [Array] },
server_1 | options: {}
server_1 | }
server_1 | }
server_1 | ===> subscribed invoices with pubkey: 03af5eb3ecb78c9345e22fc552618ebd8caa2b2788b27aa8729d07cdfb43842e70
server_1 | [info] 07-11-23T04:44:16 [MISC] >>> FINISH SETUP
server_1 | [info] 07-11-23T04:44:17 [MISC] => Relay version: v2.3.2, commit: ca178e5
server_1 | [info] 07-11-23T04:44:17 [MISC] >> aXA6Omh0dHA6Ly9pN2thNGd0c3dsbTJpcGpjdm55NnNhZWpmZ2Z5bGZyZndjZm5kN3pzbmNlaTU2ajVpMmZmYXBhZC5vbmlvbjozMzAwOjo5NmExNDgwZjQ1OWMyNGEwY2RiNzRmZWQzYjljMWRhMQ==
server_1 | [info] 07-11-23T04:44:17 [TRIBES] try to connect: tls://tribes.sphinx.chat:8883
server_1 | [info] 07-11-23T04:44:17 [TRIBES] connected!

==== Result ====

The debug script did not automatically detect any issues with your Umbrel.

levmi commented 1 year ago

Thanks for sharing! We'll look into this with the team and see if we can find the cause here; you should hear more from us soon :)

guggero commented 1 year ago

Your Lightning Terminal cannot start because lnd isn't running: Error while dialing dial tcp 10.21.21.9:10009: connect: connection refused

And your lnd is waiting for the chain to be synced: Waiting for chain backend to finish sync, start_height=798212

So it sounds like lnd restarts for some reason and then it takes a long time for it to start up again. Unfortunately the Umbrel log summary only shows a few lines per application, so there is nothing in the logs you posted that would explain the root cause.

Can you try to find the full lnd.log file? Maybe the Umbrel support team (or their FAQs) can help you locate it?
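For anyone else hunting for it, here is a minimal shell sketch of one way to locate the file. The `UMBREL_DIR` default is an assumption, and the log's exact location differs between Umbrel versions, so the sketch searches rather than hard-coding a path:

```shell
#!/bin/sh
# Hedged sketch: search for lnd's log file under an Umbrel install.
# UMBREL_DIR is an assumption; override it if your install lives
# elsewhere. We search instead of hard-coding a path because the log
# location has moved between Umbrel versions.
find_lnd_log() {
    find "${UMBREL_DIR:-$HOME/umbrel}" -type f -name 'lnd.log' 2>/dev/null
}

# Example: find_lnd_log prints the path(s) of any lnd.log files found.
```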

levmi commented 1 year ago

The Umbrel telegram group is usually pretty responsive for issues if you want to start by asking how to get those log files there: https://t.me/getumbrel

tlindi commented 7 months ago

Similar issue here to the OP's.

Latest Umbrel with all the updates. LND is up, but the LiT login gives:

failed to connect
could not start Lit: could not start LND
could not set up LND clients: could not create LND Services client: error subscribing to lnd wallet state: lnd version incompatible, need at least v0.13.0-beta, got error on state subscription: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp 10.21.21.9:10009: connect: connection refused"

Something similar sometimes happens after a week or so.

The problem always disappears after restarting only the Lightning Terminal service on Umbrel via the command line/shell.

The easiest way to restart it is with the command: ~/umbrel/scripts/app restart lightning-terminal

The last time was today, after two days of uptime. It occurred after suspected bad connection/communication errors with Tor. I suspect that cause because I couldn't access the node remotely (over Tor), but then the connection suddenly came back.

I'll dig up logs for you to study: lnd, LiT, and Tor. Any others?
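Until a proper fix lands, the manual restart above could be automated. Below is a hedged watchdog sketch; the host, port, and restart command are all assumptions (litd's web port in particular varies by setup), so adjust them to your installation before use:

```shell
#!/bin/sh
# Hypothetical watchdog sketch: restart Lightning Terminal when its web
# port stops answering. LIT_HOST, LIT_PORT, and RESTART_CMD are
# assumptions; adjust them to your installation. Run check_and_restart
# from cron every few minutes.
check_and_restart() {
    host="${LIT_HOST:-127.0.0.1}"
    port="${LIT_PORT:-8443}"
    restart_cmd="${RESTART_CMD:-$HOME/umbrel/scripts/app restart lightning-terminal}"

    # nc -z only probes whether the TCP port accepts a connection.
    if ! nc -z "$host" "$port" 2>/dev/null; then
        echo "litd unreachable on $host:$port, restarting"
        $restart_cmd
    fi
}
```

Note this only papers over the symptom; the logs are still the way to find the underlying cause.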

ellemouton commented 7 months ago

@tlindi - in the next LiT release (which will be out fairly soon), there will be connection retries to LND. So even if LiT cannot connect on the first try, it will keep retrying until the connection succeeds. I hope that this will resolve the issue for you 🙏
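The retry behavior described above can be sketched in shell. This is a simplified illustration only, not litd's actual (Go) implementation; the delay values are made up:

```shell
#!/bin/sh
# Simplified illustration of retry-until-connected with capped
# exponential backoff. Not litd's actual implementation; the delay
# values here are made up.
retry() {
    delay=1
    max_delay=16
    until "$@"; do
        echo "attempt failed, retrying in ${delay}s" >&2
        sleep "$delay"
        # Double the delay each round, capped at max_delay.
        delay=$((delay * 2))
        if [ "$delay" -gt "$max_delay" ]; then
            delay=$max_delay
        fi
    done
}

# Hypothetical usage: keep probing lnd's RPC port until it accepts.
# retry nc -z 10.21.21.9 10009
```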

tlindi commented 7 months ago

I'll dig logs for you to study - lnd, LiT and Tor - any other?

lightning-terminal_app_proxy_1.log
lightning-terminal_tor_server_1.log
lightning-terminal_web_1.log

There they are if needed.

tlindi commented 7 months ago

connection re-tries to LND

Would you mind linking the fixing PR here, please?

ellemouton commented 7 months ago

Would you mind linking the fixing PR here, please?

https://github.com/lightninglabs/lightning-terminal/pull/694

ViktorTigerstrom commented 7 months ago

@tlindi, the new litd version (v0.12.3-alpha) that includes #694 is now live on Umbrel :rocket:!

tlindi commented 7 months ago

Thank you for the information. Lately the issue has not been recurring, but I trust you have taken care of it.

From my point of view, this can be closed. I'll open a new issue if the problem reappears.