hMsats closed this issue 1 year ago
Thanks for the report 🙏, this is getting worked on
Likely related to https://github.com/ElementsProject/lightning/issues/6481
@ddustin I recklessly added the following line to channeld/channeld.c:
/* We must regossip the scid since it has changed */
peer->gossip_scid_announced = false;
in version v23.08.1 and recompiled, getting v23.08.1-modded (for both my main node and my test node).
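Roughly, the change looks like this in context (the function name and placement below are just an illustration, not the actual channeld control flow; only the flag reset itself is the real change):

/* Illustration only: hypothetical helper name and placement.
 * Once the splice confirms and the channel gets a new short_channel_id,
 * clearing the flag should make channeld announce the channel again with
 * the new scid instead of treating it as already announced. */
static void note_splice_scid_changed(struct peer *peer)
{
	/* We must regossip the scid since it has changed */
	peer->gossip_scid_announced = false;
}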
I spliced 100k sats into my main node's side of the channel with my test node. This time the **BROKEN**
after the state changed from CHANNELD_AWAITING_SPLICE to CHANNELD_NORMAL
is gone at my main node. So that's good. But (edit: I waited much longer than 12 blocks) listnodes
at my main node still only returns:
{
"nodeid": "02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392"
},
so no alias.
I still got a **BROKEN**, but this time at my test node. The id of my main node is
02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6.
This is the output of my test node:
2023-09-19T22:06:36.798Z INFO plugin-clnrest.py: Killing plugin: disabled itself: No module named 'flask'
2023-09-19T22:06:41.017Z INFO plugin-bcli: bitcoin-cli initialized and connected to bitcoind.
2023-09-19T22:06:47.373Z INFO lightningd: --------------------------------------------------
2023-09-19T22:06:47.373Z INFO lightningd: Server started with public key 02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392, alias myshtery (color #02aecd) and lightningd v23.08.1-modded
2023-09-19T22:13:44.877Z UNUSUAL 02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6-channeld-chan#34: STFU complete: we are quiescent
2023-09-19T22:13:44.877Z UNUSUAL 02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6-channeld-chan#34: STFU complete: setting stfu_wait_single_msg = true
2023-09-19T22:13:45.616Z INFO 02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6-channeld-chan#34: Splice signing tx: 02000000027329c10b37a02c83f24bcf075505b7df3ddc3abc9c41b87652065cda3b45e6840100000000fdffffff70d72d1f9814469522302b472227e6ddc7d6c66f90428b73ed8e01672a92daa90100000000000000000220a10700000000002200205f77d335059966a0d3069135995e6f76597c6f1bdc8ced3eafbdb97059ba4f7425450200000000002251202cebfc59a034eef952c25c87bad0e07e18e5c044cb78fc414d7367825d306ab224560c00
2023-09-19T22:13:45.932Z INFO 02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6-chan#34: State changed from CHANNELD_NORMAL to CHANNELD_AWAITING_SPLICE
2023-09-19T22:56:55.471Z INFO 02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6-chan#34: State changed from CHANNELD_AWAITING_SPLICE to CHANNELD_NORMAL
2023-09-19T22:57:32.815Z **BROKEN** 02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6-gossipd: invalid local_channel_announcement 0bbe02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b601b001003e02abb4d5c1bb49e8b31d0251419b46fde7374d97da9f7723d4141bb8f01389455fe28e223e2ef638e254611f6208e5bbcf1909c3cc1e8e4b39523d8ef44025142d60f21483cbf56f102e28d251bac956e67f41d386e59183766888a7c2322875ed42cb88a871764f92c0af2458e21bfe2cdb2b6a977ec227e525a53ace5a8f0532ec9efc4a897644c488545623f25e36690945d67eb3b599d4305f8463cf0c6c13cb0e959b93569b728e189eef58fd4fd0fb48309641acb7c299d4843adf2743c7b6f60423074209208bb65286fe20a74d42246e81c0c65a8f53b092ac5c884d15a2de96692fbc97c515d14c9f3a8919d19ff233c68ed5bcd1b19e16332f6300006fe28c0ab6f1b372c1a6a246ae63f74f931e8365e15a089c68d61900000000000c562500011d000002888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b602aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392036c80f806562299e8feb61210d094a7e19bfbbb7c78c3fd428241a91945e07d660380b33e2775a40fc4d8123a559e1ea8e17d9ad0edd729f8ac6cf49808db1943a6 (000100000000000000000000000000000000000000000000000000000000000000000460426164206e6f64655f7369676e61747572655f3220333034343032323031343264363066323134383363626635366631303265323864323531626163393536653637663431643338366535393138333736363838386137633233323238303232303735656434326362383861383731373634663932633061663234353865323162666532636462326236613937376563323237653532356135336163653561386620686173682065346536316230643438353964393663356234306266363438383633643039366363613630333936623632366437303838663536623233633536333963326333206f6e206368616e6e656c5f616e6e6f756e63656d656e7420303130303365303261626234643563316262343965386233316430323531343139623436666465373337346439376461396637373233643431343162623866303133383934353566653238653232336532656636333865323534363131663632303865356262636631393039633363633165386534623339353233643865663434303235313432643630663231343833636266353666313032653238643235316261633935366536376634316433383665353931383337363638383861376332333232383735656434326362383861383731373634663932633061663234353865323162666532636462326236613937376563323237653532356135336163653561386630353332656339656663346138393736343463343838353435363233663235653336363930393435643637656233623539396434333035663834363363663063366331336362306539353962393335363962373238653138396565663538666434666430666234383330393634316163623763323939643438343361646632373433633762366636303432333037343230393230386262363532383666653230613734643432323436653831633063363561386635336230393261633563383834643135613264653936363932666263393763353135643134633966336138393139643139666632333363363865643562636431623139653136333332663633303030303666653238633061623666316233373263316136613234366165363366373466393331653833363565313561303839633638643631393030303030303030303030633536323530303031316430303030303238383832343430323963353930393539333033386162313966323639393437633732306465333432336534393137393162343663376339326637363237396236303261656364653362336464653435303838393735356434646263653533386536623965396466613737303264313462356263356334333739653162333330333932303336633830663830363536323239396538666562363132313064303934613765313962666262623763373863336664343238323431613931393435653037643636303338306233336532373735613430666334643831323361353539653165613865313764396164306564643732396638616336636634393830386462313934336136)
However, this time restarting my test node was enough, and everything was back to normal at my main node:
{
"nodeid": "02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392",
"alias": "myshtery",
"color": "02aecd",
"last_timestamp": 1695165792,
"features": "080000000000000000000000000088a0880a0269a2",
"addresses": [
{
"type": "ipv4",
"address": "84.26.69.91",
"port": 6837
}
]
},
So after the restart of my test node the splice worked perfectly! Of course, normally the other node (here my test node) isn't yours, so you can't count on a restart.
Thanks for your reckless testing!
It's possible that the upcoming release already touches your issue, as a good handful of fixes are in #6677, but in either case a regression test for this sparse listnodes
result is definitely in order.
I was able to recreate the listnodes
problem on master, saw that it worked on #6677, and added a regression test, test_splice_listnodes.
Likely this would already be caught by test_splice_gossip, but it seems like a good addition for extra thoroughness!
@ddustin I'd like to test splicing again but can't run master because of issue #6700. I'll test again after that has been fixed.
@ddustin I first used another trick in the software to get master going again, but then I also ran the query command manually, directly on the database, as you suggested (thanks!) and got master running again that way too. I get the impression the channel is stuck in CHANNELD_AWAITING_SPLICE because more than 30 confirmations have already passed.
This is from listpeerchannels:
{
"timestamp": "2023-09-21T17:16:41.062Z",
"old_state": "CHANNELD_NORMAL",
"new_state": "CHANNELD_AWAITING_SPLICE",
"cause": "user",
"message": "Broadcasting splice"
}
],
"status": [
"CHANNELD_AWAITING_SPLICE:Bad node_signature 304402202fa04d8640d7c79cfeb0fbf9e9123ce0b67f2877c34998b7358d3ad9479aaa4402200ed3289e1d97c3324957ecc0a212d0f16cf2e9bff92bb907b4b288bafdd04c55 hash 3f6931e5a79c47db308b84653b1efbbce96bc7fd25a921d2942b618b9bdd4d16 on announcement_signatures 0100748610b1fbaeca5a7975657618512bf4d7e6ad398b0113a0874fdcb4e4e93b5827af04c12176289960230bb0a4fa5a3a1c14fef3bc36614d0b2174781d16f7ac2fa04d8640d7c79cfeb0fbf9e9123ce0b67f2877c34998b7358d3ad9479aaa440ed3289e1d97c3324957ecc0a212d0f16cf2e9bff92bb907b4b288bafdd04c5551d77a05e4e3ad3ad0a9abcc2687b966e69a4832ce5b5748db9e78ef5124f8d003c34bb8b5af8e17b90c09625f591e5dcf6537cb2b2df290c36e45e614604e521e1e508284de0754705b9adb6e071abe5b598a7595b1fc0d4f81bcaccdff67c958c278f5a6a4123e868a7dd29e1c8fd8cbe72bf833ed2f29f769af2d3d92883a00006fe28c0ab6f1b372c1a6a246ae63f74f931e8365e15a089c68d61900000000000c5720000141000002888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b602aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392036c80f806562299e8feb61210d094a7e19bfbbb7c78c3fd428241a91945e07d660380b33e2775a40fc4d8123a559e1ea8e17d9ad0edd729f8ac6cf49808db1943a6",
"CHANNELD_AWAITING_SPLICE:Will attempt reconnect in 300 seconds"
],
I hope this isn't because of the (small) software trick I did first in runes, but I don't think so.
I'll be sleeping and busy but hope this is useful :-)
The problem is that I can connect from my test node to other nodes but not to my main node, as I get:
2023-09-22T05:04:04.308Z INFO lightningd: Server started with public key 02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392, alias myshtery (color #02aecd) and lightningd v23.08.1-134-g10eecea
lightningd: db/bindings.c:22: check_bind_pos: Assertion `pos < tal_count(stmt->bindings)' failed.
lightningd: FATAL SIGNAL 6 (version v23.08.1-134-g10eecea)
0x56392e282cbe send_backtrace
common/daemon.c:33
0x56392e282d58 crashdump
common/daemon.c:75
0x7fb5515c608f ???
/build/glibc-SzIz7B/glibc-2.31/signal/../sysdeps/unix/sysv/linux/x86_64/sigaction.c:0
0x7fb5515c600b __GI_raise
../sysdeps/unix/sysv/linux/raise.c:51
0x7fb5515a5858 __GI_abort
/build/glibc-SzIz7B/glibc-2.31/stdlib/abort.c:79
0x7fb5515a5728 __assert_fail_base
/build/glibc-SzIz7B/glibc-2.31/assert/assert.c:92
0x7fb5515b6fd5 __GI___assert_fail
/build/glibc-SzIz7B/glibc-2.31/assert/assert.c:101
0x56392e297974 check_bind_pos
db/bindings.c:22
0x56392e297c0b db_bind_blob
db/bindings.c:92
0x56392e297fcf db_bind_signature
db/bindings.c:206
0x56392e26f5e8 wallet_htlc_sigs_add
wallet/wallet.c:3872
0x56392e25186e peer_got_commitsig
lightningd/peer_htlcs.c:2401
0x56392e214ed5 channel_msg
lightningd/channel_control.c:1141
0x56392e25d4b2 sd_msg_read
lightningd/subd.c:555
0x56392e3505bc next_plan
ccan/ccan/io/io.c:59
0x56392e350a89 do_plan
ccan/ccan/io/io.c:407
0x56392e350b26 io_ready
ccan/ccan/io/io.c:417
0x56392e3524bb io_loop
ccan/ccan/io/poll.c:453
0x56392e22adf0 io_loop_with_timers
lightningd/io_loop_with_timers.c:22
0x56392e22fb86 main
lightningd/lightningd.c:1328
0x7fb5515a7082 __libc_start_main
../csu/libc-start.c:308
0x56392e209fbd ???
???:0
0xffffffffffffffff ???
???:0
2023-09-22T05:06:19.044Z **BROKEN** lightningd: FATAL SIGNAL 6 (version v23.08.1-134-g10eecea)
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: common/daemon.c:38 (send_backtrace) 0x56392e282d06
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: common/daemon.c:75 (crashdump) 0x56392e282d58
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: /build/glibc-SzIz7B/glibc-2.31/signal/../sysdeps/unix/sysv/linux/x86_64/sigaction.c:0 ((null)) 0x7fb5515c608f
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: ../sysdeps/unix/sysv/linux/raise.c:51 (__GI_raise) 0x7fb5515c600b
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: /build/glibc-SzIz7B/glibc-2.31/stdlib/abort.c:79 (__GI_abort) 0x7fb5515a5858
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: /build/glibc-SzIz7B/glibc-2.31/assert/assert.c:92 (__assert_fail_base) 0x7fb5515a5728
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: /build/glibc-SzIz7B/glibc-2.31/assert/assert.c:101 (__GI___assert_fail) 0x7fb5515b6fd5
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: db/bindings.c:22 (check_bind_pos) 0x56392e297974
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: db/bindings.c:92 (db_bind_blob) 0x56392e297c0b
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: db/bindings.c:206 (db_bind_signature) 0x56392e297fcf
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: wallet/wallet.c:3872 (wallet_htlc_sigs_add) 0x56392e26f5e8
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: lightningd/peer_htlcs.c:2401 (peer_got_commitsig) 0x56392e25186e
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: lightningd/channel_control.c:1141 (channel_msg) 0x56392e214ed5
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: lightningd/subd.c:555 (sd_msg_read) 0x56392e25d4b2
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: ccan/ccan/io/io.c:59 (next_plan) 0x56392e3505bc
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: ccan/ccan/io/io.c:407 (do_plan) 0x56392e350a89
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: ccan/ccan/io/io.c:417 (io_ready) 0x56392e350b26
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: ccan/ccan/io/poll.c:453 (io_loop) 0x56392e3524bb
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: lightningd/io_loop_with_timers.c:22 (io_loop_with_timers) 0x56392e22adf0
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: lightningd/lightningd.c:1328 (main) 0x56392e22fb86
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: ../csu/libc-start.c:308 (__libc_start_main) 0x7fb5515a7082
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: (null):0 ((null)) 0x56392e209fbd
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: (null):0 ((null)) 0xffffffffffffffff
Log dumped in crash.log.20230922050619
Lost connection to the RPC socket. (line repeated many times)
@ddustin I can't connect from my test node to other nodes without crashing, so I think my test node's database is corrupt. I will close and remove my test node and start from scratch. I won't be able to test anything for some time, so just ignore what I wrote lately. At least it isn't my main node :-)
@ddustin I think I found the source of all my problems in the source code. See my pull request :-)
@ddustin After adding the extra ?, everything ran perfectly. I've never seen anything like it, what a difference. Splicing worked, including listnodes, without any problems. So thanks for all your hard work on this extremely difficult topic and for your suggestions :1st_place_medal:. So I'm closing this closed issue :smile:
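For anyone who hits the same check_bind_pos assertion: as far as I understand it, the assert `pos < tal_count(stmt->bindings)` fires when the code tries to bind more values than the SQL string has ? placeholders for, which is why adding the missing ? fixed it. Here's a standalone illustration of that mismatch using the plain sqlite3 C API (not CLN's db wrappers), just to show the shape of the bug:

/* Standalone illustration of a placeholder/binding mismatch, using the
 * plain sqlite3 C API rather than CLN's db helpers.
 * Compile with: gcc demo.c -lsqlite3 */
#include <sqlite3.h>
#include <stdio.h>

int main(void)
{
	sqlite3 *db;
	sqlite3_stmt *stmt;

	sqlite3_open(":memory:", &db);
	sqlite3_exec(db, "CREATE TABLE htlc_sigs (id INTEGER, sig BLOB);",
		     NULL, NULL, NULL);

	/* Bug pattern: two values to store, but only one '?' placeholder. */
	sqlite3_prepare_v2(db, "INSERT INTO htlc_sigs VALUES (1, ?);",
			   -1, &stmt, NULL);

	/* The prepared statement only knows about one parameter... */
	printf("placeholders: %d\n", sqlite3_bind_parameter_count(stmt));

	/* ...so binding a second value is rejected with SQLITE_RANGE.
	 * In CLN the analogous off-by-one trips the check_bind_pos
	 * assertion seen in the crash log above. */
	sqlite3_bind_int64(stmt, 1, 1);
	int rc = sqlite3_bind_blob(stmt, 2, "\xde\xad", 2, SQLITE_STATIC);
	printf("second bind rc: %d (SQLITE_RANGE is %d)\n", rc, SQLITE_RANGE);

	sqlite3_finalize(stmt);
	sqlite3_close(db);
	return 0;
}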
That's awesome! Thanks for figuring that out! 🔥
Issue and Steps to Reproduce
I promised to retry a splice after my previous issue was fixed. All went fine and 100 ksats were added to my main node, except that my test node (02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392) was no longer visible in listnodes of my main node (as was briefly mentioned in my previous issue), although it was visible in listpeerchannels of my main node. I use listnodes to get aliases. Restarted my test node and this time it didn't crash anymore! After the restart of my test node, it was visible in the listnodes of my main node, but without the alias and other information:
while others show more info, for example:
Note that my main node is perfectly visible in the listnodes of my test node. Restarted my main node but that didn't help: the alias of my test node is still not visible in the listnodes of my main node.
Here is the output of my main node (see UNUSUAL and BROKEN):
EDIT:
1: stopped my test node, removed the test node gossip_store and restarted the test node -> alias still not visible at my main node
2: stopped my test node, deleted payments and invoices from the test node's lightningd.sqlite3 -> alias almost immediately visible again at my main node (sqlite3 -header -line lightningd.sqlite3 'DELETE FROM payments'; sqlite3 -header -line lightningd.sqlite3 'DELETE FROM invoices')
getinfo output:
Both nodes are: "version": "v23.08rc2-17-gc67f1f9",