ElementsProject / lightning

Core Lightning — Lightning Network implementation focusing on spec compliance and performance

Invalid local_channel_announcement after a successful splice #6572

Closed: hMsats closed this issue 1 year ago

hMsats commented 1 year ago

Issue and Steps to Reproduce

I promised to retry a splice after my previous issue was fixed. Everything went fine and 100 ksats were added to my main node, except that my test node (02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392) was no longer visible in listnodes of my main node (as briefly mentioned in my previous issue), although it was still visible in listpeerchannels of my main node. I use listnodes to get aliases. I restarted my test node and this time it didn't crash anymore! After the restart, the test node was visible in listnodes of my main node, but without the alias and other information:

      {
         "nodeid": "02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392"
      },

while others show more info, for example:

      {
         "nodeid": "03b9aacb265dc5ebde04b91b28f7c8bb6ba0af146e5f37426915742daf8f195a09",
         "alias": "DeutscheBank|CLN",
         "color": "0018a8",
         "last_timestamp": 1686594519,
         "features": "88a0802a0a69a2",
         "addresses": [
            {
               "type": "ipv4",
               "address": "90.146.208.162",
               "port": 9735
            },
            {
               "type": "ipv6",
               "address": "fe80::3bc7:c09f:5a3e:f7ab:9735",
               "port": 9735
            },
            {
               "type": "torv3",
               "address": "bafx5t6dmxwm5ocwawt2o6yrdlyksuqt7c6liylopkok4y3rnytqghid.onion",
               "port": 9735
            }
         ]
      },

Note that my main node is perfectly visible in the listnodes of my test node. I restarted my main node, but that didn't help: the alias of my test node is still not visible in the listnodes of my main node.
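
For reference, this is roughly how I check for the alias (a minimal sketch using pyln-client; the rpc socket path is an example and will differ per setup):

    # Sketch: check whether a peer's node_announcement fields are visible.
    # The rpc socket path is an example; adjust it for your setup.
    from pyln.client import LightningRpc

    rpc = LightningRpc("/home/user/.lightning/bitcoin/lightning-rpc")

    peer_id = "02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392"
    nodes = rpc.listnodes(peer_id)["nodes"]

    if not nodes:
        print("peer missing from listnodes entirely")
    elif "alias" not in nodes[0]:
        print("peer listed, but without alias/announcement fields")
    else:
        print("alias:", nodes[0]["alias"])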

Here is the log output of my main node (see the UNUSUAL and BROKEN lines):

2023-08-16T04:06:48.504Z INFO    plugin-clnrest.py: Killing plugin: disabled itself: No module named 'flask'
2023-08-16T04:06:52.889Z INFO    plugin-bcli: bitcoin-cli initialized and connected to bitcoind.
2023-08-16T04:06:59.619Z INFO    lightningd: --------------------------------------------------
2023-08-16T04:06:59.619Z INFO    lightningd: Server started with public key 02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6, alias bitcoinserver.nl (color #028882) and lightningd v23.08rc2-17-gc67f1f9
2023-08-16T04:13:05.466Z INFO    03864ef025fde8fb587d989186ce6a4a186895ee44a926bfc370e2c366597a3f8f-channeld-chan#49387: Peer connection lost
2023-08-16T04:13:05.466Z INFO    03864ef025fde8fb587d989186ce6a4a186895ee44a926bfc370e2c366597a3f8f-chan#49387: Peer transient failure in CHANNELD_NORMAL: channeld: Owning subdaemon channeld died (62208)
2023-08-16T04:15:26.564Z INFO    lightningd: Resolved invoice '2bujgd6tk0vde32qdntkasa0oi' with amount 667000msat in 1 htlcs
2023-08-16T04:20:23.781Z INFO    03864ef025fde8fb587d989186ce6a4a186895ee44a926bfc370e2c366597a3f8f-channeld-chan#49387: Peer connection lost
2023-08-16T04:20:23.781Z INFO    03864ef025fde8fb587d989186ce6a4a186895ee44a926bfc370e2c366597a3f8f-chan#49387: Peer transient failure in CHANNELD_NORMAL: channeld: Owning subdaemon channeld died (62208)
2023-08-16T04:21:39.962Z UNUSUAL 02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392-channeld-chan#49388: STFU complete: we are quiescent
2023-08-16T04:21:39.962Z UNUSUAL 02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392-channeld-chan#49388: STFU complete: setting stfu_wait_single_msg = true
2023-08-16T04:21:41.214Z INFO    02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392-channeld-chan#49388: Splice signing tx: 0200000002441755b5fefed3068b77d387f14c6488781fdce936ad58da4e44c3f9c6a04ba70100000000fdffffffde64e00f50613b9db44f65848ba391cad78eac87c3f9f37c8cead2c7367ed7de00000000000000000002a39801000000000022512049325308db47447f07080d4685b0f9cc3857a587bb8da4764b963d6d6a327fb8801a0600000000002200205f77d335059966a0d3069135995e6f76597c6f1bdc8ced3eafbdb97059ba4f7444420c00
2023-08-16T04:21:41.244Z INFO    02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392-chan#49388: State changed from CHANNELD_NORMAL to CHANNELD_AWAITING_SPLICE
2023-08-16T04:24:01.045Z INFO    02db3bce6ad28505ec56254e3c27b912f3d3723d7573e3b4174368b80ebf8f2ba8-channeld-chan#49298: Peer connection lost
2023-08-16T04:24:01.069Z INFO    02db3bce6ad28505ec56254e3c27b912f3d3723d7573e3b4174368b80ebf8f2ba8-chan#49298: Peer transient failure in CHANNELD_NORMAL: channeld: Owning subdaemon channeld died (62208)
2023-08-16T04:24:01.069Z INFO    02db3bce6ad28505ec56254e3c27b912f3d3723d7573e3b4174368b80ebf8f2ba8-channeld-chan#49310: Peer connection lost
2023-08-16T04:24:01.069Z INFO    02db3bce6ad28505ec56254e3c27b912f3d3723d7573e3b4174368b80ebf8f2ba8-chan#49310: Peer transient failure in CHANNELD_NORMAL: channeld: Owning subdaemon channeld died (62208)
2023-08-16T04:30:56.093Z INFO    03864ef025fde8fb587d989186ce6a4a186895ee44a926bfc370e2c366597a3f8f-channeld-chan#49387: Peer connection lost
2023-08-16T04:30:56.094Z INFO    03864ef025fde8fb587d989186ce6a4a186895ee44a926bfc370e2c366597a3f8f-chan#49387: Peer transient failure in CHANNELD_NORMAL: channeld: Owning subdaemon channeld died (62208)
2023-08-16T04:31:47.685Z INFO    03864ef025fde8fb587d989186ce6a4a186895ee44a926bfc370e2c366597a3f8f-channeld-chan#49387: Peer connection lost
2023-08-16T04:31:47.686Z INFO    03864ef025fde8fb587d989186ce6a4a186895ee44a926bfc370e2c366597a3f8f-chan#49387: Peer transient failure in CHANNELD_NORMAL: channeld: Owning subdaemon channeld died (62208)
2023-08-16T04:39:36.761Z INFO    03864ef025fde8fb587d989186ce6a4a186895ee44a926bfc370e2c366597a3f8f-channeld-chan#49387: Peer connection lost
2023-08-16T04:39:36.781Z INFO    03864ef025fde8fb587d989186ce6a4a186895ee44a926bfc370e2c366597a3f8f-chan#49387: Peer transient failure in CHANNELD_NORMAL: channeld: Owning subdaemon channeld died (62208)
2023-08-16T04:43:05.316Z INFO    02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392-chan#49388: State changed from CHANNELD_AWAITING_SPLICE to CHANNELD_NORMAL
2023-08-16T04:43:50.280Z **BROKEN** 02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392-gossipd: invalid local_channel_announcement 0bbe02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b33039201b00100650e273449796e8d3b5295a63f22b87bd460eb57f1e4d678ce0f1aee61eaec7c46aaa2389c36cee4f7b9cc9bb2ead4a063ffcfc1054f285a8919e1dc657bd101142d60f21483cbf56f102e28d251bac956e67f41d386e59183766888a7c2322875ed42cb88a871764f92c0af2458e21bfe2cdb2b6a977ec227e525a53ace5a8f3c83a06cae9a58356ead11e4eda126305eb466d671dc9804385f89e244599c703a8ae86c392a4e458e0a0cbb9b06cc37f2750437fd1db14e76fc05823ecf393043c7b6f60423074209208bb65286fe20a74d42246e81c0c65a8f53b092ac5c884d15a2de96692fbc97c515d14c9f3a8919d19ff233c68ed5bcd1b19e16332f6300006fe28c0ab6f1b372c1a6a246ae63f74f931e8365e15a089c68d61900000000000c4246000087000102888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b602aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392036c80f806562299e8feb61210d094a7e19bfbbb7c78c3fd428241a91945e07d660380b33e2775a40fc4d8123a559e1ea8e17d9ad0edd729f8ac6cf49808db1943a6 (000100000000000000000000000000000000000000000000000000000000000000000460426164206e6f64655f7369676e61747572655f3120333034343032323036353065323733343439373936653864336235323935613633663232623837626434363065623537663165346436373863653066316165653631656165633763303232303436616161323338396333366365653466376239636339626232656164346130363366666366633130353466323835613839313965316463363537626431303120686173682063363932376262646536333132623836303736343833613764663665303065643563316332666164653431306434306632323733353965653163353935613965206f6e206368616e6e656c5f616e6e6f756e63656d656e7420303130303635306532373334343937393665386433623532393561363366323262383762643436306562353766316534643637386365306631616565363165616563376334366161613233383963333663656534663762396363396262326561643461303633666663666331303534663238356138393139653164633635376264313031313432643630663231343833636266353666313032653238643235316261633935366536376634316433383665353931383337363638383861376332333232383735656434326362383861383731373634663932633061663234353865323162666532636462326236613937376563323237653532356135336163653561386633633833613036636165396135383335366561643131653465646131323633303565623436366436373164633938303433383566383965323434353939633730336138616538366333393261346534353865306130636262396230366363333766323735303433376664316462313465373666633035383233656366333933303433633762366636303432333037343230393230386262363532383666653230613734643432323436653831633063363561386635336230393261633563383834643135613264653936363932666263393763353135643134633966336138393139643139666632333363363865643562636431623139653136333332663633303030303666653238633061623666316233373263316136613234366165363366373466393331653833363565313561303839633638643631393030303030303030303030633432343630303030383730303031303238383832343430323963353930393539333033386162313966323639393437633732306465333432336534393137393162343663376339326637363237396236303261656364653362336464653435303838393735356434646263653533386536623965396466613737303264313462356263356334333739653162333330333932303336633830663830363536323239396538666562363132313064303934613765313962666262623763373863336664343238323431613931393435653037643636303338306233336532373735613430666334643831323361353539653165613865313764396164306564643732396638616336636634393830386462313934336136)
2023-08-16T04:49:06.665Z INFO    022f3bfa1de8491bad95ea018b17d47efbfe713cd2da59ad5fc380198612b8954a-channeld-chan#49131: Peer connection lost
2023-08-16T04:49:06.690Z INFO    022f3bfa1de8491bad95ea018b17d47efbfe713cd2da59ad5fc380198612b8954a-chan#49131: Peer transient failure in CHANNELD_NORMAL: channeld: Owning subdaemon channeld died (62208)
2023-08-16T04:50:49.639Z INFO    lightningd: Resolved invoice '7hvf8u2p8p3vmrl1g184jkki3p' with amount 667000msat in 1 htlcs

EDIT:

1: Stopped my test node, removed the test node's gossip_store, and restarted the test node -> the alias was still not visible at my main node.

2: Stopped my test node and deleted the payments and invoices from the test node's lightningd.sqlite3 -> the alias was almost immediately visible again at my main node:

    sqlite3 -header -line lightningd.sqlite3 'DELETE FROM payments'
    sqlite3 -header -line lightningd.sqlite3 'DELETE FROM invoices'
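
For the record, step 2 as a small Python script (a sketch only: it assumes lightningd is stopped and takes a backup first):

    # Sketch of the step-2 cleanup: back up the database first, then
    # delete payments and invoices. Only run while lightningd is stopped.
    import shutil
    import sqlite3

    db = "lightningd.sqlite3"
    shutil.copy2(db, db + ".bak")  # keep a backup in case anything breaks

    conn = sqlite3.connect(db)
    with conn:  # commits on success, rolls back on error
        conn.execute("DELETE FROM payments")
        conn.execute("DELETE FROM invoices")
    conn.close()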

getinfo output

Both nodes are on "version": "v23.08rc2-17-gc67f1f9".

ddustin commented 1 year ago

Thanks for the report 🙏, this is getting worked on

ddustin commented 1 year ago

Likely related to https://github.com/ElementsProject/lightning/issues/6481

hMsats commented 1 year ago

@ddustin I recklessly added the following lines to channeld/channeld.c:

    /* We must regossip the scid since it has changed */
    peer->gossip_scid_announced = false;

to v23.08.1 and recompiled, getting v23.08.1-modded (for both my main node and my test node).

I added 100k sats to my main node's side of the channel with my test node. This time the **BROKEN** after "State changed from CHANNELD_AWAITING_SPLICE to CHANNELD_NORMAL" is gone at my main node, so that's good. But (edit: I waited much longer than 12 blocks) listnodes at my main node still only shows:

      {
         "nodeid": "02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392"
      },

so no alias.

I still got a **BROKEN**, but this time at my test node. The id of my main node is 02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6. This is the log output of my test node:

2023-09-19T22:06:36.798Z INFO    plugin-clnrest.py: Killing plugin: disabled itself: No module named 'flask'
2023-09-19T22:06:41.017Z INFO    plugin-bcli: bitcoin-cli initialized and connected to bitcoind.
2023-09-19T22:06:47.373Z INFO    lightningd: --------------------------------------------------
2023-09-19T22:06:47.373Z INFO    lightningd: Server started with public key 02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392, alias myshtery (color #02aecd) and lightningd v23.08.1-modded
2023-09-19T22:13:44.877Z UNUSUAL 02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6-channeld-chan#34: STFU complete: we are quiescent
2023-09-19T22:13:44.877Z UNUSUAL 02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6-channeld-chan#34: STFU complete: setting stfu_wait_single_msg = true
2023-09-19T22:13:45.616Z INFO    02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6-channeld-chan#34: Splice signing tx: 02000000027329c10b37a02c83f24bcf075505b7df3ddc3abc9c41b87652065cda3b45e6840100000000fdffffff70d72d1f9814469522302b472227e6ddc7d6c66f90428b73ed8e01672a92daa90100000000000000000220a10700000000002200205f77d335059966a0d3069135995e6f76597c6f1bdc8ced3eafbdb97059ba4f7425450200000000002251202cebfc59a034eef952c25c87bad0e07e18e5c044cb78fc414d7367825d306ab224560c00
2023-09-19T22:13:45.932Z INFO    02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6-chan#34: State changed from CHANNELD_NORMAL to CHANNELD_AWAITING_SPLICE
2023-09-19T22:56:55.471Z INFO    02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6-chan#34: State changed from CHANNELD_AWAITING_SPLICE to CHANNELD_NORMAL
2023-09-19T22:57:32.815Z **BROKEN** 02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b6-gossipd: invalid local_channel_announcement 0bbe02888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b601b001003e02abb4d5c1bb49e8b31d0251419b46fde7374d97da9f7723d4141bb8f01389455fe28e223e2ef638e254611f6208e5bbcf1909c3cc1e8e4b39523d8ef44025142d60f21483cbf56f102e28d251bac956e67f41d386e59183766888a7c2322875ed42cb88a871764f92c0af2458e21bfe2cdb2b6a977ec227e525a53ace5a8f0532ec9efc4a897644c488545623f25e36690945d67eb3b599d4305f8463cf0c6c13cb0e959b93569b728e189eef58fd4fd0fb48309641acb7c299d4843adf2743c7b6f60423074209208bb65286fe20a74d42246e81c0c65a8f53b092ac5c884d15a2de96692fbc97c515d14c9f3a8919d19ff233c68ed5bcd1b19e16332f6300006fe28c0ab6f1b372c1a6a246ae63f74f931e8365e15a089c68d61900000000000c562500011d000002888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b602aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392036c80f806562299e8feb61210d094a7e19bfbbb7c78c3fd428241a91945e07d660380b33e2775a40fc4d8123a559e1ea8e17d9ad0edd729f8ac6cf49808db1943a6 (000100000000000000000000000000000000000000000000000000000000000000000460426164206e6f64655f7369676e61747572655f3220333034343032323031343264363066323134383363626635366631303265323864323531626163393536653637663431643338366535393138333736363838386137633233323238303232303735656434326362383861383731373634663932633061663234353865323162666532636462326236613937376563323237653532356135336163653561386620686173682065346536316230643438353964393663356234306266363438383633643039366363613630333936623632366437303838663536623233633536333963326333206f6e206368616e6e656c5f616e6e6f756e63656d656e7420303130303365303261626234643563316262343965386233316430323531343139623436666465373337346439376461396637373233643431343162623866303133383934353566653238653232336532656636333865323534363131663632303865356262636631393039633363633165386534623339353233643865663434303235313432643630663231343833636266353666313032653238643235316261633935366536376634316433383665353931383337363638383861376332333232383735656434326362383861383731373634663932633061663234353865323162666532636462326236613937376563323237653532356135336163653561386630353332656339656663346138393736343463343838353435363233663235653336363930393435643637656233623539396434333035663834363363663063366331336362306539353962393335363962373238653138396565663538666434666430666234383330393634316163623763323939643438343361646632373433633762366636303432333037343230393230386262363532383666653230613734643432323436653831633063363561386635336230393261633563383834643135613264653936363932666263393763353135643134633966336138393139643139666632333363363865643562636431623139653136333332663633303030303666653238633061623666316233373263316136613234366165363366373466393331653833363565313561303839633638643631393030303030303030303030633536323530303031316430303030303238383832343430323963353930393539333033386162313966323639393437633732306465333432336534393137393162343663376339326637363237396236303261656364653362336464653435303838393735356434646263653533386536623965396466613737303264313462356263356334333739653162333330333932303336633830663830363536323239396538666562363132313064303934613765313962666262623763373863336664343238323431613931393435653037643636303338306233336532373735613430666334643831323361353539653165613865313764396164306564643732396638616336636634393830386462313934336136)

However, this time it was enough to restart my test node, after which everything was back to normal at my main node:

      {
         "nodeid": "02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392",
         "alias": "myshtery",
         "color": "02aecd",
         "last_timestamp": 1695165792,
         "features": "080000000000000000000000000088a0880a0269a2",
         "addresses": [
            {
               "type": "ipv4",
               "address": "84.26.69.91",
               "port": 6837
            }
         ]
      },

So after the restart of my test node, the splicing worked perfectly! Of course, normally the other node (the test node here) isn't yours, so a restart isn't an option.

ddustin commented 1 year ago

Thanks for your reckless testing!

It's possible that the upcoming release will address your issue, as a good handful of fixes are in #6677. In either case, a regression test for this sparse listnodes result is definitely in order.

ddustin commented 1 year ago

I was able to recreate the listnodes problem on master, confirmed that it works on #6677, and added a regression test, test_splice_listnodes.

Likely this would already be caught by test_splice_gossip, but it seems like a good addition for extra thoroughness!
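
For the curious, this is roughly the shape such a splice regression test takes with pyln-testing (a sketch, not the merged test; the fee amount and block counts are illustrative):

    # Sketch of a splice + listnodes regression test (pyln-testing style);
    # not the exact test_splice_listnodes as merged.
    from pyln.testing.utils import wait_for

    def test_splice_listnodes(node_factory, bitcoind):
        l1, l2 = node_factory.line_graph(2, fundamount=1000000,
                                         wait_for_announce=True,
                                         opts={'experimental-splicing': None})
        chan_id = l1.get_channel_id(l2)

        # Fund and negotiate a 100k-sat splice-in.
        funds = l1.rpc.fundpsbt("109000sat", "slow", 166, excess_as_change=True)
        result = l1.rpc.splice_init(chan_id, 100000, funds['psbt'])
        result = l1.rpc.splice_update(chan_id, result['psbt'])
        result = l1.rpc.signpsbt(result['psbt'])
        l1.rpc.splice_signed(chan_id, result['signed_psbt'])

        # Confirm the splice and wait for both sides to leave
        # CHANNELD_AWAITING_SPLICE.
        bitcoind.generate_block(6, wait_for_mempool=1)
        l1.daemon.wait_for_log('to CHANNELD_NORMAL')
        l2.daemon.wait_for_log('to CHANNELD_NORMAL')

        # The regression: after the splice, each side should still see
        # two fully-announced nodes in listnodes (nodeid plus alias etc.).
        bitcoind.generate_block(7)
        wait_for(lambda: len(l1.rpc.listnodes()['nodes']) == 2)
        wait_for(lambda: len(l2.rpc.listnodes()['nodes']) == 2)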

hMsats commented 1 year ago

@ddustin I'd like to test splicing again but can't run master because of issue #6700. I'll test after that has been fixed.

hMsats commented 1 year ago

@ddustin I first did another trick in the software to get master going again, but then I ran the query manually, directly on the database, as you suggested (thanks!), and got master running again that way too. I get the impression the channel is stuck in "CHANNELD_AWAITING_SPLICE", since more than 30 confirmations have already passed.

This is from listpeerchannels:

          {
               "timestamp": "2023-09-21T17:16:41.062Z",
               "old_state": "CHANNELD_NORMAL",
               "new_state": "CHANNELD_AWAITING_SPLICE",
               "cause": "user",
               "message": "Broadcasting splice"
            }
         ],
         "status": [
            "CHANNELD_AWAITING_SPLICE:Bad node_signature 304402202fa04d8640d7c79cfeb0fbf9e9123ce0b67f2877c34998b7358d3ad9479aaa4402200ed3289e1d97c3324957ecc0a212d0f16cf2e9bff92bb907b4b288bafdd04c55 hash 3f6931e5a79c47db308b84653b1efbbce96bc7fd25a921d2942b618b9bdd4d16 on announcement_signatures 0100748610b1fbaeca5a7975657618512bf4d7e6ad398b0113a0874fdcb4e4e93b5827af04c12176289960230bb0a4fa5a3a1c14fef3bc36614d0b2174781d16f7ac2fa04d8640d7c79cfeb0fbf9e9123ce0b67f2877c34998b7358d3ad9479aaa440ed3289e1d97c3324957ecc0a212d0f16cf2e9bff92bb907b4b288bafdd04c5551d77a05e4e3ad3ad0a9abcc2687b966e69a4832ce5b5748db9e78ef5124f8d003c34bb8b5af8e17b90c09625f591e5dcf6537cb2b2df290c36e45e614604e521e1e508284de0754705b9adb6e071abe5b598a7595b1fc0d4f81bcaccdff67c958c278f5a6a4123e868a7dd29e1c8fd8cbe72bf833ed2f29f769af2d3d92883a00006fe28c0ab6f1b372c1a6a246ae63f74f931e8365e15a089c68d61900000000000c5720000141000002888244029c5909593038ab19f269947c720de3423e491791b46c7c92f76279b602aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392036c80f806562299e8feb61210d094a7e19bfbbb7c78c3fd428241a91945e07d660380b33e2775a40fc4d8123a559e1ea8e17d9ad0edd729f8ac6cf49808db1943a6",
            "CHANNELD_AWAITING_SPLICE:Will attempt reconnect in 300 seconds"
         ],
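
A quick way to spot channels in this situation (a sketch using pyln-client; the socket path is an example):

    # Sketch: list channels still waiting on a splice, with their status lines.
    from pyln.client import LightningRpc

    rpc = LightningRpc("/home/user/.lightning/bitcoin/lightning-rpc")

    for ch in rpc.listpeerchannels()["channels"]:
        if ch.get("state") == "CHANNELD_AWAITING_SPLICE":
            print(ch["peer_id"], ch.get("short_channel_id"))
            for line in ch.get("status", []):
                print("  ", line)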

I hope this isn't caused by the (small) software trick I first did in the runes code, but I don't think so.

I'll be sleeping and busy, but I hope this is useful :-)

hMsats commented 1 year ago

The problem is that my test node can connect to other nodes but not to my main node, where I get:

2023-09-22T05:04:04.308Z INFO    lightningd: Server started with public key 02aecde3b3dde450889755d4dbce538e6b9e9dfa7702d14b5bc5c4379e1b330392, alias myshtery (color #02aecd) and lightningd v23.08.1-134-g10eecea
lightningd: db/bindings.c:22: check_bind_pos: Assertion `pos < tal_count(stmt->bindings)' failed.
lightningd: FATAL SIGNAL 6 (version v23.08.1-134-g10eecea)
0x56392e282cbe send_backtrace
        common/daemon.c:33
0x56392e282d58 crashdump
        common/daemon.c:75
0x7fb5515c608f ???
        /build/glibc-SzIz7B/glibc-2.31/signal/../sysdeps/unix/sysv/linux/x86_64/sigaction.c:0
0x7fb5515c600b __GI_raise
        ../sysdeps/unix/sysv/linux/raise.c:51
0x7fb5515a5858 __GI_abort
        /build/glibc-SzIz7B/glibc-2.31/stdlib/abort.c:79
0x7fb5515a5728 __assert_fail_base
        /build/glibc-SzIz7B/glibc-2.31/assert/assert.c:92
0x7fb5515b6fd5 __GI___assert_fail
        /build/glibc-SzIz7B/glibc-2.31/assert/assert.c:101
0x56392e297974 check_bind_pos
        db/bindings.c:22
0x56392e297c0b db_bind_blob
        db/bindings.c:92
0x56392e297fcf db_bind_signature
        db/bindings.c:206
0x56392e26f5e8 wallet_htlc_sigs_add
        wallet/wallet.c:3872
0x56392e25186e peer_got_commitsig
        lightningd/peer_htlcs.c:2401
0x56392e214ed5 channel_msg
        lightningd/channel_control.c:1141
0x56392e25d4b2 sd_msg_read
        lightningd/subd.c:555
0x56392e3505bc next_plan
        ccan/ccan/io/io.c:59
0x56392e350a89 do_plan
        ccan/ccan/io/io.c:407
0x56392e350b26 io_ready
        ccan/ccan/io/io.c:417
0x56392e3524bb io_loop
        ccan/ccan/io/poll.c:453
0x56392e22adf0 io_loop_with_timers
        lightningd/io_loop_with_timers.c:22
0x56392e22fb86 main
        lightningd/lightningd.c:1328
0x7fb5515a7082 __libc_start_main
        ../csu/libc-start.c:308
0x56392e209fbd ???
        ???:0
0xffffffffffffffff ???
        ???:0
2023-09-22T05:06:19.044Z **BROKEN** lightningd: FATAL SIGNAL 6 (version v23.08.1-134-g10eecea)
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: common/daemon.c:38 (send_backtrace) 0x56392e282d06
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: common/daemon.c:75 (crashdump) 0x56392e282d58
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: /build/glibc-SzIz7B/glibc-2.31/signal/../sysdeps/unix/sysv/linux/x86_64/sigaction.c:0 ((null)) 0x7fb5515c608f
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: ../sysdeps/unix/sysv/linux/raise.c:51 (__GI_raise) 0x7fb5515c600b
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: /build/glibc-SzIz7B/glibc-2.31/stdlib/abort.c:79 (__GI_abort) 0x7fb5515a5858
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: /build/glibc-SzIz7B/glibc-2.31/assert/assert.c:92 (__assert_fail_base) 0x7fb5515a5728
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: /build/glibc-SzIz7B/glibc-2.31/assert/assert.c:101 (__GI___assert_fail) 0x7fb5515b6fd5
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: db/bindings.c:22 (check_bind_pos) 0x56392e297974
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: db/bindings.c:92 (db_bind_blob) 0x56392e297c0b
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: db/bindings.c:206 (db_bind_signature) 0x56392e297fcf
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: wallet/wallet.c:3872 (wallet_htlc_sigs_add) 0x56392e26f5e8
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: lightningd/peer_htlcs.c:2401 (peer_got_commitsig) 0x56392e25186e
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: lightningd/channel_control.c:1141 (channel_msg) 0x56392e214ed5
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: lightningd/subd.c:555 (sd_msg_read) 0x56392e25d4b2
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: ccan/ccan/io/io.c:59 (next_plan) 0x56392e3505bc
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: ccan/ccan/io/io.c:407 (do_plan) 0x56392e350a89
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: ccan/ccan/io/io.c:417 (io_ready) 0x56392e350b26
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: ccan/ccan/io/poll.c:453 (io_loop) 0x56392e3524bb
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: lightningd/io_loop_with_timers.c:22 (io_loop_with_timers) 0x56392e22adf0
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: lightningd/lightningd.c:1328 (main) 0x56392e22fb86
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: ../csu/libc-start.c:308 (__libc_start_main) 0x7fb5515a7082
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: (null):0 ((null)) 0x56392e209fbd
2023-09-22T05:06:19.044Z **BROKEN** lightningd: backtrace: (null):0 ((null)) 0xffffffffffffffff
Log dumped in crash.log.20230922050619
Lost connection to the RPC socket.

hMsats commented 1 year ago

@ddustin I can't connect from my test node to other nodes without crashing, so I think my test node's database is corrupt. I will close and remove my test node and start from scratch. I won't be able to test anything for some time, so just ignore what I wrote recently. At least it isn't my main node :-)

hMsats commented 1 year ago

@ddustin I think I found the source of all my problems in the source code. See my pull request :-)

hMsats commented 1 year ago

@ddustin After adding the extra ?, everything ran perfectly. I've never seen anything like it; what a difference. Splicing worked, including listnodes, without any problems. Thanks for all your hard work on this extremely difficult topic and for your suggestions :1st_place_medal:. So I'm closing this closed issue :smile:
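
For context, as I understand it the crash above happened because the SQL statement in wallet_htlc_sigs_add had one fewer ? placeholder than the number of values bound to it; the bindings array is sized from the placeholder count, so the extra bind tripped the check_bind_pos assertion. The same class of mismatch is easy to demonstrate (a Python sqlite3 illustration only, not CLN code; the column names are made up):

    # Illustration of the bug class: binding more values than there are
    # "?" placeholders fails at bind time (Python sqlite3, not CLN code).
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE htlc_sigs (channelid INTEGER, inflight_tx_id BLOB, signature BLOB)")

    try:
        # The statement has two "?" placeholders but three values are
        # bound -- the same shape of mismatch that asserted in CLN.
        conn.execute("INSERT INTO htlc_sigs (channelid, signature) VALUES (?, ?)",
                     (1, b"txid", b"sig"))
    except sqlite3.ProgrammingError as e:
        print("bind mismatch:", e)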

ddustin commented 1 year ago

That's awesome! Thanks for figuring that out! 🔥