Closed BimsaraFernando closed 2 weeks ago
I suspect it was returned because it was at max pages, but the pages were mostly empty? Going to look into it more as soon as I can. Use FH to inspect directory nodes. https://richardah.github.io/xrpl-keylet-tools/
Many of the pages have only 11 entries, not the full 32.
It's caused by constant deleting and adding state to a namespace. Look at the first directory page:
```json
{
  "result": {
    "index": "EBAFA545CDF3D15FB785F75CB57B1DF116F3252B42B96041377B02A9AC789D52",
    "ledger_hash": "476029E022F6F7A46972A540284A23AAA91E1FDC33D428E4FF7367A73E72833E",
    "ledger_index": 8248527,
    "node": {
      "Flags": 0,
      "IndexNext": "3d101",
      "IndexPrevious": "3ffff",
      "Indexes": [],
      "LedgerEntryType": "DirectoryNode",
      "Owner": "rsfTBRAbD2bYjVuXhJ2RReQXxR4K5birVW",
      "RootIndex": "EBAFA545CDF3D15FB785F75CB57B1DF116F3252B42B96041377B02A9AC789D52",
      "index": "EBAFA545CDF3D15FB785F75CB57B1DF116F3252B42B96041377B02A9AC789D52"
    },
    "validated": true
  },
  "status": "success",
  "type": "response"
}
```
Notice that the next page is not index 1, but rather index 250113 (`IndexNext` is `3d101` in hex). This is because all of the intervening pages have since been deleted, and the doubly linked list now goes straight from page 0 (which can't be deleted unless the directory is completely empty) to page 250113.
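The churn effect can be illustrated with a toy model (Python, my own heavily simplified sketch, assuming 32 entries per page, empty non-root pages deleted on the spot, and ignoring rippled's attempt to reuse the root page — not the actual xahaud data structures):

```python
# Toy model of an owner directory: numbered pages of up to 32 entries.
# Deleting an emptied page removes its number from the directory but
# never renumbers the survivors, so heavy create/delete churn leaves a
# sparse list whose page numbers run far ahead of the live page count.
PAGE_CAPACITY = 32  # entries per DirectoryNode page

class Directory:
    def __init__(self):
        # page number -> list of entries; page 0 (the root) always exists
        self.pages = {0: []}

    def add(self, entry):
        last = max(self.pages)
        if len(self.pages[last]) >= PAGE_CAPACITY:
            last += 1              # new page gets the next number up;
            self.pages[last] = []  # earlier numbers are never reused
        self.pages[last].append(entry)

    def remove(self, entry):
        for n, page in list(self.pages.items()):
            if entry in page:
                page.remove(entry)
                # an emptied non-root page is deleted, leaving a gap
                # in the numbering; the root page is never deleted
                if not page and n != 0:
                    del self.pages[n]
                return

d = Directory()
for i in range(10_000):            # create lots of state entries...
    d.add(i)
for i in range(10_000 - 11):       # ...then delete all but 11 of them
    d.remove(i)

print(sorted(d.pages))             # [0, 312] — page 0 links straight
                                   # to a high-numbered page, as above
```

The surviving high page holds just the 11 leftover entries, mirroring the mostly empty 11-entry pages observed in this directory.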
The rippled code only checks the page index number, not how many pages actually exist: https://github.com/Xahau/xahaud/blob/833df20fce95493c72c161e44415b3f448351c86/src/ripple/ledger/impl/ApplyView.cpp#L95
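Paraphrasing the guard in Python (a sketch of the behaviour, not the literal C++; the cap of 262144 pages is my assumption, consistent with the `IndexPrevious` of `3ffff` seen above being the last valid page number):

```python
# Sketch of the buggy guard: allocation is refused once the NEXT page
# number would reach the cap, regardless of how many pages are live.
DIR_NODE_MAX_PAGES = 262144  # 2**18; last valid page number is 0x3FFFF

def can_allocate_page(last_page_number: int) -> bool:
    # checks only the monotonically increasing counter,
    # not the number of pages that actually exist
    return last_page_number + 1 < DIR_NODE_MAX_PAGES

# After heavy create/delete churn the counter can sit at the cap even
# though the directory holds only a handful of mostly empty pages:
print(can_allocate_page(0x3FFFF))  # False — new state creation fails
```

A count of live pages (or, as suggested below, no cap at all) would not trip here, since the directory in this issue holds nowhere near 262144 actual pages.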
In other words, this is a pre-existing rippled bug that we've just stumbled upon due to heavy state use. There are a few ways to fix it, but for now I will advise the Evernode programmers to periodically rotate namespaces.
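One way the suggested workaround could look (a hypothetical rotation scheme of my own, not Evernode's actual code): derive the hook-state namespace from an epoch number, so each epoch writes into a fresh namespace whose directory starts with its page counter at zero.

```python
import hashlib

# Hypothetical namespace rotation: a 32-byte (256-bit) namespace derived
# from a base seed plus an epoch index, rotated every N ledgers so no
# single namespace's directory accumulates enough churn to hit the cap.
LEDGERS_PER_EPOCH = 100_000  # assumed rotation interval

def namespace_for_epoch(base: bytes, epoch: int) -> bytes:
    return hashlib.sha256(base + epoch.to_bytes(8, "big")).digest()

def current_namespace(base: bytes, ledger_seq: int) -> bytes:
    return namespace_for_epoch(base, ledger_seq // LEDGERS_PER_EPOCH)
```

The old namespace can then be cleaned up in the background (e.g. with `hsfNSDELETE`, as described in the issue body below) without blocking new state creation.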
Nice find 🫨
I think there's no reason to have this condition. The sfIndex fields are uint64s, so they can continue to increment essentially indefinitely. The number of ledgers it would take to perform 32 * 2**64 hook state creates and deletes would very likely exceed the lifespan of the human race.
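To put a rough number on that (my own back-of-the-envelope arithmetic, assuming an implausibly fast sustained churn rate of 1,000 page allocations per second):

```python
# sfIndexNext / sfIndexPrevious are uint64 page numbers, and each page
# holds up to 32 entries, so exhausting the numbering takes on the
# order of 32 * 2**64 create/delete operations.
pages = 2**64

# assume 1,000 new pages allocated per second, forever
seconds = pages / 1_000
years = seconds / (365.25 * 24 * 3600)
print(f"{years:.2e} years")  # ~5.85e8 years just to run out of page numbers
```

Roughly 580 million years at that rate, before even counting the factor of 32 for entries per page, so overflow of the counter is not a practical concern.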
Issue Description
Steps to Reproduce
When there are more than 45k hook states, the Xahaud node prints the max directory limit error.
Expected Result
Hook states should be created, since the maximum directory limit is far greater than 45k states.
Actual Result
xahaud node logs show the following error.
But when the hook state count was checked at 2024-Sep-03 15:07:31.552141610 UTC, it showed only 45248 states.

The states were cleared using the `state_set` method with the `hsfNSDELETE` flag. It keeps returning `tefPARTIAL`, clearing the states 255 by 255, and then `tesSUCCESS` once the states have been removed from the namespace.

Environment
Xahaud mainnet Account - rsfTBRAbD2bYjVuXhJ2RReQXxR4K5birVW
Supporting Files
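As a supporting sketch, the `tefPARTIAL` clearing behaviour described above can be simulated (toy Python model of my own, not real Xahau client code; batch size of 255 taken from the report):

```python
# Toy simulation of clearing a hook-state namespace via repeated
# hsfNSDELETE invocations: each pass deletes at most 255 states and
# reports tefPARTIAL until the namespace is empty, then tesSUCCESS.
BATCH = 255

def ns_delete(namespace: list) -> str:
    """Delete up to BATCH states; tefPARTIAL while any remain."""
    del namespace[:BATCH]
    return "tesSUCCESS" if not namespace else "tefPARTIAL"

states = list(range(45_248))  # the 45,248 states reported in this issue
passes = 0
result = "tefPARTIAL"
while result == "tefPARTIAL":
    result = ns_delete(states)
    passes += 1

print(result, passes)  # tesSUCCESS 178 (= ceil(45248 / 255) passes)
```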