Open PeterFarber opened 1 month ago
I was recently analysing the memory dump of our token process and was kind of surprised that it is almost 90 MB, even though it effectively holds only something like a map of ~100 balances. BUT we've changed its source code several times (if I understand correctly, `aos` `load` sends an `Eval` interaction to the AOS process, which causes Lua's `load` to interpret the new code), so maybe it has something to do with loading/updating the process' code?
Another example: our oracle process (https://cu.ao-testnet.xyz/state/fev8nSrdplynxom78XaQ65jSo7-88RxVVVPwHG8ffZk) is 50 MB.
This is NOT an AOS process, but a process built with `dev-cli`
from this source code: https://github.com/warp-contracts/ao-redstone-oracle/blob/main/redstone-oracle-process/process.lua
Quickly reviewing the contents of the dump, it seems that roughly 60% is source code and the rest is the state of the process (in this case some JSONs stored in a Lua table).
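One way to sanity-check where the memory goes from inside the process itself: Lua's built-in `collectgarbage` can report the heap size and force a full collection cycle. A minimal sketch (run via an `Eval` against the process; the formatting is illustrative):

```lua
-- Report the Lua heap size before and after a full GC cycle.
-- collectgarbage("count") returns the heap size in kilobytes.
local before = collectgarbage("count")
collectgarbage("collect")  -- force a full garbage-collection cycle
local after = collectgarbage("count")
print(string.format("heap: %.1f MB before GC, %.1f MB after GC",
  before / 1024, after / 1024))
```

If the reported heap is small compared to the dump size, the bulk of the growth is likely sitting in Wasm linear memory that was grown but never returned, rather than in live Lua objects.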
Yeah, it looks like most of the dump is returned memory that is never being freed. Do you have Discord? I would like to review your findings. @ppedziwiatr
We have a dedicated Slack channel with you guys - https://redstone-ujk1058.slack.com/archives/C06NG8C2CNR - so maybe drop us a message there?
Description
When running wasm64 (WebAssembly 64-bit) modules in our application, we've encountered an issue where memory usage steadily increases over time. The growth occurs unpredictably and appears to be independent of any specific module function or operation.
Expected Behavior
Memory usage should remain stable or increase within expected limits as defined by the application's memory management and garbage collection routines.
Actual Behavior
Memory consumption grows continuously, eventually leading to performance degradation and potential crashes due to memory exhaustion.
Steps to Reproduce (uncertain)
Unfortunately, the unexpected wasm64 memory growth appears sporadically, making it difficult to provide precise reproduction steps.
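Since the growth is sporadic, one low-effort way to narrow it down is to log the Lua heap size on every incoming message, so spikes can be correlated with specific interactions. A sketch assuming the AOS `Handlers.add` API (the handler name is illustrative; the pattern returns `"continue"` so later handlers still run):

```lua
-- Log Lua heap usage on every incoming message so memory spikes
-- can be correlated with specific interactions.
Handlers.add(
  "memory-probe",
  function(msg) return "continue" end,  -- match everything, keep dispatching
  function(msg)
    local kb = collectgarbage("count")  -- Lua heap size in kilobytes
    print(string.format("[%s] action=%s heap=%.1f MB",
      msg.Id or "?", msg.Action or "?", kb / 1024))
  end
)
```

If the logged Lua heap stays flat while the process dump keeps growing, that would point at Wasm linear memory (or retained `Eval` source) rather than application state.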
Impact
This issue severely impacts the stability and performance of our application when running wasm64 modules, especially in long-running sessions.
Proposed Solution
Investigate and address the root cause of the memory growth in wasm64 modules. This may involve: