ddanilyuk opened this issue 9 months ago (status: Open)
@ddanilyuk Thanks for looking into this! Looks like we indeed have an issue with reclaiming cached stores at the moment. Brandon and I will discuss potential solutions soon!
@ddanilyuk We chatted about this today. To zoom out a bit: starting with version 1.5 of the library, we introduced features that required changes to internals affecting the performance characteristics of the TCA runtime. These changes can regress the performance of some TCA apps on versions 1.5 through 1.8. While we have concrete solutions to these regressions planned for 2.0, it will take a little time to get there, and in the meantime we need to weigh the trade-offs of which issues to address and how to temporarily address them.
Our main concern is whether you've encountered a real-world problem here, or whether it's theoretical based on the memory growth. If this memory issue doesn't adversely affect most applications, we may play it safe for now and accept the bloat temporarily, with a fix planned for 2.0. Alternatively, we may explore introducing algorithms to prune these invalidated stores, but we have found that we must be careful here, as the invalidation logic can cause performance issues of its own. Note that folks adversely affected by these performance characteristics may wish to stick with TCA < 1.5 for now.
I think ideally we keep our sights on TCA 2.0 and move quickly to it so that these changes in performance can be fully addressed. In the meantime, we should try to measure the trade-offs in applications seeing issues and best evaluate the fixes we apply pre-2.0.
@stephencelis: This makes me extremely hesitant to upgrade our 1.4 app (as we were about to) until 2.0 hits. Is it possible to be more explicit about what exactly is being leaked and under what circumstances, so we can guesstimate whether we're likely to hit this before we spend a bunch of time on the upgrade?
@stephencelis I think this problem is a bit theoretical for me. In my main project, there aren't many IDs that change in lists during a user session, so it shouldn't cause significant memory issues. However, I'm unsure how scoping performance will degrade as the number of stores continues to grow.
Have you considered manually cleaning up the stores (or doing so with a feature dismiss)? If not, why not?
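To make the question concrete, here is a minimal sketch of one reading of "manual cleanup with a feature dismiss": the parent clears the list state, rows included, when the child signals it is done. All names here (ParentFeature, ListFeature, RowState, doneButtonTapped) are hypothetical and not taken from the linked repro project.

```swift
import ComposableArchitecture
import Foundation

// Sketch only: the row model and list feature stand in for whatever the real
// app uses; the point is where the cleanup would live in the parent reducer.
struct RowState: Equatable, Identifiable {
  let id: UUID
}

@Reducer
struct ListFeature {
  @ObservableState
  struct State: Equatable {
    var rows: IdentifiedArrayOf<RowState> = []
  }
  enum Action { case doneButtonTapped }
  var body: some ReducerOf<Self> {
    Reduce { state, action in
      switch action {
      case .doneButtonTapped:
        return .none
      }
    }
  }
}

@Reducer
struct ParentFeature {
  @ObservableState
  struct State: Equatable {
    @Presents var list: ListFeature.State?
  }
  enum Action {
    case list(PresentationAction<ListFeature.Action>)
  }
  var body: some ReducerOf<Self> {
    Reduce { state, action in
      switch action {
      case .list(.presented(.doneButtonTapped)):
        // Explicitly drop the rows, then dismiss by clearing the child state,
        // so nothing in the parent's state keeps the row data alive.
        state.list?.rows.removeAll()
        state.list = nil
        return .none
      case .list:
        return .none
      }
    }
    .ifLet(\.$list, action: \.list) { ListFeature() }
  }
}
```

Whether this actually releases the cached row stores on 1.8 is exactly what the issue is asking about; the sketch only shows where such manual cleanup would go.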
Description
Hi! I found an issue (I think) with scoping stores for lists.
Here is an example of a memory graph after opening and closing a feature with a List 10 times. After all that opening and closing, we have 70 cached stores for rows that are no longer presented. We shouldn't have any of them, since the feature is no longer presented, should we?
This could become a performance issue in large-scale apps.
Do we need a cleanup mechanism for these stores when the presented view is dismissed? Or is there a way to opt out of caching when using lists?
Repo with issue: [link]
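For context, the setup described above has roughly the following shape (a minimal sketch with hypothetical names; it mirrors the structure of the repro, not its exact code): a parent feature presents a list feature whose rows are scoped with forEach, and each present/dismiss cycle creates rows with fresh IDs.

```swift
import ComposableArchitecture
import Foundation
import SwiftUI

// Row reducer: one store per visible row is derived from the list's state.
@Reducer
struct RowFeature {
  @ObservableState
  struct State: Equatable, Identifiable {
    let id: UUID
    var title = ""
  }
  enum Action { case tapped }
  var body: some ReducerOf<Self> {
    Reduce { state, action in
      switch action {
      case .tapped: return .none
      }
    }
  }
}

// List feature holding the identified collection of rows.
@Reducer
struct ListFeature {
  @ObservableState
  struct State: Equatable {
    var rows: IdentifiedArrayOf<RowFeature.State> = []
  }
  enum Action {
    case rows(IdentifiedActionOf<RowFeature>)
  }
  var body: some ReducerOf<Self> {
    Reduce { state, action in
      switch action {
      case .rows: return .none
      }
    }
    .forEach(\.rows, action: \.rows) { RowFeature() }
  }
}

// Parent feature that presents and dismisses the list repeatedly.
@Reducer
struct ParentFeature {
  @ObservableState
  struct State: Equatable {
    @Presents var list: ListFeature.State?
  }
  enum Action {
    case presentButtonTapped
    case list(PresentationAction<ListFeature.Action>)
  }
  var body: some ReducerOf<Self> {
    Reduce { state, action in
      switch action {
      case .presentButtonTapped:
        // Each presentation creates rows with fresh IDs; after dismissal the
        // scoped row stores appear to stay cached, which is the growth seen
        // in the memory graph.
        state.list = ListFeature.State(
          rows: IdentifiedArray(
            uniqueElements: (1...7).map { RowFeature.State(id: UUID(), title: "Row \($0)") }
          )
        )
        return .none
      case .list:
        return .none
      }
    }
    .ifLet(\.$list, action: \.list) { ListFeature() }
  }
}

struct ListFeatureView: View {
  let store: StoreOf<ListFeature>
  var body: some View {
    List {
      ForEach(store.scope(state: \.rows, action: \.rows)) { rowStore in
        Text(rowStore.title)
      }
    }
  }
}

struct ParentFeatureView: View {
  @Bindable var store: StoreOf<ParentFeature>
  var body: some View {
    Button("Present list") { store.send(.presentButtonTapped) }
      .sheet(item: $store.scope(state: \.list, action: \.list)) { listStore in
        ListFeatureView(store: listStore)
      }
  }
}
```

With a shape like this, repeatedly presenting and dismissing the sheet is where the accumulation of cached row stores shows up in the memory graph.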
Checklist
- If possible, I've reproduced the issue using the main branch of this package.
Expected behavior
No response
Actual behavior
No response
Steps to reproduce
No response
The Composable Architecture version information
1.8.0
Destination operating system
iOS 17
Xcode version information
15.2
Swift Compiler version information
No response