jkomoros / card-web

The web app behind thecompendium.cards
Apache License 2.0

Reduce memory use #659

Open jkomoros opened 1 year ago

jkomoros commented 1 year ago

Now that there are ~8k cards, we're hitting the memory limits again in some limited circumstances.

It should be possible to throw out all of the intermediate processed text in ProcessedRuns and only re-extract it when a card becomes active?
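A minimal sketch of what that could look like, assuming ProcessedRuns is roughly a per-card collection of extracted text (the real type and extraction logic live in the repo; `extractRuns` here is a hypothetical stand-in): keep processed runs only for the active card and recompute on demand.

```ts
// Hypothetical sketch only: ProcessedRuns' real shape lives in the repo; here
// it's just a list of extracted text runs, and extractRuns() stands in for
// whatever currently produces the intermediate processed text.
type ProcessedRuns = string[];

interface Card {
	id: string;
	body: string;
}

const extractRuns = (card: Card): ProcessedRuns =>
	card.body.split('\n').filter(line => line.trim().length > 0);

// Keep processed runs only for the currently active card. Everything else is
// recomputed on demand, so the other ~8k cards never hold intermediate text.
const runsCache = new Map<string, ProcessedRuns>();

export const processedRunsForCard = (card: Card): ProcessedRuns => {
	let runs = runsCache.get(card.id);
	if (!runs) {
		runs = extractRuns(card);
		runsCache.set(card.id, runs);
	}
	return runs;
};

// Call when the active card changes; evict everything else so the cache stays
// O(1) in the number of cards instead of O(n).
export const noteActiveCard = (activeCardId: string): void => {
	for (const id of [...runsCache.keys()]) {
		if (id !== activeCardId) runsCache.delete(id);
	}
};
```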

jkomoros commented 1 year ago

The crashes happen maybe half of the time when devtools is open and the page is refreshed.

The largest strings when this happens are retained with retainer traces like:


    "[[1,[{ "targetChange": { "targetChangeType": "ADD", "targetIds": [ 2 ] } } ]],[2,[{ "targetChange": { "targetChangeType": "ADD", "targetIds": [ 4 ] } } ]], …" @7917 — retained size ≈71.8 MB (14% of heap)
      ← responseText in qd @7657 — retained ≈71.8 MB (14%)
      ← 0 in system / CallSiteInfo @18029173
      ← 0 in (internal array)[] @18029171
      ← 0 in system / ErrorStackData @18029163
      ← <symbol> in Type

It appears that part of the challenge is that very large strings coming back from Firebase are being retained: lots of substrings have been cut out of them (and presumably keep the parent string alive), so the huge strings are never released?

A few random ideas:

1. Make a copy of card objects right when they come back from Firestore, so they're all actual fresh, smaller strings and not substrings of one large string? (see the sketch below)
2. Have Firebase chunk up large requests for cards into smaller individual requests, so there are no huge strings like that to retain in the first place?
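A rough sketch of idea (1), under the assumption that the retention is V8's sliced-string behavior (substrings keeping their giant parent string alive): rebuild card data into brand-new strings the moment it comes out of a snapshot, e.g. via a JSON round-trip. `Card` below is a stand-in for the repo's real card type, and `freshCardFromSnapshot` is a hypothetical helper, not existing code.

```ts
import { QueryDocumentSnapshot, DocumentData } from 'firebase/firestore';

// Stand-in for the repo's real card type.
type Card = DocumentData & { id: string };

// Hypothetical helper: rebuild the card via a JSON round-trip so every string
// field is a brand-new flat string with no hidden reference back to the large
// Firestore response text it was sliced out of. Note: non-plain values such as
// Timestamps would need special handling before/after the round-trip.
export const freshCardFromSnapshot = (snapshot: QueryDocumentSnapshot): Card => {
	const data = JSON.parse(JSON.stringify(snapshot.data())) as DocumentData;
	return { ...data, id: snapshot.id };
};
```

Idea (2) would instead be handled on the query side (fetching cards in smaller batches), which keeps each response string small to begin with rather than copying out of one huge one.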

Trying to set memoize(entries=1) by default doesn't fix the OOM (although it also doesn't seem to reduce the hit rate very much, and FingerprintGenerator continues to be the most expensive calculation).
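For reference, a generic sketch of what capping memoization entries means (the repo's own memoize() helper has its own signature; this is illustrative only): with maxEntries = 1 only the most recent result is kept, trading recomputation of expensive work like FingerprintGenerator for a smaller cache.

```ts
// Illustrative bounded memoizer, not the repo's actual helper. With
// maxEntries = 1 only the most recently computed result stays in memory.
export const boundedMemoize = <A extends unknown[], R>(
	fn: (...args: A) => R,
	maxEntries = 1
): ((...args: A) => R) => {
	const cache = new Map<string, R>();
	return (...args: A): R => {
		const key = JSON.stringify(args);
		if (cache.has(key)) return cache.get(key) as R;
		const result = fn(...args);
		cache.set(key, result);
		// Evict the oldest entry (Map preserves insertion order) once over budget.
		if (cache.size > maxEntries) {
			const oldest = cache.keys().next().value as string;
			cache.delete(oldest);
		}
		return result;
	};
};
```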

jkomoros commented 1 year ago

https://github.com/firebase/firebase-js-sdk/issues/6118 seems related. So does https://github.com/firebase/firebase-js-sdk/issues/4416, specifically https://github.com/firebase/firebase-js-sdk/issues/4416#issuecomment-788225325.