Currently, a few parts are somewhat memory hungry, though memory use scales reasonably with data size and functionality. Still, there's definitely room for improvement.
A few in-progress issues will significantly improve how memory is used without requiring targeted optimization; possibly there will be nothing left to optimize after those land.
Current memory situation
Base amount
The base amount of memory used depends on the amount of CSS (and to some degree its structure). On a site with a reasonable amount of CSS this base amount is already quite low, under 100MB (some sites as low as 20 to 30MB), which is normal for a complex editor UI holding a lot of data. This also includes memory used by the page itself.
However, it's quite common for sites to ship an unreasonable amount of CSS, which quickly pushes this number into the hundreds of MB. This is expected to improve with the new parser, which duplicates far fewer strings. (String interning alone doesn't help much, because a lot of strings are concatenated/manipulated in inefficient ways.)
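To illustrate the kind of duplication involved, here is a minimal sketch of sharing one string instance per distinct parsed value through a lookup table. The `dedupe` helper, `Declaration` shape, and `parseDeclaration` are hypothetical, not the actual new parser, and as noted above this only helps as long as the deduped references aren't immediately concatenated or sliced into new strings downstream.

```ts
// Hypothetical sketch: share one string instance per distinct CSS token value,
// so thousands of identical "display: flex" declarations don't each hold their own copy.
const stringTable = new Map<string, string>();

function dedupe(value: string): string {
  const existing = stringTable.get(value);
  if (existing !== undefined) return existing;
  stringTable.set(value, value);
  return value;
}

interface Declaration {
  property: string;
  value: string;
}

// Only effective if the parser stores the deduped reference directly; once a
// value is concatenated or manipulated into a new string, the sharing is lost,
// which is why interning alone doesn't buy much here.
function parseDeclaration(raw: string): Declaration {
  const [property, value] = raw.split(":").map((part) => part.trim());
  return { property: dedupe(property), value: dedupe(value) };
}
```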
(semi) leaks
There are a few cases where memory leaks a bit, though arguably they sit somewhere between a true leak and intentional retention. For instance, previous inspection results are never cleared from memory; in theory they should stay reachable until you clear the history. After clearing, the array of previous results is still allocated but de facto unreachable through normal interactions.
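A minimal sketch of what releasing those results could look like, assuming a simple history store (the `InspectionHistory` and `InspectionResult` names are hypothetical): clearing replaces the backing array so the old entries become eligible for GC instead of lingering.

```ts
// Hypothetical sketch: a history store that actually drops its references on clear.
interface InspectionResult {
  selector: string;
  matchedRules: string[];
}

class InspectionHistory {
  private results: InspectionResult[] = [];

  push(result: InspectionResult): void {
    this.results.push(result);
  }

  get(index: number): InspectionResult | undefined {
    return this.results[index];
  }

  clear(): void {
    // Replacing the array (or setting length = 0) releases the references,
    // letting the previous results be garbage collected.
    this.results = [];
  }
}
```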
It's also possible that elements disappear from the page, in which case their inspection state should become eligible for GC. Then again, at the moment you can still reach that state through the history. This case could perhaps later be handled by keeping track of all DOM changes in the frame and being able to rewind them, as in the sketch below.
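A minimal sketch of that idea, assuming a MutationObserver-based recorder running in the inspected frame; the record shape and the rewind strategy are illustrative, not an existing implementation.

```ts
// Hypothetical sketch: record DOM mutations so removed elements could later be
// restored (rewound) instead of holding live references to them indefinitely.
type RecordedMutation = {
  type: MutationRecordType;
  target: Node;
  removedNodes: Node[];
  oldValue: string | null;
};

const recorded: RecordedMutation[] = [];

const observer = new MutationObserver((mutations) => {
  for (const m of mutations) {
    recorded.push({
      type: m.type,
      target: m.target,
      removedNodes: Array.from(m.removedNodes),
      oldValue: m.oldValue,
    });
  }
});

// Old attribute/text values are needed to be able to rewind changes later.
observer.observe(document.documentElement, {
  subtree: true,
  childList: true,
  attributes: true,
  attributeOldValue: true,
  characterData: true,
  characterDataOldValue: true,
});
```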
Avoidable creation of garbage
On this front the situation is already quite good: few event listeners or pieces of memoized data have to be recreated on renders, which is a common source of performance problems in React apps. It's uncommon for GC to even happen during JS execution; most of the time the browser has plenty of room to run it in between without interrupting anything.
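As a small illustration of the pattern this refers to (the component and its props are hypothetical), `useMemo` and `useCallback` keep derived data and listeners stable across renders instead of reallocating them every time:

```tsx
import { useCallback, useEffect, useMemo } from "react";

// Hypothetical sketch: stable references avoid recreating listeners and derived
// data on every render, which would otherwise generate avoidable garbage.
export function InspectedRules({ rules }: { rules: string[] }) {
  // Recomputed only when `rules` changes, not on every render.
  const sortedRules = useMemo(() => [...rules].sort(), [rules]);

  // Same function instance across renders, so the listener below is not
  // removed, re-added and re-allocated each time the component renders.
  const onKeyDown = useCallback((event: KeyboardEvent) => {
    if (event.key === "Escape") {
      console.log("close inspector");
    }
  }, []);

  useEffect(() => {
    window.addEventListener("keydown", onKeyDown);
    return () => window.removeEventListener("keydown", onKeyDown);
  }, [onKeyDown]);

  return (
    <ul>
      {sortedRules.map((rule) => (
        <li key={rule}>{rule}</li>
      ))}
    </ul>
  );
}
```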
There are a few interactions that incur more frequent GC interruptions, but none are concerning:
- Finding references in other variables.
- Adding an alias: this temporarily loads a large list of color mappings, which could probably be held in memory instead (see the sketch after this list). This might be inherent to the library that is used.
- Inspection on pages with a lot of CSS or a deep tree.
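For the alias case above, a minimal sketch of the "hold it in memory" idea, assuming a hypothetical `loadColorMappings` source; whether this is compatible with the library actually in use is an open question.

```ts
// Hypothetical sketch: load the color mapping list once and keep it resident,
// instead of re-allocating it every time an alias is added.
type ColorMappings = Map<string, string>;

let cachedMappings: ColorMappings | null = null;

async function loadColorMappings(): Promise<ColorMappings> {
  // Placeholder for however the mappings are actually produced.
  const response = await fetch("/color-mappings.json");
  const entries: [string, string][] = await response.json();
  return new Map(entries);
}

export async function getColorMappings(): Promise<ColorMappings> {
  if (cachedMappings === null) {
    cachedMappings = await loadColorMappings();
  }
  return cachedMappings;
}
```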