Open andynuss opened 3 months ago
Most of Time Travel was in SecretAgent too, but there are some small parts that might be making things worse, and/or some pages that could be causing issues when there are a ton of DOM updates. We don't have controls yet to turn Time Travel off or turn it down, but doing something about it seems to be a high priority. Thanks for reporting this.
I am asking a question here about a broad topic, because of something quite noticeable in my scraping sessions as compared to the old secret-agent codebase and this new one, and one big change that I have made:
I am going to quickly make changes to introduce some self-imposed yielding. And I am going to get less aggressive with my one big change to decide when a page has "loaded": instead of basing it on a dial-down in resource activity, as I used to, I am basing it on a heuristic of percent change to the DOM. But this involves repeated injections to get the DOM (I didn't think that would be expensive) and a moderately CPU-expensive approach to compare the two DOMs.
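For concreteness, here is a rough sketch of what I mean by a percent-change heuristic. This is not Hero's API and not my actual code, just an illustration under the assumption that each injection serializes the page (e.g. `document.documentElement.outerHTML`) and that successive snapshots are compared cheaply by tag counts rather than a full tree diff:

```typescript
// Hypothetical helper: count element tags in a serialized DOM snapshot.
// A real implementation might hash subtrees instead; tag counts are just
// a cheap proxy for "how much the DOM changed".
function tagCounts(html: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const m of html.matchAll(/<([a-zA-Z][\w-]*)/g)) {
    const tag = m[1].toLowerCase();
    counts.set(tag, (counts.get(tag) ?? 0) + 1);
  }
  return counts;
}

// Percent change between two snapshots: summed per-tag count deltas
// over the larger total, in the range 0..100.
function percentDomChange(prevHtml: string, nextHtml: string): number {
  const prev = tagCounts(prevHtml);
  const next = tagCounts(nextHtml);
  let delta = 0;
  let total = 0;
  for (const tag of new Set([...prev.keys(), ...next.keys()])) {
    const a = prev.get(tag) ?? 0;
    const b = next.get(tag) ?? 0;
    delta += Math.abs(a - b);
    total += Math.max(a, b);
  }
  return total === 0 ? 0 : (100 * delta) / total;
}
```

The "loaded" decision is then: poll on an interval and declare the page settled once `percentDomChange` between consecutive snapshots stays below some threshold (say 1%) for a couple of polls. The repeated serialization and comparison is exactly the part I suspect is adding CPU load.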
This combination should ease the CPU load back into an acceptable range, but I was wondering about changes in the Hero codebase vs. SecretAgent that might involve more CPU work going on than before, even for features we don't use. Specifically, the "TimeTravel" feature?
Are there knobs (existing, or that could be added) that would let us disable features like this when they consume significant CPU? Aside from the immediate issue I described above, a "bare-metal" use of devtools has to be better for mitigating bot detection, right?