Closed zw963 closed 2 years ago
BTW, another question: I saw some methods like browser.reset and browser.network.clear(:traffic). Should any of those methods be invoked to release memory before calling goto 'some_url' again in the retry clause?
@zw963 you can try playing with the options of the wait_for_idle method:
wait_for_idle(connections: 0, duration: 0.05, timeout: @page.browser.timeout)
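As a rough illustration of what those three options mean, here is a simplified pure-Ruby sketch of the semantics (not Ferrum's actual implementation; the `pending` callable and `IdleTimeout` class are hypothetical stand-ins): wait until the pending-connection count stays at or below `connections` for `duration` seconds, or give up after `timeout` seconds.

```ruby
# Simplified sketch of wait_for_idle semantics (NOT Ferrum's real code):
# block until the number of pending connections stays at or below
# `connections` for `duration` seconds, or raise once `timeout` elapses.
class IdleTimeout < StandardError; end

def wait_for_idle(pending, connections: 0, duration: 0.05, timeout: 1.0)
  deadline = Time.now + timeout
  idle_since = nil
  loop do
    if pending.call <= connections
      idle_since ||= Time.now
      # Idle long enough without new traffic: done.
      return true if Time.now - idle_since >= duration
    else
      # New traffic appeared; restart the idle window.
      idle_since = nil
    end
    raise IdleTimeout, "still busy after #{timeout}s" if Time.now > deadline
    sleep 0.01
  end
end

# Simulate traffic that drains: three pending requests, one finishing
# per poll.
remaining = 3
drain = -> { remaining -= 1 if remaining > 0; remaining }

ok = wait_for_idle(drain, connections: 0, duration: 0.05, timeout: 1.0)
```

Raising `connections` or shortening `timeout` makes the wait more tolerant of pages that never go fully quiet (long polling, analytics beacons, and the like).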
> browser.reset, browser.network.clear(:traffic), if any of those methods should be invoked to release memory
Sort of, it could be; however, I don't think your case is related to a lack of memory.
browser.reset
is for closing browser tabs, so if you have many open tabs (possibly opened automatically by an on-page script), that could be the reason for many connections.
browser.network.clear(:traffic)
is for clearing the browser's cache or collected traffic, but in this case we get the error right on the next visit to the same URL, so it shouldn't be related to memory exhaustion caused by traffic.
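A hedged sketch of wiring those cleanup calls into a retry clause. The FakeBrowser stub below is hypothetical and stands in for a real Ferrum::Browser so the snippet runs without Chrome; with Ferrum you would rescue Ferrum::TimeoutError and call browser.reset and browser.network.clear(:traffic) before retrying.

```ruby
# Hypothetical stand-ins so this runs without a browser; in real code
# you would use Ferrum::Browser and rescue Ferrum::TimeoutError.
class FakeTimeoutError < StandardError; end

class FakeBrowser
  attr_reader :cleanups

  def initialize
    @visits = 0
    @cleanups = 0
  end

  # First visit times out, later visits succeed.
  def go_to(url)
    @visits += 1
    raise FakeTimeoutError, "wait_for_idle timed out" if @visits == 1
    "loaded #{url}"
  end

  # Stands in for browser.reset and browser.network.clear(:traffic).
  def cleanup
    @cleanups += 1
  end
end

browser = FakeBrowser.new
attempts = 0
result = begin
  attempts += 1
  browser.go_to("https://example.com")
rescue FakeTimeoutError
  # Drop per-tab state and collected traffic before trying again.
  browser.cleanup
  retry if attempts < 3
  raise
end
```

Capping the `retry` with an attempt counter keeps a permanently broken page from looping forever.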
Anyway, I need to play with the website to take a closer look and analyze the reasons. It would be great if you could provide more details and the source as well; let's convert this to a discussion and proceed there.
I have code like this:

Then I load it with:

It works the first time I scrape, but if a timeout happens, then when I call goto some_url again,
wait_for_idle
keeps blocking the current network request until
Ferrum::TimeoutError
is raised, with an error like this:

But after removing all the
browser.network.wait_for_idle
calls, everything works quite well. Thank you.