Closed: toreau closed this issue 1 year ago
This looks like an issue with this website detecting robots and refusing to work with them. If you use the devtools option, you can see a whole bunch of HTTP errors from the website refusing to download content.
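As a minimal sketch of how to inspect this yourself (assuming the `devtools` option of `Firefox::Marionette->new`, as documented on CPAN; the URL below is a placeholder for the site in question):

```perl
#!/usr/bin/env perl

use strict;
use warnings;

use Firefox::Marionette;

# 'devtools' opens the browser with the developer tools visible,
# so the refused/failing HTTP requests show up in the Network tab.
my $firefox = Firefox::Marionette->new(
    'visible'  => 1,
    'devtools' => 1,
);
$firefox->go( 'https://example.com/' );    # placeholder URL
sleep 60;    # keep the window open long enough to inspect the requests
$firefox->quit;
```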
As a counter-example, a slightly altered script that works with the Google Images search page runs fine:
#!/usr/bin/env perl

use strict;
use warnings;
use feature 'say';

use Firefox::Marionette;
use Firefox::Marionette::Keys qw( PAGE_DOWN );

my $url = 'https://www.google.com.au/search?q=perl&hl=en&source=lnms&tbm=isch';

my $firefox = Firefox::Marionette->new(
    'width'   => 1920,
    'height'  => 1080,
    'visible' => 1,
);
$firefox->go( $url );

# Keep paging down until the page stops changing, i.e. until no more
# lazy-loaded content arrives.
my $html = $firefox->html;
while ( 1 ) {
    $firefox->chrome->perform(
        $firefox->key_down( PAGE_DOWN ),
        $firefox->pause( 500 ),
    )->release->content;
    if ( $firefox->html eq $html ) {
        last;
    }
    $html = $firefox->html;
}

$firefox->quit;
I have this simple test script, but lazy-loading doesn't work: