Open R0dri opened 3 years ago
I managed to come up with this line:
curl -s -L "$sitemap" | grep "<loc>" | awk -F"<loc>" '{print $2}' | awk -F"</loc>" '{print $1}' | xargs -I {} webkit2png -F -D webgrab -W 1920 {}
It works well enough for me. I'll post a complete bash script later; I'm thinking of args along the lines of:
webgrab theurl.com -S
-s -W 960px (half of 1080p width)
-m -W 1280px (standard 720p width)
-l -W 1920px (standard 1080p width)
-n name of the directory that will be used
This will add the http header, go to /page-sitemap.xml (the WordPress default for pages), get every URL in the sitemap, and download them all into a folder named webgrab.
I'll post the script as soon as it's done.
I found this tool very impressive and useful. I'll be using it for web design, and it will help my workflow a lot. I'm making a bash script (I'll post it here when it's done) so I can grab multiple pages at once from the sitemap, probably just the first-level pages, or maybe a whole tree in the future. It seems like a simple enough feature to add; is there any interest in doing so?