Currently, the scraper fetches websites sequentially, so the function spends the vast majority of its run time (probably 99%!) idle, waiting on the network. That wasted wall-clock time drives up the cost of the function even though it is limited to just 20 websites.
It would be better to issue the requests concurrently and write the final file once all scrapers have finished.
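A minimal sketch of the concurrent approach, assuming the scraper is Python-based and using `asyncio` with `aiohttp`; the `URLS` list, the `fetch` helper, and `OUTPUT_PATH` are hypothetical placeholders, not the project's actual names:

```python
# Sketch: fetch all sites concurrently, write the output file once at the end.
import asyncio
import json

import aiohttp

URLS = [f"https://example.com/site-{i}" for i in range(20)]  # placeholder list
OUTPUT_PATH = "results.json"  # placeholder output file


async def fetch(session: aiohttp.ClientSession, url: str) -> dict:
    """Fetch one page; return its URL and body (or the error)."""
    try:
        async with session.get(url, timeout=aiohttp.ClientTimeout(total=30)) as resp:
            return {"url": url, "status": resp.status, "body": await resp.text()}
    except Exception as exc:  # keep one slow/broken site from failing the batch
        return {"url": url, "error": str(exc)}


async def main() -> None:
    async with aiohttp.ClientSession() as session:
        # All 20 requests run concurrently; total time ~= the slowest site,
        # not the sum of all response times.
        results = await asyncio.gather(*(fetch(session, u) for u in URLS))
    # Write the final file once, after every scraper has finished.
    with open(OUTPUT_PATH, "w") as f:
        json.dump(results, f)


if __name__ == "__main__":
    asyncio.run(main())
```

With only 20 sites there is no need for a semaphore or batching: a single `asyncio.gather` keeps the code simple, and the function's billed time drops to roughly the slowest single request.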