Hi everyone,
I have a Scrapy script that uses "scrapy_user_agents", and I was planning to switch to "scrapy-fake-useragent". I was wondering if there is a mechanism to filter or validate the user agent before it is used. I need to check that it comes from one of the latest browser versions. That's because some of my scripts currently fail, and when I investigated the error it turned out that the website reports my user agent as outdated and "not secure" to use on their page.
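For context, the kind of validation I have in mind is something like the sketch below: pull the major version number out of a generated user-agent string and reject it if it's too old. The `MIN_CHROME_MAJOR` threshold and the regex are just my own assumptions for illustration, not part of either library:

```python
import re

# Assumption: "recent enough" means Chrome major version >= this value.
MIN_CHROME_MAJOR = 110

def is_recent_chrome(user_agent: str, min_major: int = MIN_CHROME_MAJOR) -> bool:
    """Return True if the UA string is Chrome at or above min_major."""
    match = re.search(r"Chrome/(\d+)\.", user_agent)
    if not match:
        # Not a Chrome UA (or unparseable): treat as failing validation.
        return False
    return int(match.group(1)) >= min_major

ua_old = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
          "(KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36")
ua_new = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
          "(KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36")

print(is_recent_chrome(ua_old))  # False
print(is_recent_chrome(ua_new))  # True
```

Ideally I'd hook a check like this into whatever middleware picks the user agent, so outdated strings get re-rolled instead of sent, but I don't know if either package exposes a clean place to do that.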
So I came here to ask for advice. If this is not the place for this type of question, I would appreciate it if you would let me know so I can ask it somewhere more appropriate.
Thanks in advance