This issue was closed by @Appolinars 7 months ago.
Thank you, @Appolinars. I will review and research this on Monday so we can better decide what actions to take here.
I was not able to reproduce this issue.

My use of https://pagespeed.web.dev produces requests with the User-Agent header:

```
Mozilla/5.0 (Linux; Android 7.0; Moto G (4)) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/94.0.4590.2 Mobile Safari/537.36 Chrome-Lighthouse
```

Since this package still uses the User-Agent header for bot recognition, there is not much to be done within the scope of this package. I will refer you to the clarifications section of the README.
However, you can use a reverse DNS lookup to determine whether the request originates from Google's servers:
```js
import isbot from "isbot";
import { verify } from "reverse-dns-lookup";
import requestIP from "request-ip";

// Extract the client IP from the incoming request
const clientIp = requestIP.getClientIp(request);

// Confirm via reverse DNS that the IP belongs to google.com
const isGoogleServer = await verify(clientIp, "google.com");

// Treat the request as a known bot if either check passes
const knownBot = isGoogleServer || isbot(request.headers["user-agent"]);
```
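If you prefer to avoid an extra dependency, the same check can be sketched with Node's built-in `dns` module using forward-confirmed reverse DNS (resolve the IP to a hostname, then resolve that hostname back and confirm it yields the original IP, which guards against spoofed PTR records). The helper names below (`hostnameMatches`, `isFromDomain`) are illustrative, not part of any package API:

```js
import { promises as dns } from "node:dns";

// Pure check: does the hostname belong to the given domain?
// "crawl-66-249-66-1.googlebot.com" matches "googlebot.com",
// but "evilgooglebot.com" must not.
function hostnameMatches(hostname, domain) {
  return hostname === domain || hostname.endsWith("." + domain);
}

// Forward-confirmed reverse DNS: reverse-resolve the IP, check the
// domain, then resolve the hostname forward and confirm the original
// IP is among the results.
async function isFromDomain(ip, domain) {
  try {
    const hostnames = await dns.reverse(ip);
    for (const hostname of hostnames) {
      if (!hostnameMatches(hostname, domain)) continue;
      const addresses = await dns.resolve(hostname);
      if (addresses.includes(ip)) return true;
    }
  } catch {
    // NXDOMAIN or lookup failure: treat as "not verified"
  }
  return false;
}
```

Note that Google crawlers typically reverse-resolve to `googlebot.com` or `google.com`, so you may want to check both domains.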
Thanks for your reply.
Steps to reproduce
In a Next.js application I tried to conditionally load third-party scripts so they would not affect my PageSpeed score.

![image](https://github.com/omrilotan/isbot/assets/70111099/28091040-6e73-4c6e-b7e8-21753628a6d1)

Basically, I just check whether the request comes from a bot and return null:

![image](https://github.com/omrilotan/isbot/assets/70111099/74a5c5a7-6a22-42a9-85bd-aa172ae86365)
But it does not work; PageSpeed Insights still sees my scripts. I found this related issue on GoogleChrome's GitHub: https://github.com/GoogleChrome/lighthouse/issues/14917

So do I understand correctly that this package no longer works with Lighthouse?