GoogleChrome / lighthouse

Automated auditing, performance metrics, and best practices for the web.
https://developer.chrome.com/docs/lighthouse/overview/

Inconsistent Pagespeed Insights scores based on the user location #10532

Open peixotorms opened 4 years ago

peixotorms commented 4 years ago

There is an issue that has been causing me a headache while testing from multiple regions: PageSpeed Insights gives very different scores depending on where I am testing from.

It appears that Google detects my IP and routes my request to the nearest datacenter, from where it runs the pagespeed test against the website.

Take any website that is not behind Cloudflare or a reverse-proxy CDN, for example the site of the Ministry of Foreign Affairs of Taiwan (hosted in Taiwan), and test it on PageSpeed Insights from multiple locations.

https://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Fwww.mofa.gov.tw%2F

Looking at the mobile scores, this is what I get on average, out of 3 tests:

| Location | Average mobile score |
| --- | --- |
| Taipei | 78 |
| Singapore | 78 |
| Bangkok, Thailand | 78 |
| Bangalore, India | 78 |
| Tokyo, Japan | 78 |
| Seattle | 73 |
| San Jose, CA | 65 |
| Toronto, Canada | 60 |
| London, UK | 58 |
| Frankfurt, DE | 69 |
| Istanbul, TR | 67 |

So what I see here is that if the site is hosted in the same region as the person testing it via PageSpeed Insights, the score is pretty consistent. However, if the person testing happens to be on another continent, the score drops the farther away they are.

I believe the score should not be affected by the location of the user running the test... it should be consistent, especially when companies have teams all over the world, each person gets different results, and someone could assume someone else is lying.

So I would suggest the following:

a) Have PageSpeed Insights check the latency or IP location of the website (not the user) and always test from the nearest available region to that server (this should ensure consistency); see the sketch below.

b) Alternatively, add a dropdown option or some way to manually select the region for the testing tool, and run the test from the specified location.
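To make proposal (a) concrete, here is a minimal sketch of how a test runner could pick the region nearest to a site's origin. It assumes the `geoip-lite` npm package plus a hypothetical list of available test regions; none of this is part of PageSpeed Insights itself.

```ts
import {promises as dns} from 'node:dns';
import geoip from 'geoip-lite';

// Hypothetical set of regions a testing tool could run from.
const TEST_REGIONS = [
  {name: 'us-west', lat: 37.4, lon: -122.1},
  {name: 'europe-west', lat: 50.1, lon: 8.7},
  {name: 'asia-east', lat: 25.0, lon: 121.5},
];

// Rough great-circle distance in km (haversine).
function distanceKm(aLat: number, aLon: number, bLat: number, bLon: number): number {
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(bLat - aLat);
  const dLon = toRad(bLon - aLon);
  const h = Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(aLat)) * Math.cos(toRad(bLat)) * Math.sin(dLon / 2) ** 2;
  return 2 * 6371 * Math.asin(Math.sqrt(h));
}

// Resolve the site's IP, geolocate it, and return the closest test region.
async function nearestRegion(hostname: string): Promise<string> {
  const addresses = await dns.resolve4(hostname);
  const geo = addresses.length ? geoip.lookup(addresses[0]) : null;
  if (!geo) return TEST_REGIONS[0].name; // fall back when lookup fails
  const [lat, lon] = geo.ll;             // [latitude, longitude]
  return TEST_REGIONS
    .map(r => ({name: r.name, d: distanceKm(lat, lon, r.lat, r.lon)}))
    .sort((a, b) => a.d - b.d)[0].name;
}

nearestRegion('www.mofa.gov.tw').then(region => console.log('Test from:', region));
```

As the maintainer's reply below notes, a CDN-fronted site defeats this heuristic: the resolved IP is whichever edge node answered the DNS query, not the origin server.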


And if you are adding any of these changes, may I suggest an option to adjust the network throttling as well (most people don't realize they are testing against the super slow 1.6 Mbps, 150 ms latency throttled network and simply assume that any mobile connection will be that slow).
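For reference, the 1.6 Mbps / 150 ms figures correspond to Lighthouse's default simulated mobile throttling, which the PSI UI does not let you change. When running Lighthouse yourself the values are configurable. A minimal sketch, assuming the `lighthouse` and `chrome-launcher` npm packages; the relaxed numbers below are purely illustrative, not an official "fast mobile" preset:

```ts
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

async function auditWithCustomThrottling(url: string) {
  const chrome = await chromeLauncher.launch({chromeFlags: ['--headless']});
  const result = await lighthouse(
    url,
    {port: chrome.port, onlyCategories: ['performance']},
    {
      extends: 'lighthouse:default',
      settings: {
        throttlingMethod: 'simulate',
        throttling: {
          rttMs: 40,                 // default mobile preset: 150 ms
          throughputKbps: 10 * 1024, // default mobile preset: ~1.6 Mbps
          cpuSlowdownMultiplier: 1,  // default mobile preset: 4x
        },
      },
    },
  );
  console.log('Performance score:', result?.lhr.categories.performance.score);
  await chrome.kill();
}

auditWithCustomThrottling('https://www.mofa.gov.tw/');
```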

If a throttling option is not possible, maybe make that information more visible on the page and advise using the Desktop tab for measuring performance on fast networks or more modern mobile devices.

Regardless, the consistency issue is the higher priority.

patrickhulce commented 4 years ago

Continued from https://github.com/GoogleChrome/lighthouse/issues/10457#issuecomment-617240689

Performance metrics and the score are expected to differ depending on the test location: users of your application in different locations will experience different performance, so any UX-centric performance metric should capture that difference or it is not doing its job correctly.

> Can't you guys proxify all requests through the USA or something like that, to ensure consistency?

That would harshly penalize sites without a US presence in addition to being plain inconsiderate, so no, we can't :)

> For example, web.dev always makes requests from the same region

What led you to believe this?


In response to your original proposal...

The team's goal here is to increase the visibility of the fact that the score will be different depending on test location, not to eliminate its influence. Even if steps were taken to mitigate variance from these sources, consistency would never be guaranteed and would always be subject to load-balancing behavior anyhow.

> Have PageSpeed Insights check the latency or IP location of the website (not the user) and always test from the nearest available region to that server

Considered, but sites that are deployed on a CDN would still face the exact same difficulties as today.

> Alternatively, add a dropdown option or some way to manually select the region for the testing tool, and run the test from the specified location.

Not going to be an option due to core goal reasons above. If this is something required by your team, there are many paid services that offer a variety of guaranteed test locations.

peixotorms commented 4 years ago

Thank you for the heads up!

It makes sense... but then, there are a couple of issues.

> What led you to believe this?

a) If I call the API for PageSpeed Insights, it appears to always test from the USA.
b) Testing on web.dev also appears to test from the USA.

I could be wrong about it, no problem, but I looked at my access logs while testing from multiple regions, and that is what it looked like, as opposed to using the website.

Even if I am wrong, the fact remains that using the website for testing results in more variation than using web.dev or the API.

If the website tool is already using the API... then I have no idea why the results differ. It's pretty consistent via the API or web.dev.
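For context, "calling the API" here means a single request to the public PageSpeed Insights v5 endpoint, which runs Lighthouse on Google's servers and returns the full result. A minimal sketch (Node 18+ with built-in `fetch`; the API key is a placeholder and only needed for higher request volumes):

```ts
// The Lighthouse run itself happens on Google's servers, not on this machine.
async function runPsi(targetUrl: string): Promise<void> {
  const endpoint = new URL('https://www.googleapis.com/pagespeedonline/v5/runPagespeed');
  endpoint.searchParams.set('url', targetUrl);
  endpoint.searchParams.set('strategy', 'mobile');
  // endpoint.searchParams.set('key', 'YOUR_API_KEY'); // placeholder

  const response = await fetch(endpoint);
  const data = await response.json();

  console.log('Lighthouse version:', data.lighthouseResult.lighthouseVersion);
  console.log('Performance score:', data.lighthouseResult.categories.performance.score);
}

runPsi('https://www.mofa.gov.tw/');
```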

> Not going to be an option due to core goal reasons above. If this is something required by your team, there are many paid services that offer a variety of guaranteed test locations.

That's OK; we already have something similar on webpagetest.org, and of course there are other services.

> The team's goal here is to increase the visibility of the fact that the score will be different depending on test location, not to eliminate its influence.

Then, for the sake of transparency, can we please have a small note saying something like "scores calculated as measured from your location"? That would increase the visibility of that fact and eliminate confusion.

patrickhulce commented 4 years ago

"scores calculated as measured from your location" or something like that

Great idea! :) 👍

exterkamp commented 4 years ago

"scores calculated as measured from your location" or something like that

Some way of indicating this would be nice to have. We'll have to think about how to surface this.

peixotorms commented 4 years ago

Can I also ask why you have both web.dev and PageSpeed Insights? Why not merge them?

paulirish commented 4 years ago

After discussion, we'd rather just have consistent results (for any user location). So we'll treat that as the bug to be fixed.

patrickhulce commented 4 years ago

@paulirish while that is a laudable goal I would also love to see, do we really think this is achievable? :)

There have been many issues filed that boiled down to the server producing different content depending on where it was requested from. Even if all variance from these sources is eliminated and accounted for in simulation, there will still be plenty of cases where the results differ, and knowing which location was used would be useful. Perhaps it's worth exploring both angles separately?

Also @peixotorms, just wanted to confirm for you that you were indeed correct: web.dev always requests results from the PSI API in the United States, so it almost always gets routed to the corresponding PSI servers located in the US. It was a surprise to me, and the web.dev team is aware of the issue, but AFAIK there is no timetable for a fix. I assume whatever is done will happen in conjunction with the decision here.