Closed · miohtama closed this 3 months ago
I believe I've isolated the source of Bad URLs / CLS issues on desktop. This is occurring on pages that load content client-side. The biggest culprit is DataTables with client-side fetch using Skeleton loader. The charts on the trading-pairs page also cause this issue (to a lesser degree).
This hypothesis explains why the issue is occurring on desktop and not mobile. For most of the problematic pages, the client-side loaded content is below-the-fold on mobile (so it doesn't affect CLS score), but above-the-fold on desktop.
The table below makes this clear. For the first set of pages, the client-side loaded content is below-the-fold on mobile and above-the-fold on desktop. For the second set, the client-side content is above-the-fold on both mobile and desktop. As a control group, the last set of pages have no client-side loaded content.
Page | CLS score – mobile | CLS score – desktop
---|---|---
**below-fold mobile / above-fold desktop** | |
/trading-view/[chain] | 0.00 🟢 | 0.17 🟠
/trading-view/[chain]/[exchange] | 0.00 🟢 | 0.27 🔴
/trading-view/[chain]/[exchange]/[pair] | 0.00 🟢 | 0.16 🟠
/trading-view/[chain]/tokens/[token] | 0.00 🟢 | 0.37 🔴
**above-fold on both** | |
/trading-view/exchanges | 0.18 🟠 | 0.14 🟠
/trading-view/[chain]/tokens | 0.61 🔴 | 0.68 🔴
**control – no client-side content** | |
/ | 0.00 🟢 | 0.00 🟢
/about | 0.00 🟢 | 0.00 🟢
/trading-view | 0.00 🟢 | 0.00 🟢
/blog | 0.00 🟢 | 0.00 🟢
Here are some initial thoughts on ways we could improve CLS scores on the bad pages above.
Optimize the backend API requests sufficiently so we can pre-fetch the data needed during SSR. Data updates (changing the sort, pagination) would occur client-side.
This would address the worst-offending pages (the ones with DataTables / Skeleton loader). This is not a viable option for charts.
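A rough sketch of what the SSR pre-fetch could look like, assuming a SvelteKit-style `load` function and a hypothetical `/api/pairs` endpoint (the endpoint, parameters, and row shape below are illustrative, not our actual API):

```typescript
// Sketch: pre-fetch the first page of table data during SSR so the table
// renders fully populated on first paint, instead of a skeleton that
// shifts layout when real rows arrive.

interface PairsPage {
  rows: Array<{ pair: string; volume: number }>;
  totalRows: number;
}

const PAGE_SIZE = 10; // must match the table's visible row count

// Build the API URL for a given page/sort. Later client-side updates
// (sorting, pagination) reuse the same helper without a full reload.
function pairsUrl(chain: string, page = 0, sort = "volume"): string {
  const params = new URLSearchParams({
    chain,
    sort,
    offset: String(page * PAGE_SIZE),
    limit: String(PAGE_SIZE),
  });
  return `/api/pairs?${params}`;
}

// SvelteKit-style load function (shape assumed): runs on the server for
// the initial request, so the HTML already contains the first page of rows.
export async function load({ params, fetch }: {
  params: { chain: string };
  fetch: typeof globalThis.fetch;
}): Promise<PairsPage> {
  const resp = await fetch(pairsUrl(params.chain));
  return (await resp.json()) as PairsPage;
}
```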
We can decrease the impact on CLS by ensuring that the pre-fetch placeholder content more closely matches the post-fetch (real) content in terms of layout impact. E.g., we could render an actual table with a fixed number of rows (matching the max page rows), with skeleton/placeholder content in each cell and fixed column widths / row heights.
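A minimal sketch of that idea, with illustrative row-height and page-size values (not measured from our actual components):

```typescript
// Sketch: render a skeleton table whose dimensions match the real table,
// so swapping in fetched rows causes no layout shift.

const ROW_HEIGHT_PX = 48; // fixed height applied to both skeleton and real rows
const PAGE_ROWS = 10;     // max rows per page; skeleton always renders this many

// Produce skeleton markup: same row count, same fixed row height, and the
// same column count as the eventual DataTable output.
function skeletonRows(columns: number): string {
  const cell = `<td><span class="skeleton-bar"></span></td>`;
  const row = `<tr style="height:${ROW_HEIGHT_PX}px">${cell.repeat(columns)}</tr>`;
  return row.repeat(PAGE_ROWS);
}

// Layout impact is identical before and after the fetch because both
// states occupy the same vertical space.
function tableBodyHeight(): number {
  return PAGE_ROWS * ROW_HEIGHT_PX;
}
```

Fixing column widths (e.g. via `table-layout: fixed`) would complete the picture, since CLS is triggered by horizontal shifts as well.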
Eliminate content that causes CLS issues when the Google user agent is detected. (Details tbd.)
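If we explore this, the detection step could look something like the sketch below. Caveat: serving crawlers different content than users risks being treated as cloaking, and robust verification would use reverse DNS rather than the UA string alone; this only illustrates the UA check.

```typescript
// Sketch of the detection step only. The pattern covers common Google
// crawler UA tokens; the list is illustrative, not exhaustive.
const GOOGLE_UA_PATTERN = /Googlebot|Google-InspectionTool|AdsBot-Google/i;

function isGoogleUserAgent(userAgent: string): boolean {
  // Note: UA strings are trivially spoofable; Google recommends verifying
  // crawler identity via reverse DNS lookup for anything security-relevant.
  return GOOGLE_UA_PATTERN.test(userAgent);
}
```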
Another issue to be aware of is 503 errors that are not resolved in a short period of time. Our Chains Under Maintenance error page returns a 503 response code (which seems like the most appropriate option), and BNB Chain has been under maintenance for a couple of days now (as of Jul 14). BNB Chain represents the largest number of exchanges, tokens, and pairs in our dataset, so the majority of indexable pages on our site are currently returning 503 errors for an extended period.
The following articles provide insights on how extended 503 errors can impact search rankings:
When drilling into Bad URLs on Google Search Console, the example URLs provided are all under /trading-view/binance (not surprising, since it represents the largest number of URLs with CLS issues).

Currently, attempting to run Lighthouse tests on any /trading-view/binance page fails due to the 503:

> Lighthouse was unable to reliably load the page you requested. Make sure you are testing the correct URL and that the server is properly responding to all requests. (Status code: 503)
To mitigate the search-ranking impacts of extended 503 errors, we may wish to implement a different solution for indicating Chain Under Maintenance. Rather than serving a 503 error page, we could serve a reduced version of the page being requested (exchange details, token details, pair details, etc.) that includes summary sections and a warning about stale data / chain data under maintenance.
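A sketch of that alternative, with assumed names and illustrative section lists (not our actual page structure):

```typescript
// Sketch: instead of a blanket 503, serve a 200 with reduced content plus
// a stale-data warning, so indexable pages keep responding during
// extended chain maintenance.

interface PageResponse {
  status: number;
  staleDataWarning: boolean;
  sections: string[];
}

// Chains currently under maintenance (illustrative data).
const chainsUnderMaintenance = new Set(["binance"]);

function buildPageResponse(chain: string): PageResponse {
  if (chainsUnderMaintenance.has(chain)) {
    // Reduced page: summary sections only, flagged as possibly stale,
    // but still a 200 so rankings are not hurt by extended 503s.
    return { status: 200, staleDataWarning: true, sections: ["summary"] };
  }
  return {
    status: 200,
    staleDataWarning: false,
    sections: ["summary", "charts", "pairs-table"],
  };
}
```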
This ticket was "done" in 2022 but failed to get "closed".
Problem
On Google Search Console, our Good URL % for desktop is low:
On the plus side:

- <1% on Jul 7 to 25% on Jul 12
- 99.5% on Jul 12

Goal
Investigate to determine root cause and fix!
Possible causes
References