Currently, with third-party plugins creating lots of links, it's possible for Google SEO to require hundreds of SQL queries during a single page load. With a smarter caching system, most of these queries could be avoided. The problem is making the cache smarter without breaking anything.
For example, instead of real URLs, placeholders could be returned and then replaced before the output is sent to the client. However, this would break cases where the returned URL is not output to the user at all, but used for something else instead (such as being included in a mail sent to the user).
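As a rough sketch of the placeholder idea (the function names, the placeholder format and the fetch_thread_urls helper below are made up for illustration, not the actual Google SEO API; the final replacement would have to run in a hook that sees the finished page, e.g. MyBB's pre_output_page):

```php
<?php
// Sketch only: placeholder approach with hypothetical names.

$pending_thread_ids = array();

// Instead of querying the database, hand out a placeholder token
// and remember which thread ID it stands for.
function thread_url_placeholder($tid)
{
    global $pending_thread_ids;
    $pending_thread_ids[$tid] = true;
    return "{google_seo:thread:{$tid}}";
}

// Hypothetical helper: resolve many thread IDs to URLs with a single query.
function fetch_thread_urls(array $tids)
{
    // ... SELECT tid, url FROM ... WHERE tid IN (...) ...
    return array(); // stub
}

// Run in an output hook: resolve all pending IDs with one query
// and substitute the real URLs into the page.
function replace_placeholders($page)
{
    global $pending_thread_ids;
    $urls = fetch_thread_urls(array_keys($pending_thread_ids));

    return preg_replace_callback(
        '/\{google_seo:thread:(\d+)\}/',
        function ($m) use ($urls) {
            return isset($urls[$m[1]]) ? $urls[$m[1]] : "showthread.php?tid={$m[1]}";
        },
        $page
    );
}
```

The breakage mentioned above is visible here: any URL that never passes through replace_placeholders (a mail body, an HTTP redirect) keeps the raw placeholder token.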
Alternatively, we could listen in on database queries and cache any IDs returned in them. However, while this would reduce the number of queries made by Google SEO, it would make all other queries more expensive, because their result sets would then have to be looped over twice instead of once.
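A sketch of what such query sniffing could look like, assuming a hypothetical wrapper that every result set is passed through. The extra scan over each row is exactly the added cost described above:

```php
<?php
// Sketch only: "listen in" on query results and pre-cache any IDs seen.
// warm_url_cache() is a hypothetical helper, not the real Google SEO API.

function sniff_and_cache(array $rows)
{
    $seen = array('tid' => array(), 'fid' => array(), 'uid' => array());

    // First pass: scan every row for ID columns we know how to cache.
    foreach ($rows as $row) {
        foreach ($seen as $column => $ids) {
            if (isset($row[$column])) {
                $seen[$column][$row[$column]] = true;
            }
        }
    }

    // One batched lookup per ID type instead of one query per URL.
    foreach ($seen as $column => $ids) {
        if ($ids) {
            warm_url_cache($column, array_keys($ids));
        }
    }

    return $rows; // the caller then loops over the rows a second time for its own work
}

function warm_url_cache($type, array $ids)
{
    // ... one batched SELECT for all $ids of this $type ...
}
```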
Alternatively, we could unconditionally cache URLs: all forums, the most active users, the most recent threads, and so on. However, this would mean that a lot of URLs get computed on every request, whether or not they are actually required. While this would reduce the number of queries on high-load pages, it would increase load on all other pages that don't need those URLs.
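Sketched below with hypothetical helpers and stub data; the point is simply that the warm-up runs on every request, regardless of whether the page uses any of these URLs:

```php
<?php
// Sketch only: warm the URL cache unconditionally at the start of every
// request. All helpers are hypothetical stand-ins, not the real API.

function all_forum_ids()          { return range(1, 20); }  // stub
function most_active_user_ids($n) { return range(1, $n); }  // stub
function recent_thread_ids($n)    { return range(1, $n); }  // stub

function warm_url_cache($type, array $ids)
{
    // ... one batched SELECT for all $ids of this $type ...
}

function warm_cache_unconditionally()
{
    warm_url_cache('forum',  all_forum_ids());           // forums rarely change
    warm_url_cache('user',   most_active_user_ids(50));  // e.g. 50 most active users
    warm_url_cache('thread', recent_thread_ids(100));    // e.g. 100 newest threads
}
```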
Yet another approach would be to rely on cooperation from the third party, i.e. the third party plugin notifying us which URLs it is going to need before it actually fetches them. This is what Google SEO Sitemap currently does. However, this would mean that other plugins have to be modified for the sake of Google SEO.
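A sketch of the cooperative approach, with hypothetical function names (this shows the general idea, not the actual Google SEO Sitemap code):

```php
<?php
// Sketch only: cooperative prefetching. The third party plugin announces
// the IDs it will need up front; the later URL calls then hit the cache.

$url_cache = array();

// Called by the cooperating plugin before it builds its links.
function announce_thread_ids(array $tids)
{
    global $url_cache;

    // One query for the whole batch instead of one query per URL:
    // ... SELECT tid, url FROM ... WHERE tid IN (...) ...
    foreach ($tids as $tid) {
        $url_cache[$tid] = "placeholder-url-for-{$tid}"; // stub result
    }
}

// Later calls are served from the cache; no further queries are needed.
function thread_url($tid)
{
    global $url_cache;
    return isset($url_cache[$tid]) ? $url_cache[$tid] : "showthread.php?tid={$tid}";
}

// Usage by a cooperating plugin:
announce_thread_ids(array(1, 2, 3));
echo thread_url(2);
```

The catch is the last line of the paragraph above: every plugin that wants the benefit has to call the announce function itself.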