Brain-WP / Cortex

Routing system for WordPress
MIT License

[Question] Is there any way to give Cortex routes a higher priority over WP routes? #19

Closed · wujekbogdan closed this 6 years ago

wujekbogdan commented 7 years ago

Hey,

First of all - thanks for Cortex. It's a great tool! WP sucks, but with Cortex it sucks a bit less ;P


I have a little problem and I'd like to know whether the solution I came up with is the only one, or whether there is a better, simpler one.

I have a hierarchical taxonomy. Let's say it's a country/region/district hierarchy. So, among others, I will have the following routes:

mysite.com/{country}
mysite.com/{country}/{region}
mysite.com/{country}/{region}/{district}

It looks very simple so far, but there's a little problem. Let's pick the first route: mysite.com/{country}. This URL structure looks the same as a default WordPress URL for pages. So, in order to create such a route I do the following trick:

use Brain\Cortex\Route\QueryRoute; // Cortex's QueryRoute class

// Get all pages
$pages = new \WP_Query([
    'post_type' => 'page',
    'posts_per_page' => -1,
]);

// Get an array of all the slugs
$pagesSlugs = wp_list_pluck($pages->posts, 'post_name');

// Build a pipe-separated string of slugs
$pagesSlugsPipeSeparated = implode('|', $pagesSlugs);

// Register a route
$routes->addRoute(new QueryRoute(
    '{country:(?!' . $pagesSlugsPipeSeparated . ')[a-z0-9-]+}', // Exclude all page slugs via a negative lookahead
    function (array $matches) {
        return [
            // My query args
        ];
    },
    [
        'template' => 'test-router.php'
    ]
));

So, I build the route using a regexp and exclude all page slugs from it. I could do it the other way around - join all taxonomy terms into a whitelist - but that would perform poorly because of the huge number of taxonomy terms.

I'd like to avoid such a complex regexp if possible.


So the question is: is it possible to make Cortex routes take priority over the default WordPress routes? Basically, I'd like it to work like the WordPress add_rewrite_rule function when the $after parameter is set to 'top'.
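
For reference, this is roughly what that core WordPress mechanism looks like (the pattern and query var below are purely illustrative):

// With $after set to 'top' the rule is checked before WordPress's default rules.
add_rewrite_rule(
    '^([a-z0-9-]+)/?$',              // illustrative pattern
    'index.php?country=$matches[1]', // illustrative query ('country' would need to be a registered query var)
    'top'
);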

I know that I can disable WordPress rewrite rules and use Cortex for everything, but that is not a solution in my case.

gmazzap commented 7 years ago

Hi @wujekbogdan, thanks for your interest and your feedback.

This is an issue everyone using Cortex and WordPress has faced at some point.

In the cases where I was lucky enough to use Cortex for everything, this was very easy.

In other cases I used an approach very similar to yours, but with a cache in between. What I did was write a custom Route class that decorates QueryRoute and maintains a cache of page slugs, which is updated whenever pages are updated.

So my routes look something like:

$routes->addRoute(new TermsRoute(
    function (array $matches) {
        return [ ... ];
    }
));

Internally, TermsRoute creates a QueryRoute, passing a regex similar to yours as the first argument and the closure given to TermsRoute as the second QueryRoute argument.

To get the list of page slugs, TermsRoute uses a method that looks them up in a transient.

When using a persistent object cache (e.g. Redis or Memcached), this is very fast.
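
Roughly sketched, such a class can look like this (the transient key and the details below are illustrative, and the RouteInterface methods that simply proxy to the wrapped QueryRoute are omitted):

use Brain\Cortex\Route\QueryRoute;

class TermsRoute
{
    const SLUGS_TRANSIENT = 'terms_route_page_slugs'; // illustrative cache key

    /** @var QueryRoute */
    private $route;

    public function __construct(callable $queryArgs, array $options = [])
    {
        // Build the exclusion regex from the cached page slugs
        $exclude = implode('|', array_map('preg_quote', self::page_slugs()));

        $this->route = new QueryRoute(
            '{country:(?!' . $exclude . ')[a-z0-9-]+}',
            $queryArgs,
            $options
        );
        // ...the RouteInterface methods delegate to $this->route.
    }

    /** Cached page slugs, rebuilt on a cache miss. */
    public static function page_slugs()
    {
        $slugs = get_transient(self::SLUGS_TRANSIENT);

        return is_array($slugs) ? $slugs : self::rebuild_cache();
    }

    /** Rebuild the slug cache; hooked to 'save_post_page' (see below). */
    public static function rebuild_cache()
    {
        $pages = get_posts(['post_type' => 'page', 'posts_per_page' => -1]);
        $slugs = wp_list_pluck($pages, 'post_name');
        set_transient(self::SLUGS_TRANSIENT, $slugs);

        return $slugs;
    }
}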

Then, somewhere, I have:

add_action( 'save_post_page', [ TermsRoute::class, 'rebuild_cache'] );

so every time a page is updated, TermsRoute rebuilds its cache. This happens in the backend, so there's less concern about performance.

I know this is some work, but unfortunately this is the best option I came up with in terms of balancing performance and convenience for this use case.

wujekbogdan commented 7 years ago

@gmazzap Hey.

Thanks for your response. I didn't mention caching in my post because I didn't want to make it too complex, but I do cache the slugs (I use a WP transient as well). I do it in a less sophisticated way, without an abstraction like yours, but the logic is pretty much the same.

I hoped there was a better solution, but it seems that what we do is the only way to solve this problem :/

The problem with this approach is that regex length is limited. That's why I decided to exclude page slugs (a blacklist approach) instead of whitelisting taxonomy terms: I have only ~30 pages, but thousands of taxonomy terms. Still, I can imagine a scenario where either approach results in a regexp long enough to exceed the limit.


Let's get back to the point: isn't it possible to force Cortex to take priority over WP rewriting? I know that it wouldn't solve every problem, but it would at least solve some of them.

gmazzap commented 7 years ago

Cortex has higher priority over WP rewrites: if something matches in Cortex, it should not match in WP. The other way around (making WP rewrites take precedence over Cortex) is not really possible, because Cortex acts before rewrites are processed.
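
Cortex hooks WordPress's 'do_parse_request' filter, which fires before any rewrite rule is looked at. As an illustration of the principle (not Cortex's actual source; my_router_matches() is a hypothetical stand-in for the router's matching logic):

// A callback on 'do_parse_request' runs before WP parses rewrite rules;
// returning false from it prevents that parsing entirely.
add_filter('do_parse_request', function ($do_parse, \WP $wp) {
    if (my_router_matches($wp)) {
        // the router sets up the query/template itself...
        return false; // ...and WP's rewrite rules are never consulted
    }

    return $do_parse;
}, 10, 2);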

wujekbogdan commented 7 years ago

@gmazzap It's good (that it works this way) and it's bad (because it means that I messed up something in my code :))

Thanks!

gmazzap commented 7 years ago

:) This article https://roots.io/routing-wp-requests/ explains the working principle of Cortex; maybe it can help.