Closed · conallCcraft closed this issue 7 months ago
Okay, so you mentioned a couple of things that aren't quite the same:

1) SEOmatic doing nothing at all
2) Preventing bots from indexing pages
You can disable SEOmatic via Twig or PHP code like this (Twig shown):

```twig
{# Stop SEOmatic from rendering any meta tags for this request #}
{% if entry.someCustomField == 'whatever' %}
    {% do seomatic.config.renderEnabled(false) %}
{% endif %}
```
That will accomplish 1), but it will not accomplish 2): no robots tag will be output telling search engines not to index the page, and nothing tells SEOmatic to leave the entry out of the sitemap.
Typically you'd set up an SEO Settings field and allow the user to override the Robots tag setting. This will cause it to output the `<meta name="robots" content="none">` tag to prevent the page from being indexed, and it will also ensure that the entry is not included in the sitemap.
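If you want to drive the same thing from Twig rather than relying on the user editing the SEO Settings field, you can set the robots value per-request via SEOmatic's meta setters. A minimal sketch, assuming a hypothetical lightswitch field named `hideFromSearch` on the entry (any per-entry condition would work):

```twig
{# hideFromSearch is a hypothetical lightswitch field on this entry #}
{% if entry.hideFromSearch %}
    {# Outputs <meta name="robots" content="none"> for this page #}
    {% do seomatic.meta.robots('none') %}
{% endif %}
```

Note that the sitemap is generated separately from page rendering, so the SEO Settings field (or the section's sitemap settings) remains the reliable way to keep the entry out of the sitemap itself.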
Also, just disabling the entry via Craft's status will accomplish the same thing, but it will also make the URL not accessible at all. That's often what you want -- bots that don't respect the robots tag won't be able to reach the content at all -- but sometimes it isn't.
More on sitemaps here: https://nystudio107.com/blog/seo-myths-top-5-sitemap-myths-demystified
I'm just wondering if there is any way to programmatically switch SEOmatic off / prevent SEOmatic from loading for specific pages on a site, so that those pages would be hidden from the sitemap and unavailable for indexing by crawlers. I'm aware there are user interface settings for these in the SEO tab for an entry. What I want to achieve is that a user would click one switch on an entry that hides it by preventing SEOmatic from doing anything on that page at all: no meta tags and no sitemap entry.
Thanks