This is an admin extension for Optimizely CMS 12+ for managing robots content. Stott Robots Handler is a free-to-use module; however, if you want to show your support, you can buy me a coffee on Ko-fi:
Robots.txt content can be managed on a per-site and per-host-definition basis. A host of "default" applies to all unspecified hosts within a site, while a specific host definition applies only to that host.
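For example, you might give a specific lower-environment host definition more restrictive content than the "default" host definition. The directives below are purely illustrative; the content served is whatever you enter against that host in the admin interface:

```
# Illustrative robots.txt content for a specific lower-environment host
User-agent: *
Disallow: /
```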
Introduced in version 4.0.0.
Environment Robots allows you to configure the meta robots tag and `X-Robots-Tag` header for all page requests within the current environment. This functionality provides the ability to prevent search engine robots from scanning and indexing a site that is in a lower-level environment, or a production environment that is not yet ready for general consumption.
Options will always exist for Integration, Preproduction, Production and the current environment name. This allows you to preconfigure a lower environment when cloning content from production to lower environments.
When a configuration is active, a Meta Tag Helper will look for and update the meta robots tag, while a middleware will include the `X-Robots-Tag` header. It is best in this case that your solution always renders the meta robots element and allows the Meta Tag Helper to either override it or remove it where needed.
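As a minimal sketch, assuming a typical `_Layout.cshtml`, the layout could always render a default meta robots tag for the tag helper to act upon (the default `content` value shown here is an assumption for illustration):

```html
<head>
    <!-- Rendered on every page; the tag helper overrides or removes it when an
         environment configuration is active (default value is illustrative). -->
    <meta name="robots" content="index,follow" />
</head>
```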
The meta tag helper will execute for any `meta` tag with a `name` attribute. The logic within the robots tag helper will only execute where the `name` attribute has a value of `robots`; where the `name` attribute is not `robots`, the tag is left unchanged. Where the `name` attribute is `robots`, it will perform one of the actions illustrated in the examples below.
Examples:

| Page Robots | Environment Robots | Result |
|---|---|---|
| noindex,nofollow | noindex,nofollow,noimageindex | noindex,nofollow,noimageindex |
| noindex,nofollow | - | noindex,nofollow |
| - | noindex,nofollow,noimageindex | noindex,nofollow,noimageindex |
| - | - | meta robots tag is removed |
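Taking the third row above as an illustration, a page with no Page Robots value in an environment configured with `noindex,nofollow,noimageindex` would end up with output along these lines (a sketch only):

```html
<!-- Response header added by the middleware:
     X-Robots-Tag: noindex,nofollow,noimageindex -->
<meta name="robots" content="noindex,nofollow,noimageindex" />
```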
Install the `Stott.Optimizely.RobotsHandler` package into your website.
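For example, using the .NET CLI:

```
dotnet add package Stott.Optimizely.RobotsHandler
```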
You need to ensure the following lines are added to the startup class of your solution:
```csharp
public void ConfigureServices(IServiceCollection services)
{
    services.AddRobotsHandler();
}

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    app.UseRobotsHandler();
    app.UseEndpoints(endpoints =>
    {
        endpoints.MapContent();
        endpoints.MapControllers();
    });
}
```
The call to `services.AddRobotsHandler()` sets up the dependency injection requirements for the RobotsHandler solution and is required to ensure the solution works as intended. It follows the service extensions pattern defined by Microsoft.

The call to `app.UseRobotsHandler()` sets up the middleware required to create the `X-Robots-Tag` header.

The call to `endpoints.MapControllers()` ensures that the routing for the administration page, assets and robots.txt is correctly mapped.
In the `_ViewImports.cshtml` file you will need to add the following line to include the meta robots tag helper:

```razor
@addTagHelper *, Stott.Optimizely.RobotsHandler
```
As this package includes static files such as JS and CSS files within the Razor Class Library, your solution must be configured to use Static Web Assets. This is done by adding `webBuilder.UseStaticWebAssets();` to your `Program.cs` as follows:
```csharp
Host.CreateDefaultBuilder(args)
    .ConfigureCmsDefaults()
    .ConfigureWebHostDefaults(webBuilder =>
    {
        webBuilder.UseStartup<Startup>();
        webBuilder.UseStaticWebAssets();
    });
```
You can read more about shared assets in Razor Class Libraries here: Create reusable UI using the Razor class library project in ASP.NET Core
This solution also includes an implementation of `IMenuProvider` which ensures that the Robots Handler administration pages are included in the CMS Admin menu under the title of "Robots". You do not have to do anything to make this work, as Optimizely CMS will scan and action all implementations of `IMenuProvider`.
The configuration of the module can be adjusted by providing options to the service extension method. The `authorizationOptions` argument is optional in the following example.
Example:

```csharp
services.AddRobotsHandler(authorizationOptions =>
{
    authorizationOptions.AddPolicy(RobotsConstants.AuthorizationPolicy, policy =>
    {
        policy.RequireRole("WebAdmins");
    });
});
```
If `authorizationOptions` is not provided, then any of the following roles will be required by default:
If you are using the new Optimizely Opti ID package for authentication into Optimizely CMS and the rest of the Optimizely One suite, then you will need to define the `authorizationOptions` for this module as part of your application startup. This should be a simple case of adding `policy.AddAuthenticationSchemes(OptimizelyIdentityDefaults.SchemeName);` to the `authorizationOptions` as per the example below.
```csharp
services.AddRobotsHandler(authorizationOptions =>
{
    authorizationOptions.AddPolicy(RobotsConstants.AuthorizationPolicy, policy =>
    {
        policy.AddAuthenticationSchemes(OptimizelyIdentityDefaults.SchemeName);
        policy.RequireRole("WebAdmins");
    });
});
```
I am open to contributions to the code base. The following rules should be followed:
Thank you to the following members of the community for their feedback and contributions:
| Contributor | Bug Reports | Pull Requests |
|---|---|---|
| Paul Mcgann | 1 | 1 |
| Ellinge | 1 | 1 |
| Tomas Hensrud Gulla | - | 1 |
| Anish Peethambaran | 1 | - |
| Deepa V Puranik | 1 | - |
| Mahdi Shahbazi | 1 | - |
| Praveen Soni | 1 | - |
| jhope-kc | 1 | - |