notum-cz / strapi-next-monorepo-starter

This is a ready-to-go starter template for Strapi projects. It combines Strapi, NextJS, and Shadcn/ui with a Turborepo setup to kickstart your project development.

feat: robots.txt File #10

Open tocosastalo opened 3 months ago

tocosastalo commented 3 months ago

Ability to edit the robots.txt file

The robots.txt file is a simple text file served from the root of a website. It provides instructions to web crawlers (such as those used by search engines) on how to interact with the site's content. By editing the robots.txt file, you can control which parts of your website are accessible to crawlers and which should be restricted.

Purpose of the robots.txt File:

Examples of Use Cases:

```
User-agent: BadBot
Disallow: /
```
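For a NextJS App Router project like this starter, one way to expose such rules is the `app/robots.ts` convention, where a default export describes the rules and Next serves the generated robots.txt. The sketch below is a minimal, self-contained illustration of that shape (the inline `Robots` type mirrors Next's `MetadataRoute.Robots`, and the sitemap URL is a placeholder); wiring the values to Strapi content would be up to the implementation.

```typescript
// Sketch of an app/robots.ts (assumed structure, not the starter's actual code).
// The inline type stands in for Next's MetadataRoute.Robots so the file is self-contained.
type Robots = {
  rules: Array<{
    userAgent: string;
    allow?: string | string[];
    disallow?: string | string[];
  }>;
  sitemap?: string;
};

export default function robots(): Robots {
  return {
    rules: [
      // Allow well-behaved crawlers everywhere.
      { userAgent: "*", allow: "/" },
      // Block a misbehaving crawler entirely, as in the example above.
      { userAgent: "BadBot", disallow: "/" },
    ],
    // Hypothetical sitemap URL; in practice this would come from config or the CMS.
    sitemap: "https://example.com/sitemap.xml",
  };
}
```

Making these rules editable from Strapi would then mean fetching them in this function instead of hard-coding them.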

tocosastalo commented 2 months ago

related to #2