Closed jocxfin closed 6 months ago
robots.txt logic will disallow all robots and crawlers from visiting your site by default. If you want to allow them, set `ROBOTS_ALLOW` to `true`.
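The described behavior can be sketched as follows. This is a hypothetical illustration, not the project's actual implementation; it assumes `ROBOTS_ALLOW` is read as an environment variable and that the default robots.txt disallows everything:

```python
import os

def robots_txt() -> str:
    """Return robots.txt contents based on the ROBOTS_ALLOW setting."""
    # Hypothetical sketch: crawlers are disallowed unless
    # ROBOTS_ALLOW is explicitly set to "true".
    if os.environ.get("ROBOTS_ALLOW", "").lower() == "true":
        # An empty Disallow directive permits all crawling.
        return "User-agent: *\nDisallow:\n"
    # Default: block all robots and crawlers.
    return "User-agent: *\nDisallow: /\n"
```

With `ROBOTS_ALLOW=true` the served file contains an empty `Disallow:` line (allow all); otherwise it contains `Disallow: /` (block all).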