[Closed] levogirar closed this issue 3 years ago
Yes, every day errors appear in the PHP logs because search engines try to request the file "robots.txt" from the website, but it doesn't exist yet. So it would be good to create a "robots.txt" that indicates where the "sitemap.xml" is. The sitemap file tells search engine crawlers which pages can be requested from the website. It should be generated dynamically to list all company profile pages currently available.
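For reference, a minimal robots.txt that advertises the sitemap could look like this (assuming the sitemap is served at https://wuwana.com/sitemap.xml — adjust the URL to wherever it actually lives):

```
User-agent: *
Allow: /
Sitemap: https://wuwana.com/sitemap.xml
```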
@Nils85 I've almost got it all done. However, there is a gap in my knowledge: I'm not able to get the permalink for each company. I tried new DataAccess\Company() but don't know how to continue from there. Could you help me get the permalink for each company and store it inside $urls?
The sitemap.php looks like this now:
<?php
header("Content-type: application/xml; charset=utf-8");
echo '<?xml version="1.0" encoding="UTF-8" ?>';
?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
	<url>
		<loc>https://wuwana.com</loc>
		<changefreq>daily</changefreq>
		<priority>1.0</priority>
	</url>
<?php
// Dynamically generated <url> entry for each company
$root = 'https://wuwana.com/';
$urls = array('permalink1', 'permalink2', 'permalink3'); // placeholders until the real permalinks are loaded
foreach ($urls as $permalink) {
	echo '<url>';
	echo '<loc>' . $root . $permalink . '</loc>';
	echo '<changefreq>weekly</changefreq>';
	echo '</url>';
}
?>
</urlset>
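I'm not familiar with the DataAccess\Company API, but filling $urls would presumably look something like the sketch below. The getAll() method and the Permalink property are hypothetical names for illustration; check the actual class for the real ones:

```php
<?php
// Sketch only: DataAccess\Company's real method and property names may differ.
// getAll() and $company->Permalink are assumptions, not the confirmed API.
$dataAccess = new DataAccess\Company();
$urls = array();

foreach ($dataAccess->getAll() as $company) {
	// htmlspecialchars() keeps the XML well-formed if a permalink
	// ever contains characters like & or <
	$urls[] = htmlspecialchars($company->Permalink, ENT_XML1);
}
```

With $urls populated this way, the foreach loop in sitemap.php above would emit one &lt;url&gt; entry per company without further changes.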
And the output looks like this:
@levogirar See the pull request #161
You'll have to test the sitemap generator yourself because I didn't. And I think the robots.txt file needs to be dynamic too... it helps bots crawl the website.
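If robots.txt does end up dynamic, one common approach (assuming Apache with mod_rewrite; not tested against this repo) is to rewrite the request to a PHP script. A minimal sketch:

```php
<?php
// robots.php — serve robots.txt content dynamically.
// Pair it with an .htaccess rule such as:
//   RewriteEngine On
//   RewriteRule ^robots\.txt$ robots.php [L]
header('Content-Type: text/plain; charset=utf-8');

// Read the host from the request so the sitemap URL stays
// correct across dev/staging/production environments.
$host = $_SERVER['HTTP_HOST'] ?? 'wuwana.com';

echo "User-agent: *\n";
echo "Allow: /\n";
echo 'Sitemap: https://' . $host . "/sitemap.xml\n";
```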