nuxt-modules / robots

Tame the robots crawling and indexing your Nuxt site.
https://nuxtseo.com/robots
MIT License

How to dynamically set the sitemap in robots.txt #44

Closed Cappuuuu closed 2 years ago

Cappuuuu commented 3 years ago

On the front end, I can get window.location.origin, as in this screenshot:

(Screenshot 2021-03-31, 11:10 AM)

but I want to set a dynamic sitemap URL in robots.txt via nuxt.config.js, e.g. robots: { UserAgent: '*', Sitemap: `${window.location.origin}/sitemap.xml` }. The window object does not exist there. Is there any good way to resolve this? Thanks!

Tymo93 commented 3 years ago

Why don't you use dotenv? You can set a variable to be your origin.
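
Something like this, as a rough sketch (BASE_URL is just an example variable name; it assumes the dotenv package is installed and a .env file exists):

// nuxt.config.js -- rough sketch; assumes a .env file containing e.g. BASE_URL=https://example.com
require('dotenv').config()

export default {
  modules: [
    ['@nuxtjs/robots', {
      UserAgent: '*',
      // read once when the config is evaluated, so no window object is needed
      Sitemap: `${process.env.BASE_URL}/sitemap.xml`
    }]
  ]
}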

TitanFighter commented 3 years ago
modules: [
    ...
    ['@nuxtjs/robots', {
      UserAgent: '*',
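      // req is the incoming HTTP request, so this value is computed per request on the server (see the static-mode caveat below)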
      Sitemap: (req) => `https://${req.headers.host}/sitemap.xml`
    }],
    ...
]
danielgroen commented 3 years ago

Did the trick for me! It would be nice to add this to the README.md, since right now it's kind of a hidden feature.

IsraelOrtuno commented 2 years ago

Be aware that this would not work with target: 'static' mode.

mklueh commented 2 years ago

@IsraelOrtuno could you please explain why it is not working for static mode?

IsraelOrtuno commented 2 years ago

@mklueh Statically generated sites won't run any logic on the server side (there is no server at all), and in this case the sitemap URL is generated by a function that also needs access to the request headers. For that case it's better to use an ENV variable, as @Tymo93 mentioned.

mklueh commented 2 years ago

@IsraelOrtuno I see, but isn't there a hook that gives you all the pages generated once the build is completed? I guess Nuxt knows which pages were touched anyway.
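
Something along these lines is what I have in mind, as a rough sketch (the hook names come from Nuxt 2's generate hooks; the output file name and the generator.distPath usage are assumptions for illustration):

// nuxt.config.js -- sketch only: collect the routes Nuxt generates and write them
// next to the static output, so a separate step could build a sitemap from them
const { writeFileSync } = require('fs')
const { join } = require('path')

const generatedRoutes = []

export default {
  target: 'static',
  hooks: {
    generate: {
      // called for every route that gets generated
      routeCreated({ route }) {
        generatedRoutes.push(route)
      },
      // called once generation has finished
      done(generator) {
        // generator.distPath points at the generate output directory ('dist' by default)
        writeFileSync(
          join(generator.distPath, 'generated-routes.json'),
          JSON.stringify(generatedRoutes, null, 2)
        )
      }
    }
  }
}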

IsraelOrtuno commented 2 years ago

Not sure about the hook, but robots.txt serves more purposes than adding the sitemap URL (which will likely be generated by your own app or a third-party module). The point here is that if you intend to have a dynamic sitemap URL (or any other dynamic property in your robots file), it has to run on SSR, because that function needs to be executed; otherwise you can make it static with an ENV var.