Closed. Tkachez closed this issue 2 months ago.
> Are you able to provide a minimal reproduction? It seems like you're not deploying your site using a production `NODE_ENV` (the default)
I got the same issue and searched for two hours for why it didn't work and whether I had made a configuration error. What I found out is that this issue relates to #137. When downgrading only the robots package to v4.0.2, the meta tag works again.
see: https://github.com/nuxt-modules/robots/issues/137#issuecomment-2299684308
{
  "devDependencies": {
    "@nuxtjs/seo": "^2.0.0-rc.18"
  },
  "overrides": {
    "@nuxtjs/robots": "4.0.2"
  }
}
> Are you able to provide a minimal reproduction? It seems like you're not deploying your site using a production `NODE_ENV` (the default)
Unfortunately I can't, but I've resolved the issue by downgrading the robots package to version ^3.0.0. In this version everything works by default, without extra steps. It seems like the documentation lacks a step, or perhaps the step can be skipped by selecting a Nuxt config option.
Regarding a possible reproduction: I use Vercel as my hosting provider with the default deployment process. Maybe this helps 👍
This is my config:
site: {
  indexable: process.env.VERCEL_ENV === 'production',
  redirectToCanonicalSiteUrl: true,
  url: 'https://www.domain.com',
  name: 'My project name',
  description: 'My project description',
  defaultLocale: 'en',
}
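For hosts other than Vercel, the same gating can be sketched with `NODE_ENV` instead, since `VERCEL_ENV` is only set on Vercel deployments. This is an illustrative fragment, not the poster's exact config; the URL and name are placeholders:

```typescript
// nuxt.config.ts — minimal sketch; VERCEL_ENV is Vercel-specific,
// so on other hosts NODE_ENV is the usual production signal.
export default defineNuxtConfig({
  site: {
    // Only allow indexing in real production deployments.
    indexable: process.env.NODE_ENV === 'production',
    url: 'https://www.example.com',
    name: 'Example site',
  },
})
```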
Sorry for the confusion; there was a bug in the patch version, so please try the latest version.
I'm a bit confused. What issue is fixed? I've tried updating to version 4.1.3, but it still doesn't fix my issue.
As you stated in your issue, the `<meta>` robots tag was always the following:
<meta name="robots" content="noindex, nofollow">
The issue was caused by a new release of @nuxtjs/robots. @harlan-zw pushed a bugfix and new releases of @nuxtjs/robots and @nuxtjs/seo that fix the issue.
At least on my project, everything works fine now using routeRules to disable some pages from indexing.
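For reference, the routeRules approach mentioned above looks roughly like this. It's a sketch, not the commenter's actual config; the route paths are hypothetical examples:

```typescript
// nuxt.config.ts — sketch of per-route indexing control via routeRules.
export default defineNuxtConfig({
  routeRules: {
    // Hypothetical private section: the robots module should emit
    // <meta name="robots" content="noindex, nofollow"> for these pages.
    '/admin/**': { robots: 'noindex, nofollow' },
  },
})
```

Pages without a matching rule keep the default indexable behavior, which is usually what you want for the public site.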
Thanks, I'll try via routeRules.
If you still have issues with the latest release please provide a minimal reproduction and I'll do my best to help.
Package version: 4.0.2
So the issue is that the docs don't give any valid config for including every site page. I've tried just adding the module to the modules array to see if that works by default, but that gives me a noindex, nofollow meta tag.
I tried adding a config like this
If I look into the Nuxt DevTools it is OK, and my robots meta tag is as follows:
<meta name="robots" content="noindex, nofollow" data-hint="disabled in development">
If I run a production build, my tag is:
<meta name="robots" content="noindex, nofollow">
I've also tried metaTag: false, but I guess it's a bad idea. My question is: how do I just have my pages crawled by robots?
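For anyone landing here with the same question: based on this thread, once the fixed module versions are installed and the app runs with a production `NODE_ENV`, indexing should work without extra robots configuration. A minimal sketch (the URL is a placeholder, and this assumes the post-fix releases):

```typescript
// nuxt.config.ts — minimal sketch; no metaTag: false or other robots
// overrides should be needed once NODE_ENV is production and the
// patched @nuxtjs/seo / @nuxtjs/robots releases are installed.
export default defineNuxtConfig({
  modules: ['@nuxtjs/seo'],
  site: {
    url: 'https://www.example.com',
  },
})
```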