# remix-sitemap

Sitemap generator for Remix applications, with optional `robots.txt` generation.
## Installation

```sh
npm i remix-sitemap
```
## Usage

### Runtime Generation

Set up the generator in `entry.server.tsx`:

```tsx
// entry.server.tsx
import { renderToString } from 'react-dom/server';
import { RemixServer } from '@remix-run/react';
import type { EntryContext } from '@remix-run/node';
import { createSitemapGenerator } from 'remix-sitemap';

// Step 1. setup the generator
const { isSitemapUrl, sitemap } = createSitemapGenerator({
  siteUrl: 'https://example.com',
  generateRobotsTxt: true
  // configure other things here
});

export default async function handleRequest(
  request: Request,
  responseStatusCode: number,
  responseHeaders: Headers,
  remixContext: EntryContext
) {
  // Step 2. add the sitemap response
  if (isSitemapUrl(request)) {
    return await sitemap(request, remixContext);
  }

  let markup = renderToString(
    <RemixServer context={remixContext} url={request.url} />
  );

  responseHeaders.set('Content-Type', 'text/html');

  return new Response('<!DOCTYPE html>' + markup, {
    status: responseStatusCode,
    headers: responseHeaders
  });
}
```
### Build Time Generation

Create a `remix-sitemap.config.js` file at the project root:

```js
/** @type {import('remix-sitemap').Config} */
module.exports = {
  siteUrl: 'https://example.com',
  generateRobotsTxt: true
  // configure other things here
}
```
Then add a script to `package.json` that runs `remix-sitemap` after the build. For example, if you are using `npm-run-all`:

```json
{
  "scripts": {
    "build": "npm-run-all build:*",
    "build:remix": "remix build",
    "build:sitemap": "remix-sitemap"
  }
}
```
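If you are not using `npm-run-all`, a plain script that simply chains the two commands should work as well; a minimal sketch:

```json
{
  "scripts": {
    "build": "remix build && remix-sitemap"
  }
}
```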
## Sitemap Function

Routes can export a `sitemap` function to customize their entries, for example to generate entries for dynamic routes. If you are using build time generation, the `request` argument will be empty.
```ts
// app/routes/posts.$slug.tsx
import type { SitemapFunction } from 'remix-sitemap';

export const sitemap: SitemapFunction = async ({ config, request }) => {
  const posts = await getPosts();

  return posts.map(post => ({
    loc: `/posts/${post.slug}`,
    lastmod: post.updatedAt,
    exclude: post.isDraft, // exclude this entry
    // alternateRefs act only on this loc
    alternateRefs: [
      {
        href: `${config.siteUrl}/en/posts/${post.slug}`,
        absolute: true,
        hreflang: 'en'
      },
      {
        href: `${config.siteUrl}/es`,
        hreflang: 'es'
      }
    ]
  }));
};
```
### Exclude Route
```ts
// app/routes/private.tsx
import type { SitemapFunction } from 'remix-sitemap';

export const sitemap: SitemapFunction = () => ({
  exclude: true
});
```
### Google: News, Image and Video

A URL set can also contain the additional sitemap types defined by Google: news, image and video. You can add these in the `sitemap` function through the `news`, `images` or `videos` properties.
```ts
export const sitemap: SitemapFunction = () => ({
  images: [{ loc: 'https://example.com/example.jpg' }],
  news: [{
    title: 'Random News',
    date: '2023-03-15',
    publication: {
      name: 'The Example Times',
      language: 'en'
    }
  }]
});
```
## Splitting the Sitemap

If you have a lot of URLs, you can split the sitemap into multiple files by setting the `size` property in the config. This is only available with build time generation.

```js
/** @type {import('remix-sitemap').Config} */
module.exports = {
  siteUrl: 'https://example.com',
  size: 10000
}
```
## Caching

There are two ways to cache the sitemap. The first is the `Cache-Control` header, which is only available with runtime generation:

```ts
createSitemapGenerator({
  siteUrl: 'https://example.com',
  headers: {
    'Cache-Control': 'max-age=3600'
  }
})
```

The second is the `cache` property in the config:

```ts
createSitemapGenerator({
  siteUrl: 'https://example.com',
  cache: {
    get: async () => {
      return await redis.get('sitemap') || null;
    },
    set: async (sitemap) => {
      await redis.set('sitemap', sitemap, 'EX', 3600);
    }
  }
})
```
## Config

The available options are listed below; a combined example follows the list.

- `siteUrl`: the website base URL.
- `autoLastmod = true`: (optional) add a `<lastmod />` property with the current date.
- `priority = 0.7`: (optional) default priority for all entries.
- `changefreq = 'daily'`: (optional) default changefreq for all entries.
- `format = false`: (optional) format the sitemap output for easier reading.
- `alternateRefs = []`: (optional) default multi-language support by unique URL for all entries.
- `generateRobotsTxt = false`: (optional) generate a `robots.txt` file.
- `robotsTxtOptions`: (optional) options for generating `robots.txt`.
- `rateLimit`: (optional) limits the number of `sitemap` functions that can execute at once.

**Runtime only**

- `headers = {}`: (optional) headers for the sitemap and `robots.txt` responses.
- `cache`: (optional) cache the sitemap.

**Build time only**

- `generateIndexSitemap = true`: (optional) generate an index sitemap.
- `sitemapBaseFileName = 'sitemap'`: (optional) the name of the generated sitemap file before the file extension.
- `outDir = 'public'`: (optional) the directory in which to create the sitemap files.
- `size = 50000`: (optional) maximum number of entries per sitemap file.
- `policies = []`: (optional) policies for generating `robots.txt`.
- `additionalSitemaps = []`: (optional) additional sitemaps to add to `robots.txt`.
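As a reference, here is a sketch of a config combining several of the options above. The values simply mirror the defaults and examples listed in this document, and the config-level `alternateRefs` entries assume the same `{ href, hreflang }` shape as the per-entry `alternateRefs` shown earlier.

```js
// remix-sitemap.config.js (illustrative only; adjust the values for your site)
/** @type {import('remix-sitemap').Config} */
module.exports = {
  siteUrl: 'https://example.com',   // required: base URL of the website
  autoLastmod: true,                // add <lastmod /> with the current date
  priority: 0.7,                    // default priority for all entries
  changefreq: 'daily',              // default changefreq for all entries
  format: true,                     // pretty-print the generated XML
  generateRobotsTxt: true,          // also emit a robots.txt file
  // assumed to use the same shape as the per-entry alternateRefs above
  alternateRefs: [
    { href: 'https://example.com/en', hreflang: 'en' },
    { href: 'https://example.com/es', hreflang: 'es' }
  ],
  // build time only options
  generateIndexSitemap: true,
  sitemapBaseFileName: 'sitemap',
  outDir: 'public',
  size: 50000
};
```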
## Author

**Fedeya** <hello@fedeminaya.com>
## Contributing

Contributions, issues and feature requests are welcome! Feel free to check the issues page.

## Show your support

Give a ⭐️ if this project helped you!