better-salmon opened this issue 10 months ago
Any update on this issue?
Edit by maintainers: Comment was automatically minimized because it was considered unhelpful. (If you think this was a mistake, let us know.) Please only comment if it adds context to the issue. If you want to express that you have the same problem, use the 👍 upvote on the issue description or subscribe to the issue for updates. Thanks!
Does this mean that custom cache handlers can't be used with the Pages Router at all until this issue is fixed?
@iscekic hello! Even the latest Next.js 14.1.1-canary.7 fails to serve pre-rendered pages generated by the Pages Router when using a custom cache handler from the current docs example. You can use the App Router, since it works in this case. Or, if you have to use the Pages Router, look at the @neshca/cache-handler library: it overcomes this issue when utilizing the lib's default useFileSystem option.
So if I'm using the Pages Router, there's no point in using it at all?
I tried getting it to work, but I ran into problems.
const { IncrementalCache } = require("@neshca/cache-handler");
const createRedisCache = require("@neshca/cache-handler/redis-strings").default;
const Redis = require("ioredis");

IncrementalCache.onCreation(async () => {
  const url = `redis://${process.env.REDIS_URL}`;
  const client = new Redis(url);

  const redisCache = createRedisCache({
    client,
    useTtl: true,
    timeoutMs: 5000,
  });

  return {
    cache: [redisCache],
    useFileSystem: true,
  };
});

module.exports = IncrementalCache;
const { IncrementalCache } = require("@neshca/cache-handler");
const Redis = require("ioredis");

IncrementalCache.onCreation(async () => {
  const url = `redis://${process.env.REDIS_URL}`;
  const client = new Redis(url);

  const redisCache = {
    name: "redis-cache",
    async get(key) {
      console.log("requesting", key);
      const result = await client.get(key);
      return JSON.parse(result);
    },
    async set(key, value, maxAge) {
      console.log("setting", key, value, maxAge);
      await client.set(key, JSON.stringify(value), "EX", maxAge);
    },
  };

  return {
    cache: [redisCache],
    useFileSystem: true,
  };
});

module.exports = IncrementalCache;
Both of these configs behave the same.
In dev mode (with debug mode on), I'm getting:
using filesystem cache handler
not using memory store for fetch cache
Attempting to build with next build fails, so I can't test:
Error: NextRouter was not mounted. https://nextjs.org/docs/messages/next-router-not-mounted
at h (/Users/igor/Projects/fireside-next-app/build/server/chunks/9218.js:2795:25075)
at o (/Users/igor/Projects/fireside-next-app/build/server/chunks/3034.js:1:15919)
at _ (/Users/igor/Projects/fireside-next-app/build/server/chunks/3034.js:1:11980)
at renderWithHooks (/Users/igor/Projects/fireside-next-app/node_modules/react-dom/cjs/react-dom-server.browser.development.js:5658:16)
at renderIndeterminateComponent (/Users/igor/Projects/fireside-next-app/node_modules/react-dom/cjs/react-dom-server.browser.development.js:5731:15)
at renderElement (/Users/igor/Projects/fireside-next-app/node_modules/react-dom/cjs/react-dom-server.browser.development.js:5946:7)
at renderNodeDestructiveImpl (/Users/igor/Projects/fireside-next-app/node_modules/react-dom/cjs/react-dom-server.browser.development.js:6104:11)
at renderNodeDestructive (/Users/igor/Projects/fireside-next-app/node_modules/react-dom/cjs/react-dom-server.browser.development.js:6076:14)
at renderNode (/Users/igor/Projects/fireside-next-app/node_modules/react-dom/cjs/react-dom-server.browser.development.js:6259:12)
at renderChildrenArray (/Users/igor/Projects/fireside-next-app/node_modules/react-dom/cjs/react-dom-server.browser.development.js:6211:7)
✓ Generating static pages (10/10)
Did you add the following code to the next.config.js file? Notice that a custom cacheHandler will not work in dev mode.
const nextConfig = {
  cacheHandler:
    process.env.NODE_ENV === 'production'
      ? require.resolve('./cache-handler.js')
      : undefined,
  // Use the `experimental` option instead of the `cacheHandler` property when using Next.js versions from 13.5.1 to 14.0.4
  /* experimental: {
    incrementalCacheHandlerPath:
      process.env.NODE_ENV === 'production' ? require.resolve('./cache-handler.js') : undefined,
  }, */
};
To use a custom cacheHandler, build and run a production Next.js server:
npm run build
npm run start
If you see "using filesystem cache handler", you are using the default Next.js cache. When using @neshca/cache-handler, you will see this log instead: "using custom cache handler @neshca/cache-handler with 1 Handlers and file system caching". Make sure require.resolve('./cache-handler.js') points to the correct path.

Ah, I did forget the next config. Once I set the cacheHandler property, I did get repeated logs of:
using custom cache handler @neshca/cache-handler with 1 Handler and file system caching
And I see it creating entries in redis, so it seems to be working. Here's the config I used:
const { IncrementalCache } = require("@neshca/cache-handler");
const {
  reviveFromBase64Representation,
  replaceJsonWithBase64,
} = require("@neshca/json-replacer-reviver");
const noop = require("lodash/noop");
const Redis = require("ioredis");

const REVALIDATED_TAGS_KEY = "sharedRevalidatedTags";

IncrementalCache.onCreation(async () => {
  const url = `redis://${process.env.REDIS_URL}`;
  const client = new Redis(url);

  function assertClientIsReady() {
    if (client.status !== "ready") {
      throw new Error("redis client is not ready (cache-handler)");
    }
  }

  const redisCache = {
    name: "redis-cache",
    async get(key) {
      assertClientIsReady();
      const result = await client.get(key);
      if (!result) {
        return null;
      }
      return JSON.parse(result, reviveFromBase64Representation);
    },
    async set(key, value, ttl) {
      assertClientIsReady();
      const ttlArgs = typeof ttl === "number" ? ["EX", ttl] : [];
      await client.set(
        key,
        JSON.stringify(value, replaceJsonWithBase64),
        ...ttlArgs,
      );
    },
    async getRevalidatedTags() {
      assertClientIsReady();
      const sharedRevalidatedTags = await client.hgetall(REVALIDATED_TAGS_KEY);
      const entries = Object.entries(sharedRevalidatedTags);
      const revalidatedTags = entries.reduce((acc, [tag, revalidatedAt]) => {
        acc[tag] = Number(revalidatedAt);
        return acc;
      }, {});
      return revalidatedTags;
    },
    async revalidateTag(tag, revalidatedAt) {
      assertClientIsReady();
      await client.hset(REVALIDATED_TAGS_KEY, {
        [tag]: revalidatedAt,
      });
    },
  };

  return {
    cache: [redisCache],
    useFileSystem: true,
  };
});

module.exports = IncrementalCache;
Thanks for the assistance @better-salmon
You are welcome! If you have any questions or issues with @neshca/cache-handler, open a discussion or an issue in the lib's repository.
@better-salmon does this issue apply to cases where @neshca/cache-handler is used and useFileSystem is set to true?
Hello @MauriceArikoglu! By default, the useFileSystem option is set to true in @neshca/cache-handler. This option allows the pre-rendered HTML file to be read from the file system, which helps overcome the bug. However, the bug occurs when using a custom cache handler that doesn't read from the file system, such as the example from the Next.js docs or @neshca/cache-handler with useFileSystem set to false. To fix this issue, the Next.js server must perform a one-time server-side render of the page, serve it to the client, and put it in the custom cache. I hope this explanation makes things more straightforward!
@better-salmon does this also affect server-side rendering in general? Or is the cache ignored and only used in SSG-scenarios with ISR?
The cache is used for SSG/ISR pages in the Pages and App Router, fetch call results, and unstable_cache callback calls in the App Router. Server-side rendered pages, in general, are not affected by the cache in any way; such pages are marked as dynamic, and Next.js doesn't cache them.
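As a conceptual illustration of the unstable_cache part: the sketch below is not Next.js internals, it only mimics the observable behavior of wrapping an async function so that repeated calls within a revalidate window reuse the stored result.

```javascript
// Conceptual sketch (NOT the real next/cache implementation): wrap an
// async function so repeated calls within `revalidate` seconds reuse
// the stored result instead of re-running the function.
function sketchUnstableCache(fn, keyParts, { revalidate }) {
  const store = new Map(); // key -> { value, storedAt }
  return async (...args) => {
    const key = JSON.stringify([keyParts, args]);
    const entry = store.get(key);
    if (entry && (Date.now() - entry.storedAt) / 1000 < revalidate) {
      return entry.value; // still fresh: serve from the cache
    }
    const value = await fn(...args);
    store.set(key, { value, storedAt: Date.now() });
    return value;
  };
}
```

The real API has the same shape: const getData = unstable_cache(fn, keyParts, { revalidate, tags }), imported from 'next/cache'; in production its entries go through the configured cache handler instead of an in-process Map.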
@better-salmon I would love to brainstorm about a caching implementation for SSR pages. SSG is not a viable option for us, since our build step is isolated and can't access the backend. We do have SSR pages that could be cached, as they have a TTL of > 1h with infrequent changes to data. I never understood Next's approach of only allowing static pages at build time. (I know there are workarounds, but they are not really feasible for a big app, like generating with mock data at build time and revalidating post-deploy.)
Simply set fallback: true or fallback: 'blocking' to fix this issue. The problem here is that your cache is in memory: Next.js sets the cache during build, but it gets reset when you start the server.
This issue should be closed.
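For context, the fallback setting lives in getStaticPaths of the dynamic route. Below is a minimal sketch of the data-fetching half, written in CommonJS style so it stands alone; a real pages/posts/[slug].js would use ESM exports and also export a page component, and the props here are placeholders rather than a real data source.

```javascript
// Sketch of pages/posts/[slug].js data fetching (Pages Router).
async function getStaticPaths() {
  return {
    paths: [], // pre-render nothing at build time
    // 'blocking': unknown paths are server-rendered on first request,
    // then the result is stored in the incremental cache.
    fallback: 'blocking',
  };
}

async function getStaticProps({ params }) {
  return {
    props: { slug: params.slug }, // placeholder props
    revalidate: 3600, // ISR: regenerate at most once per hour
  };
}

module.exports = { getStaticPaths, getStaticProps };
```

With fallback: 'blocking', a path that is missing from the cache is rendered on demand instead of producing a 404, which is the workaround suggested in the comment above.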
I get the following debug message when using the Redis cache. What could be wrong with the configuration?
NEXT_PRIVATE_DEBUG_CACHE=1 npm run start
> my-web-app@1.0.0-development start
> next start
▲ Next.js 14.2.3
- Local: http://localhost:3000
✓ Starting...
✓ Ready in 640ms
using custom cache handler @neshca/cache-handler is not configured yet
Here is my cache-handler.js
const { CacheHandler } = require('@neshca/cache-handler');
const createRedisHandler = require('@neshca/cache-handler/redis-stack').default;
const createLruHandler = require('@neshca/cache-handler/local-lru').default;
const { createClient } = require('redis');
const { PHASE_PRODUCTION_BUILD } = require('next/constants');

CacheHandler.onCreation(async () => {
  let client;

  try {
    // Create a Redis client.
    client = createClient({
      url: 'redis://localhost:6379',
    });

    // Redis won't work without error handling.
    client.on('error', () => {});
  } catch (error) {
    console.warn('Failed to create Redis client:', error);
  }

  if (client) {
    try {
      console.info('Connecting Redis client...');
      // Wait for the client to connect.
      // Caveat: This will block the server from starting until the client is connected.
      // And there is no timeout. Make your own timeout if needed.
      await client.connect();
      console.info('Redis client connected.');
    } catch (error) {
      console.warn('Failed to connect Redis client:', error);
      console.warn('Disconnecting the Redis client...');
      // Try to disconnect the client to stop it from reconnecting.
      client
        .disconnect()
        .then(() => {
          console.info('Redis client disconnected.');
        })
        .catch(() => {
          console.warn(
            'Failed to quit the Redis client after failing to connect.',
          );
        });
    }
  }

  /** @type {import("@neshca/cache-handler").Handler | null} */
  let handler;

  console.info('The redis client is ready: ', client?.isReady);

  if (client?.isReady) {
    // Create the `redis-stack` Handler if the client is available and connected.
    handler = await createRedisHandler({
      client,
      keyPrefix: 'prefix:',
      timeoutMs: 1000,
    });
  } else {
    // Fallback to LRU handler if Redis client is not available.
    // The application will still work, but the cache will be in memory only and not shared.
    handler = createLruHandler();
    console.warn(
      'Falling back to LRU handler because Redis client is not available.',
    );
  }

  return {
    handlers: [handler],
  };
});

module.exports = CacheHandler;
and I have added the following configuration in next.config.js:
cacheHandler:
  process.env.NODE_ENV === 'production'
    ? require.resolve('./cache-handler.js')
    : undefined,
@RishikeshDarandale I got the same issue. Have you solved your issue?
@cbou, I have written the custom handler myself as below and explicitly used fetch (previously the apollo-graphql client with node-fetch) for server-side GraphQL calls.
import Redis from 'ioredis';

const redis = new Redis();

class CacheHandler {
  constructor(options) {
    this.options = options;
  }

  static #debug = typeof process.env.NEXT_PRIVATE_DEBUG_CACHE !== 'undefined';

  async get(key) {
    key = 'next:' + key;
    if (CacheHandler.#debug) {
      console.debug('Get the key from the cache: ', key);
    }
    let data = await redis?.get(key);
    if (data) {
      data = JSON.parse(data);
      if (CacheHandler.#debug) {
        console.debug('Cache HIT for key', key);
      }
    } else {
      if (CacheHandler.#debug) {
        console.debug('Cache MISS for key', key);
      }
    }
    return data;
  }

  async set(key, data, ctx) {
    key = 'next:' + key;
    if (CacheHandler.#debug) {
      console.debug('Set the key from the cache: ', key, ctx);
    }
    await redis?.setex(
      key,
      ctx.revalidate,
      JSON.stringify({
        value: data,
        lastModified: Date.now(),
        tags: ctx.tags,
      }),
    );
    if (CacheHandler.#debug) {
      console.debug('The value is set in the cache for: ', key);
    }
  }

  async revalidateTag(tag) {
    const stream = redis.scanStream({ match: `next:*` });
    stream.on('data', async (keys) => {
      await Promise.all(
        keys.map(async (key) => {
          const value = await redis.get(key);
          if (value && JSON.parse(value).tags.includes(tag)) {
            await redis.del(key);
            if (CacheHandler.#debug) {
              console.debug('The key is deleted from the cache: ', key, value);
            }
          }
        }),
      );
    });
  }
}

export default CacheHandler;
Link to the code that reproduces this issue
https://github.com/better-salmon/custom-cache-404-for-ssg-pages
To Reproduce
npm run build && npm run start

Current vs. Expected behavior
Expected behavior:
The page is loaded with content and status code 200.
Actual behavior:
404 page is shown.
Verify canary release
Provide environment information
Which area(s) are affected? (Select all that apply)
Not sure
Additional context
In the examples, the pre-rendered static HTML is never read from disk; when the cache is empty, the app fails to serve such pages and shows a 404 page instead.