Open jplhomer opened 2 years ago
Related, should we add an in-memory cache by default in @shopify/hydrogen/platforms/node?
This comment here made me stop when I was going to try it 😅
I almost got this working with some nice utils from Miniflare!
// server.js
import {RedisStorage} from '@miniflare/storage-redis';
import IORedis from 'ioredis';
import {Cache} from '@miniflare/cache';
const redis = new IORedis({
  host: process.env.REDIS_URL,
  port: process.env.REDIS_PORT,
  password: process.env.REDIS_PASSWORD,
});

const redisStorage = new RedisStorage(redis, 'cache');
const cache = new Cache(redisStorage);
Problems arise due to a mismatch between the two Request types: Miniflare uses undici, while we use node-fetch. Once we make Node v16 a requirement, switching to undici becomes a possibility 👍
Stackblitz Node 16 - https://github.com/stackblitz/webcontainer-core/issues/560
@jplhomer Is there any progress on this? Our app lives in Node land.
It's a little bit hacky, but I managed to get this working by passing the request's URL instead of the request itself; undici will happily accept a string.
import {hydrogenMiddleware} from '@shopify/hydrogen/middleware';
import serveStatic from 'serve-static';
import compression from 'compression';
import bodyParser from 'body-parser';
import connect from 'connect';
import path from 'path';
import {RedisStorage} from '@miniflare/storage-redis';
import IORedis from 'ioredis';
import {Cache} from '@miniflare/cache';
const redis = new IORedis({
  host: process.env.REDIS_URL,
  port: process.env.REDIS_PORT,
  password: process.env.REDIS_PASSWORD,
});
const redisStorage = new RedisStorage(redis, 'cache');
// Work around the Request type mismatch by keying the cache on URL strings.
class HackyCache extends Cache {
  put = (req, res) => {
    return super.put(req.url, res);
  };
  match = (req, options) => {
    return super.match(req.url, options);
  };
  delete = (req, options) => {
    return super.delete(req.url, options);
  };
}
const redisCache = new HackyCache(redisStorage);
const port = process.env.PORT || 8080;

// Initialize your own server framework like connect
const app = connect();

// Add desired middlewares and handle static assets
app.use(compression());
app.use(serveStatic(path.resolve(__dirname, '../', 'client'), {index: false}));
app.use(bodyParser.raw({type: '*/*'}));

app.use(
  hydrogenMiddleware({
    getServerEntrypoint: () => import('./src/App.server'),
    indexTemplate: () => import('./dist/client/index.html?raw'),
    cache: redisCache,
  }),
);

app.listen(port, () => {
  console.log(`Hydrogen server running at http://localhost:${port}`);
});
@nattyg93 Ahh super cool — I'm going to check out your example and maybe steal it 😄
Something I noticed is that if Redis becomes unreachable, the app will crash. Now, I don't know if this is something you folks are likely to handle in Hydrogen, or whether you are of the opinion that it should be handled in the Cache implementation. As a workaround I did the following, but it's not particularly robust.
// Wrap every storage method in the same guard so that an unreachable
// Redis degrades to a cache miss (false) instead of crashing the app.
const guard = (method) =>
  async function (...args) {
    try {
      if (redis.status !== 'ready') {
        return false;
      }
      return await RedisStorage.prototype[method].apply(this, args);
    } catch (e) {
      console.log(e);
      return false;
    }
  };

class HandledRedisStorage extends RedisStorage {}

for (const method of [
  'has',
  'hasMany',
  'get',
  'getMany',
  'put',
  'putMany',
  'delete',
  'deleteMany',
  'list',
]) {
  HandledRedisStorage.prototype[method] = guard(method);
}
You can pass cache to Hydrogen's Node.js middleware, which should be an instance of the Cache API. However, we don't provide any resources for Node.js users to do so: they're on their own to discover this (probably due to a lack of docs), and there doesn't seem to be any Cache-interfaced Redis adapter in the wild.
This should be fairly trivial to implement, so we could provide an example. If we find it's super useful, we could even release it as a package, since Cache is a huge part of the Hydrogen framework story.
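As a starting point, a minimal in-memory adapter might look like the sketch below. This is an assumption-laden illustration, not an official API: the class name is made up, it only covers match/put/delete, and it assumes (as in the workaround above) that cache keys can be derived from the request URL.

```javascript
// Hypothetical in-memory Cache-interfaced adapter (illustrative only).
// Assumes Hydrogen's middleware calls match/put/delete with a Request
// (or URL string) and a Response, as the examples above suggest.
class InMemoryCache {
  constructor() {
    this.store = new Map();
  }

  // Derive a string key from a Request, URL object, or plain string.
  key(request) {
    return typeof request === 'string' ? request : request.url;
  }

  async put(request, response) {
    // Clone so the cached body can still be read on a later match.
    this.store.set(this.key(request), response.clone());
  }

  async match(request) {
    const cached = this.store.get(this.key(request));
    return cached ? cached.clone() : undefined;
  }

  async delete(request) {
    return this.store.delete(this.key(request));
  }
}
```

A real version would also need to honor Cache-Control headers and evict entries; this sketch ignores expiry entirely and will grow without bound.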