regisphilibert opened this issue 1 year ago
I ran into this issue as well and built a caching layer using lowdb. I replaced all the Sanity calls with cachedSanityQuery, passing the query and params as arguments. It creates a UUID from the query and params, checks whether the data already exists in lowdb, and otherwise makes a request to Sanity and stores the result in lowdb for future requests. This drastically improved build times and developer experience for our project, and also dramatically reduced the number of requests to the API.

I replaced all sanity.fetch() calls with this; you might want to lower CACHE_MAX_AGE for development if you change content more often.
```ts
/* eslint-disable no-console */
import { v5 as uuidv5 } from "uuid";
import { dirname, join } from "path";
import { fileURLToPath } from "url";
import { createClient } from "@sanity/client";
import { sanityConfig } from "../../sanity.config.mjs";
import { Low, Memory } from "lowdb";
import { JSONFile } from "lowdb/node";

// A cache entry stores either the raw array result (under `dataArray`) or
// the spread properties of a single result object, plus cache metadata.
interface CacheEntry {
  cacheKey: string;
  createdCacheAt: number;
  dataArray?: unknown[];
  [key: string]: unknown;
}

interface Cache {
  queries: CacheEntry[];
}

const CACHE_MAX_AGE = 1000 * 30; // 30 seconds
const CACHE_ID_NAMESPACE = "1b671a64-40d5-491e-99b0-da01ff1f3341";

const dbDir = dirname(fileURLToPath(import.meta.url));
const file = join(dbDir, "db.json");

// Set debug to true to persist the cache to db.json for inspection;
// otherwise the cache lives in memory for the duration of the build.
const debug = false;
const adapter = debug ? new JSONFile<Cache>(file) : new Memory<Cache>();
const db = new Low<Cache>(adapter);
const client = createClient(sanityConfig);

await db.read();
db.data ||= { queries: [] };

const cachedSanityQuery = async (
  query: string,
  params: Record<string, string>
) => {
  // Deterministic key: whitespace-insensitive query plus serialized params.
  const contentHash = uuidv5(
    query.replace(/\s/g, "") + JSON.stringify(params),
    CACHE_ID_NAMESPACE
  );

  const cachedData = db.data?.queries.find(
    (entry) =>
      entry.cacheKey === contentHash &&
      entry.createdCacheAt + CACHE_MAX_AGE >= Date.now()
  );

  if (!cachedData) {
    // Cache miss (or expired entry): fetch from Sanity and store the result.
    const data = await client.fetch(query, params);
    const newData: CacheEntry = Array.isArray(data)
      ? { cacheKey: contentHash, createdCacheAt: Date.now(), dataArray: data }
      : { cacheKey: contentHash, createdCacheAt: Date.now(), ...data };
    db.data?.queries.push(newData);
    await db.write();
    return data;
  }

  // Cache hit. Note that for non-array results the stored entry (including
  // the cacheKey and createdCacheAt fields) is returned as-is.
  if ("dataArray" in cachedData) {
    return cachedData.dataArray;
  }
  return cachedData;
};

export { cachedSanityQuery };
```
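One property of this approach worth noting: because whitespace is stripped from the query before hashing, reformatted versions of the same GROQ query share a cache entry. A minimal sketch of that key derivation, using Node's built-in crypto in place of uuidv5 (the hash function here is illustrative; the normalization is the point):

```typescript
// Sketch: the cache key ignores whitespace differences in the query, so a
// reformatted query still hits the same cache entry. sha1 stands in for
// uuidv5 purely for illustration.
import { createHash } from "crypto";

const cacheKeyFor = (query: string, params: Record<string, string>): string =>
  createHash("sha1")
    .update(query.replace(/\s/g, "") + JSON.stringify(params))
    .digest("hex");

const a = cacheKeyFor('*[_type == "siteSettings"][0]', {});
const b = cacheKeyFor('*[_type=="siteSettings"]  [0]', {}); // same query, different spacing
console.log(a === b); // true: whitespace is stripped before hashing
```

One caveat: JSON.stringify of the params object is key-order sensitive, so logically identical params passed in a different property order would produce a different cache key.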
It would be great if the plugin handled caching of API responses the way Astro Image does.
Use case
I've stored some site metadata in a Sanity dataset (site title, site description, site default image, etc.) so that SEO data or the footer can be populated with it.
Now when I invoke this function in the footer component to fetch the site description, it dramatically slows down build time. I'm using useCdn: true, which should speed up the response, but it seems each page still needs to wait for it in order to be built (~200 ms). This also drastically increases API/CDN usage stats.
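For this narrower use case (one small settings document fetched by every page), even a plain in-memory memoization per build process may be enough, and it also dedupes concurrent requests so a thousand pages trigger a single fetch. A hedged sketch, not the plugin's API: `fetchFn` stands in for `client.fetch`, and all names here are illustrative.

```typescript
// Module-level memoization for the lifetime of one build process.
// Identical queries share a single in-flight promise.
const inflight = new Map<string, Promise<unknown>>();

const memoizedQuery = (
  fetchFn: (q: string) => Promise<unknown>,
  query: string
): Promise<unknown> => {
  const key = query.replace(/\s/g, ""); // whitespace-insensitive, as above
  let hit = inflight.get(key);
  if (!hit) {
    hit = fetchFn(query); // only the first caller starts a request
    inflight.set(key, hit);
  }
  return hit;
};

// Demo with a fake fetch: two callers, one request.
let calls = 0;
const fakeFetch = async (_q: string) => {
  calls++;
  return { description: "My site" };
};

const p1 = memoizedQuery(fakeFetch, '*[_type == "siteSettings"][0]');
const p2 = memoizedQuery(fakeFetch, '*[_type == "siteSettings"][0]');
console.log(p1 === p2); // true: second caller reuses the in-flight promise
console.log(calls); // 1
```

Unlike the lowdb version there is no TTL here, which is fine for a single build but would need the CACHE_MAX_AGE idea added back for a long-running dev server.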