lukeautry / tsoa

Build OpenAPI-compliant REST APIs using TypeScript and Node
MIT License

Help requested: Using in-memory caching with TSOA #1518

Closed evakkuri closed 6 months ago

evakkuri commented 8 months ago


Expected Behavior

I'm looking to implement an in-memory cache for my TSOA API, and I'm wondering what the best way to implement this would be. I cannot keep the cache in a controller, as controllers are recreated for each request, and having multiple caches would probably not work anyway.

Would dependency injection be the way to go here? However, I'm looking for a lightweight solution: to start with, I would only use caching for a single endpoint, and I would not like to migrate the entire application to DI.

Would there be some way of just passing a singleton cache instance to controllers as a parameter? Or some other lightweight solution?
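For illustration, the singleton approach can be as simple as a module-scoped store: a Node module is evaluated only once, so every controller instance that imports it shares the same object, even though tsoa constructs a fresh controller per request. This is a hypothetical sketch (names like `userCache` and `getUser` are invented, and the real lookup is stubbed out), not tsoa API:

```typescript
// cacheStore.ts (hypothetical file): this Map is created once at module
// load and shared by all importers for the lifetime of the process.
const userCache = new Map<string, { name: string }>();

// Hypothetical controller method body showing the pattern, shown here
// without the tsoa route/method decorators:
async function getUser(id: string): Promise<{ name: string }> {
  const hit = userCache.get(id);
  if (hit) return hit; // served from the shared, process-wide cache

  const user = { name: 'user-' + id }; // stand-in for a real data lookup
  userCache.set(id, user);
  return user;
}
```

A real version would also want TTL-based expiry and a size bound, which is what a library like memory-cache-node provides.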

Current Behavior

Not clear how to share a common in-memory cache instance across requests.

Context (Environment)

Version of the library: 5.1.1
Version of NodeJS: 18.18.2

Detailed Description

Request for help with implementing in-memory caching for TSOA

github-actions[bot] commented 8 months ago

Hello there evakkuri 👋

Thank you for opening your very first issue in this project.

We will try to get back to you as soon as we can.👀

pquerner commented 7 months ago

Implement a cache decorator which stores your item in, or retrieves it from, the cache.

//edit

To elaborate a bit more, here is a cache decorator:

import App from '../app'; // unrelated to the decorator itself; see the note about header data below
import Cache from '../services/cache'; // thin service around "memory-cache-node"

const cache = Cache.instance;

// Method decorator: caches the resolved return value of the decorated
// method under a key derived from `id` and the call arguments.
export default (id: string, ttl = 0) => {
    return (target: unknown, propertyKey: string, descriptor: PropertyDescriptor) => {
        const original = descriptor.value;

        descriptor.value = async function (...args: any[]) {
            const key = JSON.stringify({ id, args });
            const ttl_key = key + '_ttl';

            if (!cache.hasItem(key)) {
                // Cache miss: run the original method and cache its result
                const result = await original.apply(this, args);
                if (!result?.success) {
                    // Don't cache failed results; return them unchanged
                    return result;
                }
                cache.storeExpiringItem(key, result, ttl);

                // Also store the absolute expiry time, so it can be reported
                // in a response header until the entry is automatically cleared
                const leeway = 2 * 1000; // extra time to make sure it's cleared from the cache
                const expiresAt = Date.now() + ttl * 1000 + leeway;
                cache.storeExpiringItem(ttl_key, expiresAt, ttl);
            } else {
                // Cache hit: record it so the controller's getHeaders() can expose it
                App.cacheUsed = true;
                App.cacheUsedTTL = Number(cache.retrieveItemValue(ttl_key));
            }

            return cache.retrieveItemValue(key);
        };
    };
};
// In the controller:
    @Cache(
        'unique_key',
        Controller.getCacheTTL() // TTL in seconds
    )

which works by using https://www.npmjs.com/package/memory-cache-node as in-memory cache.
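For reference, the `../services/cache` singleton imported above might look roughly like the following. This is a hypothetical stand-in: it mimics the memory-cache-node method names the decorator calls (`hasItem`, `storeExpiringItem`, `retrieveItemValue`) with a plain Map, so the shape of the service is visible without the dependency; the real service would delegate to a `MemoryCache` instance instead.

```typescript
// Hypothetical singleton wrapper mirroring the memory-cache-node API surface
// used by the decorator above. TTLs are in seconds, as in memory-cache-node.
class Cache {
  private static _instance: Cache;
  private store = new Map<string, { value: unknown; expiresAt: number }>();

  // Lazily created, process-wide instance
  static get instance(): Cache {
    if (!Cache._instance) {
      Cache._instance = new Cache();
    }
    return Cache._instance;
  }

  hasItem(key: string): boolean {
    const entry = this.store.get(key);
    if (!entry) return false;
    if (Date.now() > entry.expiresAt) {
      // Expired entries are evicted lazily on access
      this.store.delete(key);
      return false;
    }
    return true;
  }

  storeExpiringItem(key: string, value: unknown, ttlSeconds: number): void {
    this.store.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
  }

  retrieveItemValue(key: string): unknown {
    return this.hasItem(key) ? this.store.get(key)!.value : undefined;
  }
}
```

Unlike this sketch, memory-cache-node also sweeps expired items on a background interval rather than only evicting them lazily on access.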

Not shown in the decorator: to indicate that a cached response was served, and when the cache entry will be flushed, you have to override the getHeaders method of your Controller class. (That's what the static helpers App.cacheUsed and App.cacheUsedTTL are for.)
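That override might look roughly like this. It's a hypothetical, self-contained sketch: `BaseController` stands in for tsoa's `Controller` (which exposes `getHeaders()`) so the example runs without the framework, `App` stands in for the static helpers mentioned above, and the `X-Cache` / `X-Cache-Expires` header names are made up for illustration.

```typescript
// Minimal stand-in for tsoa's Controller, which exposes getHeaders()
class BaseController {
  getHeaders(): Record<string, string> {
    return {};
  }
}

// Stand-in for the App statics the decorator sets on a cache hit
class App {
  static cacheUsed = false;
  static cacheUsedTTL = 0; // absolute expiry time as a Unix timestamp (ms)
}

// Controllers extend this instead of tsoa's Controller directly, so every
// response served from the cache carries the extra headers
class CachingController extends BaseController {
  getHeaders(): Record<string, string> {
    const headers = super.getHeaders();
    if (App.cacheUsed) {
      headers['X-Cache'] = 'HIT';
      headers['X-Cache-Expires'] = new Date(App.cacheUsedTTL).toUTCString();
    }
    return headers;
  }
}
```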

github-actions[bot] commented 6 months ago

This issue is stale because it has been open 30 days with no activity. Remove the stale label or comment, or this will be closed in 5 days.