koishijs / koishi

Cross-platform chatbot framework made with love
https://koishi.chat
MIT License

[RFC] Cache API #237

Closed shigma closed 2 years ago

shigma commented 3 years ago

I have been putting this off for quite a while; now it's about time.

koishi-core internally uses lru-cache as a cache to improve performance. However, this implementation has some problems:

To address this, I plan to design a Cache API and ship it in an upcoming version.

The Cache interface

The Cache API is designed entirely around a table-key-value architecture, so the interface is fairly simple:

ctx.cache.get(table, key)
ctx.cache.set(table, key, value)

But user data may be queried by different keys. What then? We split the user table into multiple sub-tables, one per key:

const id = ctx.cache.get('onebot', '123456789')  // get the internal user ID
const user = ctx.cache.get('user', id)           // get the user data
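The two-level lookup above can be sketched with plain nested Maps. The `MemoryCache` class below is a hypothetical illustration of the table-key-value design, not koishi-core's actual implementation:

```typescript
// Hypothetical in-memory sketch of the table-key-value design.
class MemoryCache {
  private tables = new Map<string, Map<string, any>>()

  get(table: string, key: string) {
    return this.tables.get(table)?.get(key)
  }

  set(table: string, key: string, value: any) {
    if (!this.tables.has(table)) this.tables.set(table, new Map())
    this.tables.get(table)!.set(key, value)
  }
}

// the two-step user lookup from above
const cache = new MemoryCache()
cache.set('onebot', '123456789', 'user-42')
cache.set('user', 'user-42', { name: 'Alice' })
const id = cache.get('onebot', '123456789')
const user = cache.get('user', id)
```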

Defining cache tables

Plugins can define their own cache tables to access the Cache API.

// e.g. an image-search plugin wants to cache saucenao search results
ctx.cache.extend('saucenao', options)
ctx.cache.extend('saucenao', options)

const value = ctx.cache.get('saucenao', 'https://image-url')
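The RFC does not spell out what `options` contains; one plausible shape is a per-table TTL. The sketch below assumes a hypothetical `maxAge` field and is an illustration only:

```typescript
// Hypothetical sketch: per-table options with a maxAge TTL (field name assumed).
interface CacheTableOptions {
  maxAge?: number  // milliseconds before an entry expires
}

interface Entry { value: any; expires: number }

class TtlCache {
  private tables = new Map<string, { options: CacheTableOptions; data: Map<string, Entry> }>()

  extend(table: string, options: CacheTableOptions = {}) {
    this.tables.set(table, { options, data: new Map() })
  }

  set(table: string, key: string, value: any) {
    const t = this.tables.get(table)
    if (!t) return
    const expires = t.options.maxAge ? Date.now() + t.options.maxAge : Infinity
    t.data.set(key, { value, expires })
  }

  get(table: string, key: string) {
    const entry = this.tables.get(table)?.data.get(key)
    if (!entry || entry.expires < Date.now()) return undefined
    return entry.value
  }
}
```

Under this assumption, `ctx.cache.extend('saucenao', { maxAge: 3600_000 })` would cap each cached search result's lifetime at one hour.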

Implementations

To ensure backward compatibility, koishi-core will still ship with a built-in lru-cache as the internal implementation. At the same time, other Cache API implementations can be provided through plugins. The advantage of plugins is that they can use dedicated databases to improve storage capacity and to support persistence (when the application reloads, the in-memory cache is cleared, but data in a local database is not). The implementations currently planned for initial support are:
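The built-in LRU behavior can be approximated with a single Map, since JavaScript Maps iterate in insertion order. This is a simplified sketch of the eviction idea, not koishi-core's actual code:

```typescript
// Simplified LRU sketch: deleting and re-inserting a key on access moves it
// to the "most recent" end of the Map's insertion order.
class LruCache<V> {
  private data = new Map<string, V>()
  constructor(private capacity: number) {}

  get(key: string): V | undefined {
    if (!this.data.has(key)) return undefined
    const value = this.data.get(key)!
    this.data.delete(key)
    this.data.set(key, value)  // mark as most recently used
    return value
  }

  set(key: string, value: V) {
    this.data.delete(key)
    this.data.set(key, value)
    if (this.data.size > this.capacity) {
      // evict the least recently used key (first in insertion order)
      this.data.delete(this.data.keys().next().value!)
    }
  }
}
```

A database-backed plugin would expose the same `get`/`set` surface but persist entries, which is exactly why the Cache API is defined as an interface rather than a single implementation.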

purerosefallen commented 3 years ago

Something like koishi-plugin-redis?

shigma commented 3 years ago

Sure, but I currently have no plan to officially support Redis. The reasons are as follows.

We discussed Redis support before. To my knowledge, Redis can only be connected to over the network. If you run Koishi and its database on separate servers, Redis may help optimize performance. But if you don't, an in-memory cache will be much better.

In fact, many Koishi users currently run Koishi together with its database, so there may not be a great need for such a plugin. But I would be glad if you wrote one after the Cache API is implemented.

purerosefallen commented 3 years ago

Redis can run on the same server as Koishi itself. In addition, it would be much better if cached data could survive restarts. Also, servers with little memory could still run Koishi by moving the Redis instance to another machine.