redis / redis-om-node

Object mapping, and more, for Redis and Node.js. Written in TypeScript.

Unexpected error when using search #195

Closed: nullishamy closed this issue 1 year ago

nullishamy commented 1 year ago

The OM encounters an error when using the search functionality. The cause is unknown; it only started happening recently, when we deployed to prod, and we never encountered it during development. The problem occurs intermittently and does not seem to follow any pattern.

Driver:

export async function fetchMessagesByAuthor (authorId: string): Promise<CachedMessage[]> {
  return (await messageRepository.search()
    .where('authorId').equals(authorId)
    .return.all()) as CachedMessage[]
}

Schema:

export const messageSchema = new Schema('message', {
  messageId: { type: 'string' },
  authorId: { type: 'string' },
  channelId: { type: 'string' },
  content: { type: 'text' },
  hexHash: { type: 'string' }
})

export const messageRepository = new Repository(messageSchema, redis)
await messageRepository.createIndex()

Stack:

TypeError: Cannot read properties of null (reading 'length')
at documentValue (/app/node_modules/@redis/search/dist/commands/SEARCH.js:29:23)
at Object.transformReply (/app/node_modules/@redis/search/dist/commands/SEARCH.js:17:61)
at transformCommandReply (/app/node_modules/@redis/client/dist/lib/commander.js:89:20)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async Client.search (/app/node_modules/redis-om/dist/index.js:1681:14)
at async Search.callSearch (/app/node_modules/redis-om/dist/index.js:1398:23)
at async Search.page (/app/node_modules/redis-om/dist/index.js:1286:27)
at async fetchMessagesByAuthor (/app/lib/cache/op.js:22:13)
guyroyse commented 1 year ago

What version of Redis are you using? Both for development and for production? Also, what version of Node Redis is listed in your package.json?

I suspect that this is a garden-variety bug that happens when search returns no results. But if that suspicion is true, it could be in either Redis OM or Node Redis.
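If it helps, here is a minimal sketch for testing that hypothesis against the repository from the issue; the author ID is a placeholder for one that has no cached messages, and the function name is made up for illustration:

async function reproduceEmptyResult (): Promise<void> {
  try {
    // Searching for an author that was never cached should return an empty
    // array; if the no-results bug is real, this may instead throw the
    // TypeError from the stack trace above.
    const results = await messageRepository.search()
      .where('authorId').equals('author-with-no-messages')
      .return.all()
    console.log(`Got ${results.length} results, no error`)
  } catch (err) {
    console.error('Reproduced the search error:', err)
  }
}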

Also, I might be able to hop on a Zoom call this afternoon (Eastern Time) and help troubleshoot this.

nullishamy commented 1 year ago

package.json

    "redis": "^4.6.7",
    "redis-om": "^0.4.0-beta.3",

Development Redis (the version was not pinned in the compose file, so it just pulled the latest at the time):

root@cebcd0cf25bb:/#  /opt/redis-stack//bin/redis-server --version
Redis server v=6.2.12 sha=00000000:0 malloc=jemalloc-5.1.0 bits=64 build=4dbc2487343b0024

Production redis

root@f6301f93c7bb:/# /opt/redis-stack//bin/redis-server --version
Redis server v=6.2.13 sha=00000000:0 malloc=jemalloc-5.1.0 bits=64 build=9bc624588a181ec8
root@f6301f93c7bb:/# 

Development runs the redis/redis-stack image. Production runs redis/redis-stack-server.

I was unfortunately unavailable yesterday, as I did not see your comment, but I would not have been available anyway because I live in GMT. Feel free to ask for more information if you need it; hope this helps!

guyroyse commented 1 year ago

Okay. All those versions should work. I did release 0.4.0 as a non-beta, but it's just 0.4.0-beta.3 with a new name. ;) Lemme see if I can reproduce this.

guyroyse commented 1 year ago

Hey @leibale, do you think this could be a bug with Node Redis?

guyroyse commented 1 year ago

How much data are you querying against in prod? There are some gotchas in RediSearch when queries time out. Default timeout is 500ms, I believe.
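For reference, one way to inspect or loosen that timeout is the RediSearch FT.CONFIG command. This is a sketch, not something from the original thread, and it assumes the node-redis v4 client (redis) from the setup above:

async function checkSearchTimeoutConfig (): Promise<void> {
  // Inspect the current query timeout (RediSearch default is 500 ms).
  console.log(await redis.sendCommand(['FT.CONFIG', 'GET', 'TIMEOUT']))

  // Make timeouts fail loudly instead of returning partial results,
  // so a timed-out query surfaces as an explicit error.
  await redis.sendCommand(['FT.CONFIG', 'SET', 'ON_TIMEOUT', 'FAIL'])

  // Or disable the limit entirely while investigating (0 = no timeout).
  await redis.sendCommand(['FT.CONFIG', 'SET', 'TIMEOUT', '0'])
}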

nullishamy commented 1 year ago

> How much data are you querying against in prod? There are some gotchas in RediSearch when queries time out. Default timeout is 500ms, I believe.

Oops, never saw this! It really shouldn't be too much data; we only cache for about 3 minutes, so fewer than 5000 entries (with indexed searching, of course). It could be a timeout, but I doubt it. How would I go about verifying or disproving this theory?
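One way to gather evidence either way, sketched against the fetchMessagesByAuthor driver above (the wrapper name and log format are made up for illustration), is to time the call and log failures so they can be correlated with slow queries:

export async function fetchMessagesByAuthorTimed (authorId: string): Promise<CachedMessage[]> {
  const start = Date.now()
  try {
    return await fetchMessagesByAuthor(authorId)
  } catch (err) {
    // If failures cluster near or above the 500 ms default RediSearch
    // timeout, the timeout theory gains weight; if they also happen on
    // fast queries, it points back at the empty-result bug.
    console.error(`Search for ${authorId} failed after ${Date.now() - start} ms`, err)
    throw err
  }
}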

nullishamy commented 1 year ago

Hm. Since opening and discussing this, the issue seems to have subsided. Not entirely sure what caused it initially or what fixed it, but it seems to be resolved. Closing.