redis / lettucemod

Java client for Redis Modules
Apache License 2.0

RediSearch - Data expiry with SearchOptions.SortBy leading to NullPointerException #19

Closed. peio-burucoa closed this issue 2 years ago.

peio-burucoa commented 2 years ago

Hi there, we've been facing the following NullPointerException while using lettucemod 2.18.2 with RediSearch:

Exception in thread "main" java.lang.NullPointerException
    at com.redis.lettucemod.output.SearchOutput.complete(SearchOutput.java:95)
    at io.lettuce.core.protocol.RedisStateMachine.doDecode(RedisStateMachine.java:343)
    at io.lettuce.core.protocol.RedisStateMachine.decode(RedisStateMachine.java:295)
    at io.lettuce.core.protocol.CommandHandler.decode(CommandHandler.java:841)
    at io.lettuce.core.protocol.CommandHandler.decode0(CommandHandler.java:792)
    at io.lettuce.core.protocol.CommandHandler.decode(CommandHandler.java:775)
    at io.lettuce.core.protocol.CommandHandler.decode(CommandHandler.java:658)
    at io.lettuce.core.protocol.CommandHandler.channelRead(CommandHandler.java:598)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:829)

Here is a quick example to reproduce the issue (a Redis instance running on port 16379 is required):

import com.redis.lettucemod.RedisModulesClient;
import com.redis.lettucemod.api.StatefulRedisModulesConnection;
import com.redis.lettucemod.api.sync.RedisModulesCommands;
import com.redis.lettucemod.search.Field;
import com.redis.lettucemod.search.Order;
import com.redis.lettucemod.search.SearchOptions;
import com.redis.lettucemod.search.SearchResults;
import io.lettuce.core.RedisCommandExecutionException;
import io.lettuce.core.RedisURI;

import java.util.Map;

public class LettuceModExample {

    public static void main(String[] args) throws InterruptedException {
        RedisURI uri = RedisURI.create("localhost", 16379);
        RedisModulesClient client = RedisModulesClient.create(uri);

        // Save 100 beers expiring after 3 seconds, with a slight delay between save operations
        try (StatefulRedisModulesConnection<String, String> connection = client.connect()) {
            RedisModulesCommands<String, String> commands = connection.sync();
            try {
                commands.create("beers", Field.text("name").build());
            } catch (RedisCommandExecutionException e) {
                // in case index already exists
            }
            for (int i = 0; i < 100; i++) {
                commands.hmset("beer:" + i, Map.of("name", "Chouffe" + i));
                commands.expire("beer:" + i, 3);
                Thread.sleep(10);
            }
        }

        // Add sorting to our search options
        SearchOptions<String, String> searchOptions = SearchOptions.<String, String>builder()
                .sortBy(new SearchOptions.SortBy<>("name", Order.ASC))
                .build();

        // Search for the beer count every second for 10 seconds
        for (int i = 0; i < 10; i++) {
            try (StatefulRedisModulesConnection<String, String> redisConnection = client.connect()) {
                RedisModulesCommands<String, String> commands = redisConnection.sync();
                SearchResults<String, String> results = commands.search("beers", "chou*", searchOptions);
                System.out.println(results.getCount());
            }
            Thread.sleep(1000);
            Thread.sleep(1000);
        }

        client.shutdown(); // release client resources so the JVM can exit
    }
}

This throws a NullPointerException once the data starts to expire. Without the SortBy search option the example does NOT throw an NPE, and the same holds without the slight delay between save operations.

Brico87 commented 2 years ago

Hey! I'm working with @peio-burucoa and we just found out that we had an error in the configuration of the index: adding the SORTABLE keyword when creating the index by hand fixes the issue.
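For reference, creating the index by hand with the SORTABLE keyword looks something like this (a sketch assuming RediSearch 2.x and the same beers index and name field as the reproduction above):

```
FT.CREATE beers ON HASH PREFIX 1 beer: SCHEMA name TEXT SORTABLE
```

With the field marked SORTABLE, sorting with the SortBy search option no longer triggers the NPE.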

However, we had another issue with searching data which expires:

java.lang.UnsupportedOperationException: null
 at java.base/java.util.AbstractMap.put(AbstractMap.java:209)
 at io.lettuce.core.output.MapOutput.set(MapOutput.java:53)
 at com.redislabs.lettusearch.output.SearchOutput.set(SearchOutput.java:60)
 at io.lettuce.core.protocol.RedisStateMachine.safeSet(RedisStateMachine.java:810)
 at io.lettuce.core.protocol.RedisStateMachine.handleNull(RedisStateMachine.java:392)

The issue appears to be here => https://github.com/redis-developer/lettucemod/blob/master/subprojects/lettucemod/src/main/java/com/redis/lettucemod/output/SearchOutput.java#L72. There is no null check on bytes before it is put into the nested map, hence the error.

Could a fix be planned for that? Cheers!
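The guard described above amounts to skipping null buffers before they reach the document map. A minimal standalone sketch of that pattern (plain Java, no lettucemod types; the class and method names here are hypothetical, not the library's actual internals):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;

public class NullGuardSketch {

    // Stores a decoded field into the document map, skipping null buffers
    // (what the server returns for a document that expired mid-query).
    static void set(Map<String, String> fields, String key, ByteBuffer bytes) {
        if (bytes == null) {
            return; // without this check, the null propagates into Map.put
        }
        fields.put(key, StandardCharsets.UTF_8.decode(bytes).toString());
    }

    public static void main(String[] args) {
        Map<String, String> doc = new LinkedHashMap<>();
        set(doc, "name", ByteBuffer.wrap("Chouffe1".getBytes(StandardCharsets.UTF_8)));
        set(doc, "style", null); // expired field: skipped instead of throwing
        System.out.println(doc); // prints {name=Chouffe1}
    }
}
```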

jruaux commented 2 years ago

Thanks for reporting this issue. I added a fix and will issue a release shortly.