I'm using your redis library for my I/O operations with Redis. In one of my projects using an Akka Cluster, I need to share a Redis client with a pool of workers, as shown below. The basic idea of the pool is to hand jobs to the workers; on every dump, the first task is to fetch data from Redis and process it.
case class Master(redisClient: RedisClientMasterSlaves) extends Actor {
  val workerRouter =
    context.actorOf(FromConfig.props(Props(classOf[Worker], redisClient)), name = "workerRouter")
  // some code here
}
In the POC that I wrote, I discovered that every message and every field of Props must be serializable, but that's not the case for the redisClient that I use. One solution to my problem would be to use RedisClientActor and send commands to that actor, but then I would have only one actor fetching data from Redis, which defeats the initial idea of my Akka Cluster. So, do you have any advice on sharing a Redis client across several nodes of an Akka cluster?
It seems to me that it's bad practice to create a new Redis client in each worker: each client would create its own ActorSystem, which is not recommended when using Akka.
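For context, here is a minimal sketch of the pattern I'm considering as an alternative: instead of shipping the client inside Props across the cluster, each node builds a single client from its own ActorSystem and hands it only to workers it creates locally, so nothing non-serializable ever crosses the wire. This is an assumption-laden sketch (rediscala's RedisClientMasterSlaves/RedisServer constructors, local host/port values, and the Worker body are illustrative), not a working implementation:

    import akka.actor.{Actor, ActorSystem, Props}
    import redis.{RedisClientMasterSlaves, RedisServer}

    // Hypothetical worker: receives a key, fetches it from Redis, processes it.
    class Worker(client: RedisClientMasterSlaves) extends Actor {
      def receive = {
        case key: String =>
          // client.get returns a Future; actual processing elided.
          client.get[String](key)
      }
    }

    object NodeMain extends App {
      implicit val system: ActorSystem = ActorSystem("cluster")

      // One client per node, built on the node's existing ActorSystem,
      // so no extra ActorSystem is created per worker.
      val client = RedisClientMasterSlaves(
        master = RedisServer("localhost", 6379),
        slaves = Seq(RedisServer("localhost", 6380)) // placeholder addresses
      )

      // Workers are created locally on this node and share the one client;
      // the non-serializable client never leaves the JVM.
      val workers = (1 to 4).map { i =>
        system.actorOf(Props(classOf[Worker], client), s"worker-$i")
      }
    }

The drawback I see is that this trades the single cluster-wide router deployment for per-node wiring, which is why I'm asking whether there is a better-supported pattern.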