I wanted a lightweight, highly available and extensible cache solution on top of Redis, but nutcracker (twemproxy) is too heavy for a system with only 2 or 3 application servers and 2 or 3 Redis instances, and I could not find a good enough pure-Python implementation of consistent hashing, so I wrote this. I have used it in two projects and they have been running well for the past half year, so I am sharing it for anyone with the same requirement.
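For readers unfamiliar with the idea, below is a minimal, self-contained sketch of a consistent-hash ring in pure Python. It only illustrates the technique; the node names, virtual-node count and ring layout are made up for the example and are not the actual internals of ring_redis.

from zlib import crc32
import bisect

class HashRing(object):
    def __init__(self, nodes, replicas=16):
        # Place every node on the ring several times ("virtual nodes") so keys
        # spread evenly and only a small fraction of them move when a node dies.
        self._ring = sorted(
            (crc32(('%s#%d' % (node, i)).encode()) & 0xffffffff, node)
            for node in nodes for i in range(replicas)
        )
        self._points = [point for point, _ in self._ring]

    def node_for(self, key):
        # Hash the key onto the ring and walk clockwise to the first node point.
        h = crc32(key.encode()) & 0xffffffff
        idx = bisect.bisect(self._points, h) % len(self._ring)
        return self._ring[idx][1]

ring = HashRing(['node0', 'node1'])
print(ring.node_for('a'))   # stable for a given key, e.g. 'node0'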
redis_dict(redis_confs, prefix='', key=str, expire=None, on_fail=None, on_node_ejected=None, on_node_rediscovered=None, retry_ratio=1e-2, hash_function=crc32)
: construct a redis_dict instance, which can be used as a normal python dict

some_redis_dict_instance.visit_redis(cmd, k, args)
: visit lower-level redis APIs

some_redis_dict_instance.get_entry(k)
: return the real redis entry of k

some_redis_dict_instance.alive_hash(redis_entry)
: return the node name for redis_entry via alive_hash

some_redis_dict_instance.total_hash(redis_entry)
: return the node name for redis_entry via total_hash

len(some_redis_dict_instance.alive_hash)
: return the number of alive nodes

Install with pip:

pip install ring_redis

or install from source:

cd path/to/ring_redis
python setup.py install
################### your redis configuration #####################
REDIS_CONF = {
    'group0' : {
        'node0': {
            'capacity': 50 * 1024 ** 2,
            'connection': {
                'host' : '192.168.230.45',
                'port' : 15061,
                'db': 0,
                'socket_timeout': 5e-3,
            },
        },
        'node1': {
            'capacity': 50 * 1024 ** 2,
            'connection': {
                'host' : '192.168.230.46',
                'port' : 15061,
                'db': 0,
                'socket_timeout': 5e-3,
            },
        },
    },
}
############################ usage ##############################
from ring_redis import redis_dict
test = redis_dict(REDIS_CONF['group0'], prefix='test.', expire=20)
test['a'] = 'abc'
print("test['a'] : %s" % (test['a']))
print("len(test) : %s" % (len(test)))
print("test.keys() : %s" % (test.keys()[:100]))
print("'a' in test? : %s" % ('a' in test))
print("'b' in test? : %s" % ('b' in test))
print("test.visit_redis('incr', 'x', 1) : %s" % (test.visit_redis('incr', 'x', 1)))
print("test.get_entry('x') : %s" % (test.get_entry('x')))
print("test.total_hash(test.get_entry('x')) : %s" % (test.total_hash(test.get_entry('x'))))
print("test.alive_hash(test.get_entry('x')) : %s" % (test.alive_hash(test.get_entry('x'))))
The real Redis entry of a dict key is built as prefix + key(dict_key) == redis_entry, i.e. the key function is applied to the dict key and the prefix is prepended.
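Continuing the example above (prefix='test.' and the default key=str), the rule works out as follows; the commented values are what the formula implies, shown here only for illustration.

print(test.get_entry('a'))   # 'test.a'  == 'test.' + str('a')
print(test.get_entry(42))    # 'test.42' == 'test.' + str(42), assuming non-string keys go through key=str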
When the cluster becomes unavailable (no node in the group can be reached), RedisClusterUnavailable will be raised; you should pass on_fail as an argument to the redis_dict constructor, or catch this exception, to handle that situation yourself.
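A sketch of both options is below. The import path of the exception and the on_fail callback signature are assumptions made for the example (they are not documented in this section); adjust them to what your installed version actually exposes.

import logging
from ring_redis import redis_dict, RedisClusterUnavailable   # import path assumed

# Option 1: catch the exception where the cache is used.
cache = redis_dict(REDIS_CONF['group0'], prefix='test.', expire=20)
try:
    value = cache['a']
except RedisClusterUnavailable:
    value = None   # fall back to the primary data store, a local cache, etc.

# Option 2: pass an on_fail callback to the constructor.
# The callback signature (a single exception argument) is a guess for illustration.
def warn_on_fail(exc):
    logging.warning('redis cluster unavailable: %s', exc)

cache = redis_dict(REDIS_CONF['group0'], prefix='test.', expire=20, on_fail=warn_on_fail)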