**jace** opened this issue 7 years ago · Status: Open
Since point 4 is unclear, it'll work like this in an app:

```python
def before_request():
    g.redis_pipe = RedisPipe()  # This creates the pipe

def view():
    with g.redis_pipe(callback=foo) as pipe:
        pipe.operation()

def after_request():
    g.redis_pipe.commit()
HasGeek apps now make extensive use of Redis, calling it multiple times in each request. Redis is much faster if we call it just once, pipelining all the commands. We use this in Hasjob when retrieving viewcounts, which are stored as hundreds of individual keys (so that each of them expires automatically).
Coaster should provide a pipeline framework that allows multiple sources to pipeline a request and fire them at once, during request construction (usually retrieval) and teardown (usually storage).
The construction phase isn't well defined, as our code doesn't operate on callbacks, but we can assume a callback model for teardown.
Coaster offers a pipe proxy that can be used as in the snippet at the top of this issue.
The pipe proxy (a) passes the calls through to the actual pipe object, and (b) tracks the calls so it knows which result belongs to which caller.
When the pipe is finally executed, results are split into batches and sent to each callback function.
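The proxy behaviour described above could be sketched roughly as follows. This is a hypothetical illustration, not Coaster's actual code: the `RedisPipe` and `commit()` names come from the snippet in this issue, while the index-based batching of results is an assumption about how "which result belongs to which caller" might be tracked.

```python
# Hypothetical sketch; RedisPipe/commit() are from the snippet above,
# the batching scheme is an assumption.

class RedisPipe:
    """Proxy that queues Redis pipeline calls from multiple callers and
    routes each caller's slice of the results to its callback."""

    def __init__(self, redis_client=None):
        self.redis = redis_client
        self.calls = []    # (method_name, args, kwargs), in queue order
        self.batches = []  # (callback, start_index, end_index)

    def __call__(self, callback):
        # Returns a context manager that records which of the queued
        # calls belong to this callback
        return _PipeContext(self, callback)

    def commit(self):
        # Execute all queued calls in a single pipeline round trip,
        # then split the flat result list into per-callback batches
        pipe = self.redis.pipeline()
        for name, args, kwargs in self.calls:
            getattr(pipe, name)(*args, **kwargs)
        results = pipe.execute()
        for callback, start, end in self.batches:
            callback(results[start:end])


class _PipeContext:
    def __init__(self, parent, callback):
        self.parent = parent
        self.callback = callback

    def __enter__(self):
        self.start = len(self.parent.calls)
        return self

    def __exit__(self, *exc):
        # Remember the span of queued calls made inside this block
        self.parent.batches.append(
            (self.callback, self.start, len(self.parent.calls))
        )

    def __getattr__(self, name):
        # Record the call instead of executing it immediately
        def record(*args, **kwargs):
            self.parent.calls.append((name, args, kwargs))
        return record
```

Each `with` block records the span of queued calls belonging to its callback; `commit()` then fires one pipeline round trip and slices the flat result list accordingly.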
Coaster offers the framework but not the app-level frame: the part that creates the stacks and handles execution. That's up to the app itself or a helper library like Baseframe.