jruizgit / rules

Durable Rules Engine
MIT License

loading facts from external database #154

Open lteacy opened 6 years ago

lteacy commented 6 years ago

I'm interested in building a solution in which facts are persisted to an external database, such as MongoDB, so that only relevant data for currently active customers needs to be held in Redis at any one time.

When an event occurs, e.g. with respect to an individual customer, the relevant facts would first need to be retrieved from the external data store; those facts would then determine which rules are triggered by the event. After processing, new facts may be persisted to the data store and offloaded from Redis.

Just wondering if you have any thoughts on how best to achieve this? It would be great if this were transparent to the application, in which case Redis would effectively be caching and retrieving facts from the external database as and when required, much as it would in other database caching scenarios.
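
For what it's worth, a minimal sketch of the read-through/write-back cycle I have in mind, using the durable_rules Python functions (assert_fact, post, retract_fact) and pymongo. The ruleset name 'customer_rules', the 'crm.facts' collection, and the handle_customer_event helper are all made up for illustration:

from durable.lang import assert_fact, post, retract_fact
from pymongo import MongoClient

facts = MongoClient()['crm']['facts']  # hypothetical database/collection

def handle_customer_event(event):
    # Read-through: load this customer's facts from MongoDB and assert
    # them so the engine has the relevant state before the event lands.
    customer_id = event['customer']
    loaded = list(facts.find({'customer': customer_id}, {'_id': 0}))
    for fact in loaded:
        assert_fact('customer_rules', fact)

    # Evaluate the event against the loaded facts.
    post('customer_rules', event)

    # Write-back/offload: persist each fact and retract it so Redis only
    # holds state for currently active customers. (Facts produced by the
    # rules themselves would need the same treatment.)
    for fact in loaded:
        facts.replace_one({'customer': customer_id, 't': fact['t']}, fact, upsert=True)
        retract_fact('customer_rules', fact)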

jruizgit commented 6 years ago

Hi, thanks for posting the question. First thought, without knowing all the details of your App: you might need to write a daemon that reads facts from the MongoDB data store and asserts them in durable_rules. The complexity of the solution would depend on the schema of your MongoDB app. That is, if you store all facts in a collection with an indexed auto-increment field, you can easily query the facts that have not yet been asserted. durable_rules provides an API to assert facts and post events efficiently (without having to go through an HTTP API):

# Assuming create_queue is exported by durable.lang in the Python package:
from durable.lang import *

# Bind a queue to the 'fraud_detection' ruleset.
q = create_queue('fraud_detection')

# Post transient events...
q.post({'t': 'deposit'})
q.post({'t': 'withdrawal'})
q.post({'t': 'chargeback'})
# ...and assert a durable fact.
q.assert_fact({'t': 'balance'})

q.post({'t': 'deposit'})
q.post({'t': 'withdrawal'})
q.post({'t': 'chargeback'})
# Retract the fact when it no longer applies and release the queue.
q.retract_fact({'t': 'balance'})
q.close()
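
A minimal sketch of such a daemon, assuming a pymongo collection whose documents carry an indexed auto-increment field. The 'seq' field name, the 'app.facts' collection, and the polling interval are assumptions for illustration, not part of durable_rules:

import time
from durable.lang import *
from pymongo import MongoClient

facts = MongoClient()['app']['facts']  # hypothetical database/collection

def run_daemon(poll_seconds=1.0):
    q = create_queue('fraud_detection')
    last_seq = 0  # watermark: highest auto-increment value asserted so far
    try:
        while True:
            # Query only the facts that have not been asserted yet, using
            # the indexed auto-increment field as the watermark.
            for doc in facts.find({'seq': {'$gt': last_seq}}).sort('seq', 1):
                last_seq = doc['seq']
                doc.pop('_id', None)  # drop MongoDB's ObjectId before asserting
                q.assert_fact(doc)
            time.sleep(poll_seconds)
    finally:
        q.close()

A production daemon would also persist the watermark (e.g. back into MongoDB) so that a restart does not re-assert facts it has already processed.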