Open miraculixx opened 9 years ago
My idea about exporting TableRules would be more like this:
```python
class <Slug>TableRule(TableRule):
    def __init__(self):
        super(<Slug>TableRule, self).__init__(rules=<spec>)
```
where `<spec>` would be evaluated at export time as `spec = TableRule.from_yaml(yaml_spec).rules`. Actually, YAML loading would happen behind the scenes if we use the Rulestore API.
Of course we could store the spec data at class or even module level to keep things prettier, so the definition import would happen once, not every time we create an instance or class (if we cache it).
Moreover, we don't need to differentiate module rules and table rules at import time if we stick to this approach - we would just be saving the table rule results as a separate module in the egg's rules package. So lots of things would become simpler.
Regarding the "pythonic approach" description, I think this begs for a Rulestore backend that would work on a python module of the structure we defined (with data in rulestore.py and rules/*.py modules). So effectively this would mean we could export data from any rulestore backend to an egg-based backend. I'm not sure if this is exactly what we want or whether it covers all of the use cases, but I guess it's worth considering such an approach.
thanks. only just noticed your comment. not sure we have already discussed this sufficiently in our recent chat:
in general
what code would you generate specifically, other than the TableRule as per the specs? I don't like the parsing rules => code generation approach (if I understand correctly #1 & #2) instead of including the JSON in the table rules: it is a source of failure and means every new feature in json rules needs to be implemented twice, once in parsing, once in generating.
(06:16:16 PM) antisvin: hm, I was going to use the existing parsing of json/yaml table rule definitions, so that there would be a minimum of new code
(06:16:31 PM) antisvin: actually - can you explain what Ruleset should actually do, how it's used?
(06:17:16 PM) antisvin: because it could be that I didn't clearly understand that. I was thinking about exporting separate rules mostly.
on intended use cases (of pyrules):
(06:19:53 PM) miraculixx: use case 1: user writes rules and uploads/submits them to the "admin" backend. rules can be either tablerules (entered in admin) or module rules submitted as egg files. basically that's what we have now.
(06:37:07 PM) miraculixx: use case 2: at a specific time, create a rule set export. the idea is to be able to deploy this export into a separate deployment of pyrules that only executes the rule sets. this can be a standalone deployment, or it can be an "embedded" pyrules in a larger application; technically that's the same thing.
(06:37:54 PM) miraculixx: use case 3: manage these exports as versioned rulesets for logging/auditing purposes. of course that's trivial if we have 2.
(06:38:42 PM) miraculixx: maybe it helps to think of #1 as the development system, and #2 as the production.
(06:39:46 PM) antisvin: so do you think that the exported egg should provide just 1 rule or multiple?
(06:41:15 PM) miraculixx: I guess the absolute simplest approach to #2 is an export of the respective parts of the pyrules models plus the module rules, and then a reimport for execution in the 2nd system.
on egg files:
(06:41:59 PM) miraculixx: one egg = export of all rules of one or multiple rulesets
(06:45:46 PM) miraculixx: maybe this helps - for module rules, this is somewhat similar to the way scrapy builds eggs for spider modules => scrapyd; tablerules are somewhat similar to fixtures in testing. only for pyrules it should all be packaged in one egg file.
(06:45:54 PM) miraculixx: does that make sense?
(06:50:51 PM) antisvin: actually, I was thinking about the way scrapy works and tried to describe it above. scrapy has one spider (== rule) per egg and can export it separately. an egg contains module code as is, it just gets the necessary metadata for packaging as an egg. and in the case of pyrules, we could generate some rules dynamically, based on the definition provided.
(06:52:38 PM) antisvin: anyway, I need some more time to think about it, maybe it will give me a better understanding of how to put all the pieces together.
(06:56:30 PM) miraculixx: yeah, I guess the key difference to scrapy is that for pyrules it is one egg per the selected rulesets. basically it is the same as if I would package the sample app in an egg, then install it in another system, only I want to select which rulesets and, by reference, which rules get included.
Update: user names only
Ultimate goal is to be able to store, version and deploy rulesets between development, test and production environments. This story should implement the basics for this goal -- as soon as we can serialise/deserialise rules and rulesets, the other elements become trivial.
Expected behavior
Tasks
Implementation notes
import
a. Module rules
A module rule is simply a python class in the app's `rules` module/package. Hence importing (e.g. in a celery instance) means to find all `Rule` instances.

b. TableRulesets

Basically the same as with module rules, assuming that each tablerule has been properly specified as shown below, and that tablerules.py is imported at startup.
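As a sketch of that import step, discovering all rule classes in a rules package could look roughly like this. The `find_rules` name is illustrative, and the base class is passed in as a parameter rather than imported from pyrules, so this is a pattern sketch, not pyrules' actual API.

```python
import importlib
import inspect
import pkgutil

def find_rules(package_name, base_class):
    """Import every submodule of the given rules package and collect
    all subclasses of base_class (e.g. pyrules' Rule) defined there."""
    package = importlib.import_module(package_name)
    found = []
    for modinfo in pkgutil.iter_modules(package.__path__):
        module = importlib.import_module(
            '%s.%s' % (package_name, modinfo.name))
        for _, obj in inspect.getmembers(module, inspect.isclass):
            # skip the base class itself, which submodules import
            if issubclass(obj, base_class) and obj is not base_class:
                found.append(obj)
    return found
```

Because generated tablerule classes and hand-written module rules are both just `Rule` subclasses in the package, this single scan covers both cases, which is the simplification mentioned above.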
export
Rules are either python modules or yaml specifications. Hence exporting a ruleset and building an egg from it involves these steps:
pythonic approach

- a `rules` module / name space package
- a `version` file

I prefer the python approach. It may be helpful to code a full example egg by hand before automating it from the django models.
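Hand-coding such an example egg could start from a source tree like the one built below. The specific file names under rules/ are illustrative guesses at the structure described here, not a fixed convention.

```python
import os

# hypothetical source layout for a hand-built ruleset egg
LAYOUT = {
    'setup.py': '# packaging script\n',
    'version': '0.1.0\n',
    'rules/__init__.py': '',
    'rules/tablerules.py': '# generated TableRule classes go here\n',
    'rules/mymodulerules.py': '# module rules, copied as-is\n',
}

def build_skeleton(root):
    """Write the skeleton files under root, creating directories as needed."""
    for relpath, content in LAYOUT.items():
        full = os.path.join(root, *relpath.split('/'))
        subdir = os.path.dirname(full)
        if subdir and not os.path.isdir(subdir):
            os.makedirs(subdir)
        with open(full, 'w') as f:
            f.write(content)
```

Once the hand-built version works end to end, automating it from the django models is mostly a matter of filling in the file contents instead of the placeholders.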
For completeness, this would be the non-pythonic approach. Yikes...
non python approach
The setup.py should have a `version` file with a version number that is incremented automatically on each build.
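One way to implement that is a small helper that setup.py calls on each build. The `bump_version` name and the choice to increment the patch level are assumptions for illustration, not an existing pyrules utility.

```python
import os

def bump_version(path='version'):
    """Read 'major.minor.patch' from the version file, increment the
    patch level, write it back, and return the new version string."""
    if os.path.exists(path):
        with open(path) as f:
            major, minor, patch = (int(p) for p in f.read().strip().split('.'))
    else:
        major, minor, patch = 0, 1, 0  # default starting version
    version = '%d.%d.%d' % (major, minor, patch + 1)
    with open(path, 'w') as f:
        f.write(version + '\n')
    return version

# setup.py could then use it along the lines of:
# setup(name='myrules-export', version=bump_version(), ...)
```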