espeed / bulbs

A Python persistence framework for graph databases like Neo4j, OrientDB and Titan.
http://bulbflow.org

Proxy and dynamic models #63

Open etabard opened 12 years ago

etabard commented 12 years ago

My models are described and stored in Redis. It allows me to have a versatile backend and change my models on the fly.

To do so I extend Node on the fly, define my properties and return a proxy based on this temporary model. I do this only when somebody creates or updates a node.

The problem is that each time I create a proxy, the proxy builder makes a get_or_create call on the index. Is there a way to prevent this behavior if the index has already been ensured? MongoDB drivers do this with ensureIndex: the first time it sends a request, and after that only one request every X minutes.

Second question: is it safe to create proxies on the fly (I use graph.build_proxy)? Do you keep references somewhere? I don't want to mess with garbage collection...
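Roughly, the flow looks like this -- a minimal sketch only, with a Neo4j backend assumed; build_dynamic_model, TYPE_MAP, and the inline description are illustrative names, not anything Bulbs provides:

from bulbs.model import Node
from bulbs.property import String
from bulbs.neo4jserver import Graph

# Map the description's property types to Bulbs Property classes.
TYPE_MAP = {"string": String}

def build_dynamic_model(description):
    # Build a temporary Node subclass from the JSON description stored in Redis.
    attrs = {"element_type": description["type"].lower()}
    for prop in description["properties"]:
        property_class = TYPE_MAP[prop["type"]]
        attrs[prop["key"]] = property_class(nullable=not prop["mandatory"])
    # Go through Node's metaclass so the Property attributes are registered.
    node_meta = type(Node)
    return node_meta(str(description["type"]), (Node,), attrs)

description = {
    "type": "User",
    "properties": [
        {"key": "username", "type": "string", "mandatory": True},
    ],
}

g = Graph()
User = build_dynamic_model(description)
users = g.build_proxy(User)   # this call re-runs the get_or_create on the index
user = users.create(username="etabard")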

If you have a better suggestion, please let me know :)

espeed commented 12 years ago

What do you mean by your models are described and stored in Redis? -- Are you pickling the models and storing them in Redis?

By default proxies are stored on the Graph object. You can store proxy objects in the Registry, but I don't think Bulbs is doing that anywhere.

You can store a reference to the index in the Registry and create a custom build_proxy() method that uses the index from the registry rather than building it again. Or, maybe even easier, store a reference to the proxy object in the registry.
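For example, something like this rough sketch (CachedGraph and get_or_create_proxy are made-up names, a Neo4j backend is assumed, and the Registry.add_proxy() signature is assumed):

from bulbs.neo4jserver import Graph

class CachedGraph(Graph):

    def __init__(self, config=None):
        super(CachedGraph, self).__init__(config)
        self._proxy_cache = {}

    def get_or_create_proxy(self, element_class):
        # Only call build_proxy() -- and thus the index get_or_create --
        # the first time a given element_type is seen.
        key = element_class.element_type
        if key not in self._proxy_cache:
            proxy = self.build_proxy(element_class)
            self.client.registry.add_proxy(key, proxy)  # keep the Registry in sync
            self._proxy_cache[key] = proxy
        return self._proxy_cache[key]

Then call g.get_or_create_proxy(MyModel) wherever you currently call g.build_proxy(MyModel).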

espeed commented 12 years ago

And update the proxy's element_class var to use your updated model.

etabard commented 12 years ago

I meant I store the model description as JSON. Something like this:

{
   "id" : "",
   "type" : "User",
   "properties" : [
      {
         "key" : "username",
         "default" : "",
         "type" : "string",
         "mandatory" : true,
         "validator" : "regexp"
      },
      ...
   ]
}

I'm not pickling it, but I could do that as a separate caching mechanism. The point here is:

espeed commented 12 years ago

Regarding the Python worker delivering multiple applications, Bulbs was designed to make it easy to use with multiple backend graphs at the same time -- each Graph object is instantiated once and contains everything it needs (all the model proxies, etc.).

In your app, create a model.py file where you define your models and your custom Graph object:

See https://github.com/espeed/lightbulb/blob/master/lightbulb/model.py

And then create a bulbsconf.py file where you instantiate your Graph object. Have your modules import the graph object from there so that you don't have to recreate it each time:

See https://github.com/espeed/lightbulb/blob/master/lightbulb/bulbsconf.py

You can obviously have multiple graph objects to serve multiple applications.
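Something along these lines -- a simplified sketch of that pattern, with the User model and the "users" proxy name as examples:

# model.py -- define your models and a custom Graph that builds their proxies once
from bulbs.model import Node
from bulbs.property import String
from bulbs.neo4jserver import Graph as Neo4jGraph

class User(Node):
    element_type = "user"
    username = String(nullable=False)

class Graph(Neo4jGraph):
    def __init__(self, config=None):
        super(Graph, self).__init__(config)
        self.users = self.build_proxy(User)

# bulbsconf.py -- instantiate the Graph object once
from model import Graph
graph = Graph()

# In any other module, import the shared instance:
# from bulbsconf import graph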

As for updating your Models, I would just update the element_class var on the proxy if the model changes.

See https://github.com/espeed/bulbs/blob/master/bulbs/element.py#L544

EDIT: Make sure you update the class Registry too -- this is what Bulbs uses to initialize_elements()

See https://github.com/espeed/bulbs/blob/master/bulbs/registry.py#L27

The registry object is stored on the client object.
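For example -- a sketch only, where NewUser stands in for whatever class you rebuild from your stored description, graph.users is an example proxy, and the add_class() call and its signature are assumed (check registry.py at the link above):

NewUser = build_dynamic_model(new_description)   # e.g. the helper sketched earlier

graph.users.element_class = NewUser        # point the proxy at the updated class
graph.client.registry.add_class(NewUser)   # re-register it so initialize_elements()
                                           # resolves elements to the new class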

espeed commented 11 years ago

For Bulbs 0.4 I will add an option to disable the index check.