Closed by quard8 12 years ago
Adding a caching layer into the core of Shanty isn't really a sensible way to go. It would be a lot of work to do in a manageable, abstracted way, and it would distract from the core purpose of the library.
There are lots of caching options around but you might want to look at the Zend Framework caching options as a starting point.
See this for example:
http://www.joeyrivera.com/2009/caching-using-phpzend_cache-and-mysql/
I have a Shanty-Mongo cache adapter in my fork if you want a quick caching layer.
Actually, I need to cache MongoDB requests, not use MongoDB as a backend for the cache :)
I made a custom TwoLevels class (I can email it to you if you'd like) that caches the results of my complex queries in both Mongo and memcache to speed things up. If something isn't found in memcache, it gets re-primed from the Mongo cache, and since I'm always doing key-based requests, my queries are really fast.
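The two-level lookup described above can be sketched roughly like this. This is a minimal illustration, not the actual TwoLevels class; the function name and the shape of the Mongo cache documents (`_id` as key, `value` as payload) are assumptions:

```php
<?php
// Illustrative two-level read: memcache is the fast layer, a Mongo
// collection is the durable backing cache.
function twoLevelGet($key, Memcache $memcache, MongoCollection $cacheCollection)
{
    // 1. Try the fast layer first.
    $value = $memcache->get($key);
    if ($value !== false) {
        return $value;
    }

    // 2. Fall back to the Mongo cache and re-prime memcache on a hit.
    $doc = $cacheCollection->findOne(array('_id' => $key));
    if ($doc !== null) {
        $memcache->set($key, $doc['value']);
        return $doc['value'];
    }

    // Miss in both layers: the caller runs the expensive query and
    // writes the result into both caches.
    return null;
}
```

Because every lookup is purely key-based, both layers answer in O(1)-style point reads rather than re-running the original complex query.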
Oh, this could be useful for me; please send it to quard8 at gmail dot com. Thanks!
I'm thinking about a caching layer that should be simple and not break the current behaviour of constructing queries.
The ideal solution would be to add a cache() function to the query chain.
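Something along these lines, where the method name, TTL parameter, and model class are all illustrative rather than existing Shanty-Mongo API:

```php
<?php
// Hypothetical chained usage: cache() marks the query's results as
// cacheable (here for 300 seconds) without changing how the query
// itself is built.
$users = User::all(array('status' => 'active'))->cache(300);

foreach ($users as $user) {
    echo $user->name, "\n";
}
```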
The problem is that the all() function returns a MongoCursor object, and we can't handle sort() and limit(), because the cached data would become invalid.
Currently I can't imagine how to implement this. Maybe add some abstraction layer in front of MongoCursor, or take a different approach.
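One way the abstraction-layer idea could look: a wrapper that mirrors the cursor API but only records sort() and limit(), then folds them into the cache key when results are actually fetched, so each distinct query/sort/limit combination is cached separately and stays valid. This is a sketch under those assumptions; the class name is hypothetical and the cache is assumed to expose Zend_Cache-style load()/save():

```php
<?php
// Illustrative wrapper around a MongoCursor that defers sort/limit
// and makes them part of the cache key.
class Shanty_Mongo_CachedCursor
{
    private $_cursor;           // underlying MongoCursor
    private $_cache;            // e.g. a Zend_Cache_Core frontend
    private $_query;
    private $_sort  = array();
    private $_limit = null;

    public function __construct(MongoCursor $cursor, $cache, array $query)
    {
        $this->_cursor = $cursor;
        $this->_cache  = $cache;
        $this->_query  = $query;
    }

    // Remember the modifiers instead of applying them immediately.
    public function sort(array $fields)
    {
        $this->_sort = $fields;
        return $this;
    }

    public function limit($num)
    {
        $this->_limit = $num;
        return $this;
    }

    // Only here do we touch the cache; the key covers query + modifiers,
    // so differently sorted/limited results never collide.
    public function toArray()
    {
        $key = md5(serialize(array($this->_query, $this->_sort, $this->_limit)));

        $result = $this->_cache->load($key);
        if ($result === false) {
            $cursor = $this->_cursor->sort($this->_sort);
            if ($this->_limit !== null) {
                $cursor = $cursor->limit($this->_limit);
            }
            $result = iterator_to_array($cursor);
            $this->_cache->save($result, $key);
        }

        return $result;
    }
}
```

The trade-off is that the wrapper must materialise results as an array (losing lazy cursor iteration), which is what makes the cached data well-defined for a given sort/limit.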
Any suggestions?