Closed: einhirn closed this issue 13 years ago
Might be worth a try, but I'm not sure SQLite can cope with the large amount of data...
So your plugin doesn't need any database/SQL features that only a full-blown DBMS has? About the amount of data: is this an immediate problem, or is it because you collect access data over years?
http://www.sqlite.org/whentouse.html says: ====8<====8<====8<====8<====
With the default page size of 1024 bytes, an SQLite database is limited in size to 2 terabytes (2^41 bytes).
====8<====8<====8<====8<====
I think it will take a long time to collect terabyte-sized statistics, right?
My application would be an extension of the tagcloud plugin, so I can produce a cloud of the most popular pages in my wiki. I think I'd try to consolidate the statistics at a regular interval, because I don't need to know who came to which wiki page from where; I just want to know how popular each wiki page is. Or is this plugin overkill for such an application?
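For illustration, such a consolidation step could look roughly like the following with plain SQLite via PDO. The `access` and `page_hits` tables are made-up names for this sketch, not the plugin's actual schema:

```php
<?php
// Sketch of periodic consolidation: roll the raw per-access rows up into
// per-page totals, then drop the detail rows. Assumes a hypothetical
// access(page, ...) table and page_hits(page PRIMARY KEY, hits) table.
$db = new PDO('sqlite:' . __DIR__ . '/stats.sqlite3');
$db->beginTransaction();

// Add this interval's counts to the running total of each page.
$db->exec(
    'INSERT OR REPLACE INTO page_hits (page, hits)
     SELECT a.page,
            COUNT(*) + COALESCE(
                (SELECT hits FROM page_hits p WHERE p.page = a.page), 0)
     FROM access a
     GROUP BY a.page'
);

// The detailed rows are no longer needed once they are rolled up.
$db->exec('DELETE FROM access');
$db->commit();
```

Run from a cron job or a DokuWiki action hook, this keeps the database at one row per page regardless of how long statistics are collected.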
Now, the theoretical limit might be a 2 TB file size, but the question is how fast lookups are with a large amount of data.
For your aim, the statistics plugin seems to be a bit much. Why not simply count accesses instead of saving each access separately? Then you'd only have as many entries as you have pages.
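A minimal sketch of that counting approach, again with a made-up `page_hits` table and plain PDO rather than the plugin's own code:

```php
<?php
// One row per page; each page view just increments a counter.
$db = new PDO('sqlite:' . __DIR__ . '/stats.sqlite3');
$db->exec('CREATE TABLE IF NOT EXISTS page_hits (
               page TEXT PRIMARY KEY,
               hits INTEGER NOT NULL DEFAULT 0
           )');

function count_hit(PDO $db, $page) {
    // Two statements keep this compatible with older SQLite versions
    // that have no UPSERT syntax.
    $db->prepare('INSERT OR IGNORE INTO page_hits (page, hits) VALUES (?, 0)')
       ->execute([$page]);
    $db->prepare('UPDATE page_hits SET hits = hits + 1 WHERE page = ?')
       ->execute([$page]);
}

// The tag cloud only needs the totals:
$popular = $db->query('SELECT page, hits FROM page_hits
                       ORDER BY hits DESC LIMIT 50')
              ->fetchAll(PDO::FETCH_ASSOC);
```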
Is it in any way feasible to change this plugin to use the sqlite helper?
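For reference, talking to the sqlite helper from another DokuWiki plugin usually looks roughly like this; the database name, the db/ update directory, and the SQL below are placeholders, not this plugin's actual schema:

```php
<?php
// Rough sketch, assuming the stock DokuWiki sqlite helper plugin is installed.
$sqlite = plugin_load('helper', 'sqlite');
if (!$sqlite) {
    msg('This plugin requires the sqlite plugin. Please install it.', -1);
    return;
}

// init() opens (or creates) the named database and applies the
// updateXXXX.sql files found in the given directory to set up the schema.
if (!$sqlite->init('stats', DOKU_PLUGIN . 'statistics/db/')) {
    return;
}

// Placeholders (?) are filled from the extra arguments to query().
$res  = $sqlite->query('SELECT page, hits FROM page_hits
                        ORDER BY hits DESC LIMIT ?', 50);
$rows = $sqlite->res2arr($res);
```

The bigger job would be porting the plugin's existing MySQL schema and queries to SQLite's dialect, not the helper calls themselves.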