Thanks for this service. I'd like to run a sprunge-like pastebin locally, without using cloud services. To that end I'm thinking of:
1) Making the import of the Google App Engine modules conditional, and adding an intermediate common storage API module.
2) Storing pastes on the local filesystem, in a directory hierarchy.
3) Auto-expiring older pastes.
A cron+find over many directories and files is slow and expensive, so I'd like to amortize that cost. One approach would be to derive the paste hashes from a salt plus an auto-increment number, then store in a database the oldest number, the newest number, and the total size of all pastes.
A config option would set the maximum number of bytes (or maximum number of pastes). Whenever a new paste is created, the oldest entries would be deleted until the limits are satisfied again.
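The expiry scheme above could look roughly like this. This is only a sketch of the idea: the salt, the byte limit, and the in-memory `index`/`store` dicts (standing in for the database and the filesystem) are all illustrative:

```python
"""Amortized expiry: paste IDs are salted hashes of an auto-increment
counter, so the oldest paste can be located without scanning directories."""

import hashlib

SALT = b"example-salt"      # assumption: any per-site secret would do
MAX_TOTAL_BYTES = 100       # config limit (tiny here, for illustration)

index = {"oldest": 0, "newest": 0, "total": 0}  # stands in for a small DB
store = {}                                      # stands in for the filesystem


def key_for(n):
    # salt + counter -> public paste ID; the counter itself stays private
    return hashlib.sha1(SALT + str(n).encode()).hexdigest()[:8]


def add_paste(data):
    index["newest"] += 1
    store[key_for(index["newest"])] = data
    index["total"] += len(data)
    # amortized expiry: drop oldest pastes until we fit the budget again
    while index["total"] > MAX_TOTAL_BYTES and index["oldest"] < index["newest"]:
        index["oldest"] += 1
        old = store.pop(key_for(index["oldest"]), b"")
        index["total"] -= len(old)
    return key_for(index["newest"])
```

Because IDs are a function of the counter, expiry never has to walk the directory tree; it just advances the "oldest" pointer and deletes the corresponding files.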
Does this make sense to you? Would you accept patches, or are the changes big enough that complicating the codebase isn't worth it?