mrdoob / glsl-sandbox

Shader editor and gallery.
https://glslsandbox.com/
MIT License

Use cloudinary to store images #17

jfontan closed this issue 10 years ago

jfontan commented 11 years ago

Right now the images are saved as data:image streams in the database. This makes the database grow very fast, and the gallery page is a bit slow to load since the browser cannot cache the images. There is a 240 MB limit on the free database and we are using 140 MB right now.
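For context, a `data:image` URI embeds the image bytes as base64 text, which is roughly a third larger than the raw binary; a minimal sketch of splitting one back into its MIME type and raw bytes (the tiny GIF payload is just an illustrative example):

```python
import base64

def decode_data_uri(uri):
    """Split a data: URI into its MIME type and the decoded raw bytes."""
    header, encoded = uri.split(",", 1)
    mime = header[len("data:"):].split(";")[0]
    return mime, base64.b64decode(encoded)

# 20 base64 characters decode to 14 raw bytes (~4/3 overhead in the DB).
uri = "data:image/gif;base64,R0lGODlhAQABAAAAACw="
mime, raw = decode_data_uri(uri)
print(mime, len(raw))  # → image/gif 14
```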

Looking through the Heroku add-ons I found Cloudinary. The starter plan seems to be enough for the page:

The code changes are simple and a test version is in my repository.

A problem that we could have is making backups. I make a backup of the database from time to time that is simply a dump of it. With another server for images I'll need to get those too.

I can think of two ways of migrating to the new image service:

What do you think?

emackey commented 11 years ago

Sounds interesting. Does that limit us to 25000 shaders? Could there be "sprite sheets" that combine a page of shaders into a single image?

jfontan commented 11 years ago

It does limit the number of images to 25,000, yes. The problem I see with sprite sheets is that they would need to be updated dynamically. We could ease that by assuming that really old shaders won't be modified, and only build sprite sheets for those.
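For what it's worth, a sprite sheet for a fixed page of old shaders only needs a deterministic index-to-offset mapping; a rough sketch, where the thumbnail dimensions, grid width, and shaders-per-sheet count are all assumptions, not values from the site:

```python
THUMB_W, THUMB_H = 160, 100   # assumed thumbnail size in pixels
COLUMNS = 10                  # assumed thumbnails per row in a sheet
PER_SHEET = 50                # assumed shaders per sprite sheet

def sprite_position(shader_index):
    """Return (sheet number, x offset, y offset) for a shader's thumbnail."""
    sheet, slot = divmod(shader_index, PER_SHEET)
    row, col = divmod(slot, COLUMNS)
    return sheet, col * THUMB_W, row * THUMB_H

print(sprite_position(0))    # → (0, 0, 0)
print(sprite_position(57))   # shader 57 lands on sheet 1 → (1, 1120, 0)
```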

I still think 25,000 shaders is enough for now; it's ten times the amount we have.

emackey commented 11 years ago

This sounds good as long as you can still get good backups. Will it be a lot harder?

jfontan commented 11 years ago

A new backup script must be written for the Cloudinary service. Right now backups are done manually from time to time by running a script. I have to automate this and make it smarter, so it only downloads the effects that are new or modified since the last backup (right now I download the entire database, which takes a lot of time). Adding the other service should not be too difficult. I'll start with the new backup scripts and then think about the Cloudinary backup. I'll tell you about my findings.
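The incremental step described above boils down to filtering effects by a modification timestamp; a minimal sketch against plain dicts, where the `modified_at` field name is an assumption about the schema (in MongoDB the equivalent query filter would be `{"modified_at": {"$gt": last_backup}}`):

```python
from datetime import datetime

def changed_since(effects, last_backup):
    """Return only the effects created or modified after the last backup run."""
    return [e for e in effects if e["modified_at"] > last_backup]

effects = [
    {"id": 1, "modified_at": datetime(2012, 1, 10)},
    {"id": 2, "modified_at": datetime(2012, 3, 5)},
]
recent = changed_since(effects, datetime(2012, 2, 1))
print([e["id"] for e in recent])  # → [2]
```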

jfontan commented 11 years ago

As you may have noticed, I ran into problems this week with the amount of data stored in the database. I've migrated all the images from MongoDB to Cloudinary. This solved the database problem (and made the backups much faster), but unfortunately it is not a long-term solution.

The problem with Cloudinary is that the free tier has a 1 GB transfer limit per month. I should have checked this before. In the last 2-3 days the page has already consumed 1.6 GB. I want a longer-term solution, and I've decided to use Amazon S3. It seems to be reliable and the prices are fairly cheap.
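A quick projection of that transfer rate shows why the free tier cannot hold (taking "2-3 days" as roughly 2.5):

```python
used_gb = 1.6
days = 2.5                        # assumed midpoint of "2-3 days"
monthly_gb = used_gb / days * 30  # extrapolate to a 30-day month
print(round(monthly_gb, 1))       # → 19.2 GB/month against a 1 GB limit
```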

The new migration will be much easier, as the database already stores image URLs; it does not matter where the images are hosted.
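Since the records already hold plain image URLs, the S3 move can be a straight host rewrite over the existing fields; a sketch where the bucket URL and the Cloudinary path are both illustrative, not the site's real ones:

```python
from urllib.parse import urlparse

S3_BASE = "https://glslsandbox.s3.amazonaws.com"  # hypothetical bucket URL

def to_s3_url(cloudinary_url):
    """Keep the image filename, swap in the new storage host."""
    filename = urlparse(cloudinary_url).path.rsplit("/", 1)[-1]
    return f"{S3_BASE}/{filename}"

print(to_s3_url("https://res.cloudinary.com/demo/image/upload/v1/123.png"))
# → https://glslsandbox.s3.amazonaws.com/123.png
```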

I'll try to do the change this week. Any other ideas on how to solve this will be welcome.

emackey commented 11 years ago

Sounds good, hopefully it will work well.

mrdoob commented 11 years ago

Have you looked at Google Cloud Storage? As far as I know it is cheaper than S3. Either way, I'd be happy to share the costs :)

emackey commented 10 years ago

This is implemented now, I think this issue can be closed.