keichan34 / exfile

File upload persistence and processing for Phoenix / Plug
MIT License

Cache processing results #51

Open asiniy opened 7 years ago

asiniy commented 7 years ago

Currently, all processed files (especially images) are regenerated on every request.

That's not a problem if you only have a couple of images to render on a page, but it becomes one when a page needs 20-30 images generated at once. They are returned one after another, even on a powerful VPS.

In my opinion, exfile should have a caching feature, something like:

config :exfile, Exfile,
  cache: %{
    size: "10GB",
    path: "/tmp/exfile-other" # default is `/tmp/exfile-cache-#{env}`
  }

All processed images would be stored under `path` and marked with the time they were last used. A supervisor would monitor the total size of that folder, and if it exceeds the limit, it would delete the oldest unused images in it.
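For illustration, here is a minimal sketch of such a sweeper (the `Exfile.CacheSweeper` module name, the option keys, and the use of file atime as the "last used" mark are assumptions for this proposal, not existing exfile APIs):

# Hypothetical sketch -- not part of exfile. Periodically trims the cache
# directory to a byte limit by deleting the least recently used files.
defmodule Exfile.CacheSweeper do
  use GenServer

  @sweep_interval :timer.minutes(5)

  def start_link(opts), do: GenServer.start_link(__MODULE__, opts, name: __MODULE__)

  @impl true
  def init(opts) do
    state = %{path: Keyword.fetch!(opts, :path), max_bytes: Keyword.fetch!(opts, :max_bytes)}
    schedule_sweep()
    {:ok, state}
  end

  @impl true
  def handle_info(:sweep, %{path: path, max_bytes: max_bytes} = state) do
    files =
      path
      |> Path.join("**")
      |> Path.wildcard()
      |> Enum.filter(&File.regular?/1)
      |> Enum.map(fn file ->
        %File.Stat{size: size, atime: atime} = File.stat!(file, time: :posix)
        {file, size, atime}
      end)
      # Oldest access time first, so the least recently used files are evicted first.
      # (Assumes the filesystem actually updates atime.)
      |> Enum.sort_by(fn {_file, _size, atime} -> atime end)

    total = files |> Enum.map(fn {_file, size, _atime} -> size end) |> Enum.sum()
    evict(files, total, max_bytes)

    schedule_sweep()
    {:noreply, state}
  end

  # Stop once the folder is back under the limit (or nothing is left to delete).
  defp evict(_files, total, max_bytes) when total <= max_bytes, do: :ok
  defp evict([], _total, _max_bytes), do: :ok

  defp evict([{file, size, _atime} | rest], total, max_bytes) do
    File.rm(file)
    evict(rest, total - size, max_bytes)
  end

  defp schedule_sweep, do: Process.send_after(self(), :sweep, @sweep_interval)
end

Something like this could be started from the application's supervision tree with the configured path and byte limit.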

@keichan34 WDYT?

keichan34 commented 7 years ago

Sorry about the late reply on this.

Since files are fingerprinted and the processing inputs are in the URL, requests can be cached by a normal HTTP cache (nginx and/or any pull-based CDN like CloudFront, CloudFlare, Fastly, etc.). I don't think it's within Exfile's scope to cache outputs when an HTTP cache can do it just as well.

asiniy commented 7 years ago

@keichan34 interesting idea! Let me test it; I'll respond in ~1 week.

scarfacedeb commented 7 years ago

@asiniy you could also put nginx in front of the Elixir app and cache exfile responses with it.
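For reference, a minimal sketch of that setup using nginx's proxy_cache (the cache path, zone name, 30-day TTL, Phoenix port 4000, and the /attachments/ mount point are assumptions to adapt to your deployment):

# Hypothetical nginx sketch: cache exfile responses in front of the Phoenix app.
proxy_cache_path /var/cache/nginx/exfile levels=1:2 keys_zone=exfile_cache:10m
                 max_size=10g inactive=30d use_temp_path=off;

server {
  listen 80;
  server_name example.com;

  # Only the exfile routes are cached; everything else is proxied straight through.
  location /attachments/ {
    proxy_pass http://127.0.0.1:4000;
    proxy_cache exfile_cache;
    proxy_cache_valid 200 30d;
    add_header X-Cache-Status $upstream_cache_status;
  }

  location / {
    proxy_pass http://127.0.0.1:4000;
  }
}

Since the URLs are fingerprinted, long cache lifetimes are safe: a changed file or different processing inputs produce a different URL.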