sourcelevel / faraday-http-cache

A Faraday middleware that respects HTTP cache

All requests are uncacheable #98

Closed mityakoval closed 6 years ago

mityakoval commented 6 years ago

Hi!

I am trying to boost the performance of my web app by caching some of the API requests with HTTP cache. I've tried both Rails' standard memory cache and a custom-written store, but every request comes back as uncacheable:

HTTP Cache: [GET /parks] miss, uncacheable

I could not find any information about what prevents caching. This is my first attempt, so I don't really know where to start digging.
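For context, the middleware logs "uncacheable" when the response carries no freshness information. The check below is a simplified sketch, not the gem's actual implementation (the real logic also considers status codes, validators such as ETag/Last-Modified, and other Cache-Control directives), but it captures why a response with no caching headers always misses:

```ruby
# Simplified sketch of the cacheability decision: a GET response with no
# Cache-Control max-age and no Expires header is treated as uncacheable.
def cacheable?(headers)
  cache_control = headers['Cache-Control'].to_s.downcase
  return false if cache_control.include?('no-store')
  cache_control.match?(/max-age=\d+/) || headers.key?('Expires')
end

puts cacheable?({})                                            # false: no headers at all
puts cacheable?('Cache-Control' => 'public, max-age=3600')     # true
puts cacheable?('Expires' => 'Thu, 01 Dec 2030 16:00:00 GMT')  # true
```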

Faraday config:

@conn = Faraday.new(url: URL) do |builder|
    builder.use :http_cache, store: Rails.cache, logger: ActiveSupport::Logger.new(STDOUT)
    builder.adapter Faraday.default_adapter
end

Rails config:

if Rails.root.join('tmp/caching-dev.txt').exist?
    config.action_controller.perform_caching = true

    config.cache_store = :memory_store
    config.public_file_server.headers = {
      'Cache-Control' => "public, max-age=#{2.days.seconds.to_i}"
    }
else
    config.action_controller.perform_caching = false

    config.cache_store = :null_store
end

I also have access to the API's nginx server config:

upstream puma {
  server ...
}

server {
  listen 8000 default_server deferred;

  root ...
  access_log ...
  error_log ...

  location ^~ /assets/ {
    gzip_static on;
    expires max;
    add_header Cache-Control public;
  }

  try_files $uri/index.html $uri @puma;
  location @puma {
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header Host $http_host;
    proxy_redirect off;

    proxy_pass http://puma;
  }

  error_page 500 502 503 504 /500.html;
  client_max_body_size 1500M;
  keepalive_timeout 10;
}
mityakoval commented 6 years ago

Nginx wasn't sending any caching headers on the proxied responses, so the middleware treated them as uncacheable. Setting expires 1h; within location @puma fixed the issue.
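Concretely, the fix amounts to adding an expires directive inside the proxy location from the config above, which makes nginx emit Expires and Cache-Control: max-age=3600 headers that the middleware can honor:

```nginx
location @puma {
  proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
  proxy_set_header Host $http_host;
  proxy_redirect off;

  # Emit "Expires" and "Cache-Control: max-age=3600" on proxied
  # responses, letting faraday-http-cache treat them as fresh for 1h.
  expires 1h;

  proxy_pass http://puma;
}
```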