zedeus / nitter

Alternative Twitter front-end
https://nitter.net
GNU Affero General Public License v3.0
9.94k stars · 528 forks

sample config for reverse proxying nitter using caddy #456

Open Mr-Sheep opened 2 years ago

Mr-Sheep commented 2 years ago

Currently the wiki has sample configs for Apache and Nginx users, but none for Caddy (which is a nice little web server). Here is a sample Caddy reverse-proxy config for a Nitter instance listening on port 8080:

example.org

# ACME account email used for certificate registration
tls no@body.me

log {
  level debug
}

# proxy all requests to the local Nitter instance
reverse_proxy 127.0.0.1:8080
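Before pointing real traffic at it, the sample above can be checked without starting a server. A minimal sketch, assuming Caddy v2 is installed and using a scratch path under /tmp (both assumptions; adjust to your setup):

```shell
# Save the sample site block to a scratch file.
cat > /tmp/nitter.Caddyfile <<'EOF'
example.org

tls no@body.me
reverse_proxy 127.0.0.1:8080
EOF

# "caddy validate" adapts and provisions the config without serving anything;
# skip gracefully when the caddy binary is not on PATH.
if command -v caddy >/dev/null 2>&1; then
    caddy validate --config /tmp/nitter.Caddyfile --adapter caddyfile
else
    echo "caddy not installed; skipping validation"
fi
```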

Furthermore, I think all reverse-proxy sample configs should live on a single wiki page instead of separate pages per web server, which would tidy up the wiki a bit.

artemislena commented 2 years ago

This one works too (it's basically what we use, with a few more things configured):

nitter.example.com {
    header {
        Strict-Transport-Security "max-age=63072000"
        Referrer-Policy no-referrer
        X-Permitted-Cross-Domain-Policies none
        X-Content-Type-Options nosniff
        X-Robots-Tag "none"
        Permissions-Policy "accelerometer=(), ambient-light-sensor=(), autoplay=(), battery=(), camera=(), cross-origin-isolated=(), display-capture=(), document-domain=(), encrypted-media=(), execution-while-not-rendered=(), execution-while-out-of-viewport=(), fullscreen=(self), geolocation=(), gyroscope=(), keyboard-map=(), magnetometer=(), microphone=(), midi=(), navigation-override=(), payment=(), picture-in-picture=(self), publickey-credentials-get=(), screen-wake-lock=(), sync-xhr=(), usb=(), web-share=(), xr-spatial-tracking=()"
        Content-Security-Policy "default-src 'none'; script-src 'self' 'unsafe-inline'; img-src 'self'; style-src 'self' 'unsafe-inline'; font-src 'self'; object-src 'none'; media-src 'self' blob:; worker-src 'self' blob:; base-uri 'self'; form-action 'self'; frame-ancestors 'self'; connect-src 'self' https://*.twimg.com; manifest-src 'self'"
    }
    respond /robots.txt 200 {
        body "User-agent: *
Disallow: /
"
        close
    }
    reverse_proxy http://localhost:8080 {
        transport http {
            compression off
        }
    }
}
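The robots.txt body served above is meant to keep crawlers off the instance entirely. A quick sanity check with Python's stdlib robots.txt parser (nitter.example.com is just the hostname from the example config) confirms it disallows everything:

```python
# Verify that the robots.txt body from the Caddy "respond" block
# blocks every user agent on every path, using the stdlib parser.
from urllib.robotparser import RobotFileParser

body = "User-agent: *\nDisallow: /\n"

rp = RobotFileParser()
rp.parse(body.splitlines())

# No user agent may fetch any path on the instance.
assert not rp.can_fetch("Googlebot", "https://nitter.example.com/")
assert not rp.can_fetch("*", "https://nitter.example.com/jack")
print("robots.txt disallows all crawlers")
```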

Edit: Added this to the wiki.