You're looking at the reference documentation, not an example, so it will have everything in it.
Did you look at the example Caddyfile? https://github.com/caddyserver/cache-handler/blob/master/Caddyfile
It's much simpler.
Thank you for the ultra-fast response.
I did see that page. It is a bit better.
I think the larger problem is that this is really aimed at high-end, sophisticated caching.
I think next I am going to try https://github.com/sillygod/cdp-cache
It is the Caddy 1 caching ported to Caddy 2.
I am already using the Caddy 1 caching; it worked great for my needs, and it has all of the same concepts I am used to.
I think the Souin caching is just a lot more sophisticated, with more options.
When the cdp path fails, I will be back.
9 days to my conference.
@darkweak tells me that the simplest config is:
cache {
    ttl 10s
}
soon to be
cache
with a sane default TTL.
Hopefully that helps!
The fastest way to configure the cache-handler is with the ttl directive. The other options are not mandatory; that is specified in the Souin documentation and in the cache-handler's as well.
We declare some routes with different paths in the example Caddyfile to set/activate different directives, but you could also declare the Souin/cache-handler options once, at the top of the Caddyfile inside the global options block, and your instance will apply the same configuration to every endpoint you're targeting.
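For example, something like this (a sketch; the TTL and the site addresses are illustrative, and the order line is needed because cache is a third-party directive with no place in Caddy's default order):

{
    order cache before rewrite
    cache {
        ttl 10s
    }
}

site-one.example.com {
    reverse_proxy localhost:8081
}

site-two.example.com {
    reverse_proxy localhost:8082
}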
Feel free to ask more questions about the configuration or the behavior under the hood if needed.
@mholt
So the very simple
cache { ttl 10s }
soon to be just
cache
is exactly the kind of introductory documentation which a new person needs.
What do people cache? Well, usually it is a slow reverse proxy; in my case, a single Python process.
So next I tried
greenmaps.us {
    reverse_proxy http://localhost:8084
    cache {
        ttl 10s
    }
}
And of course I got the error message:
'cache' is not ordered, so it cannot be used here
So back to the example, and I tried your order command.
greenmaps.us {
    order cache before rewrite
    reverse_proxy http://localhost:8084
    cache {
        ttl 10s
    }
}
This time the error message is
unrecognized directive: order
Replacing the word "rewrite" with the word "reverse_proxy" yielded the same error message.
greenmaps.us {
    order cache before reverse_proxy
    reverse_proxy http://localhost:8084
    cache {
        ttl 10s
    }
}
Putting the order command before the server declaration also did not work.
order cache before reverse_proxy

greenmaps.us {
    reverse_proxy http://localhost:8084
    cache {
        ttl 10s
    }
}
Not much else I can try. Sorry if I was a bit long winded here.
The point is that there is a huge difference between what the newbie thinks and what the experienced cache-handler developer thinks. I am just trying to lay out my thinking so that you can better understand my perspective.
Let me add another comment here. There is a huge difference between the needs of large organizations and small organizations. When Caddy was first released it was perfect for a small organization: simple to install, it did automatic certificates, and it had simple caching. Then the big companies, the ones with the money, started using it. They have stronger demands. The simple Caddy 1 got tossed out, and a much more feature-rich Caddy 2 came into being. But in doing so it lost the simplicity desired by the small organizations, those with no Go developers on staff, nor the money to hire them.
Hopefully the simple introductory path to caching can be restored, with the ability to upgrade to the more sophisticated options when it eventually becomes needed.
Thank you for being so helpful.
order is a global option. It must go inside the global options block. See the Caddyfile structure docs: https://caddyserver.com/docs/caddyfile/concepts#structure
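In other words, roughly this shape (a sketch reusing the site details from your attempts above):

{
    order cache before rewrite
}

greenmaps.us {
    cache {
        ttl 10s
    }
    reverse_proxy http://localhost:8084
}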
That makes sense.
Thanks for the link.
Thanks for the fast response.
Hopefully I am getting closer.
But still no luck.
Next I tried:
{
    order cache before reverse_proxy
}

greenmaps.us {
    cache {
        ttl 10s
    }
    reverse_proxy http://localhost:8084
}
And the error message was:
caddy responded with error: HTTP 400: {"error":"loading config: loading new config: loading http app module: provision http: server srv0: setting up route handlers: route 0: loading handler modules: position 0: loading module 'subroute': provision http.handlers.subroute: setting up subroutes: route 0: loading handler modules: position 0: loading module 'cache': unknown module: http.handlers.cache"}
But in fact the module is listed:
caddy list-modules | grep cache
cache
http.handlers.cache
The cache directive should be enabled in the global options too I think.
{
    order ...
    cache {
        ttl your_default_ttl
    }
}
The fast responses are wonderful.
Sadly it is all still not working.
I moved the cache to the global options block (the first section), and tried both removing it from and leaving it in the greenmaps server section.
{
    order cache before reverse_proxy
    cache {
        ttl 10s
    }
}

greenmaps.us {
    reverse_proxy http://localhost:8084
}
and
{
    order cache before reverse_proxy
    cache {
        ttl 10s
    }
}

greenmaps.us {
    cache {
        ttl 10s
    }
    reverse_proxy http://localhost:8084
}
Both give me the same error message:
reload: sending configuration to instance: caddy responded with error: HTTP 400: {"error":"loading config: loading new config: loading cache app module: unknown module: cache"}
I also tried lots of other things. And just to make sure I am not being stupid, that module does exist.
caddy list-modules | grep cache
cache
http.handlers.cache
The good news is that once I get this simple example working, we will have a simple example for those who follow in my footsteps.
Make sure you are running the same Caddy binary that has the module plugged in.
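One way to rule that out (a sketch using xcaddy; the module path is the cache-handler repo linked above):

# build a Caddy binary with the cache-handler plugin compiled in
xcaddy build --with github.com/caddyserver/cache-handler

# confirm that this exact binary knows about the module
./caddy list-modules | grep cache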
That worked.
So here is the simplest Caddyfile for caching with a reverse proxy, for the beginners who come after me:
{
    order cache before reverse_proxy
    cache {
        ttl 10s
    }
}

your-domain.com {
    reverse_proxy http://localhost:8084/
}

Remember to change the domain name and port.
The last comment really helped.
Make sure you are running the same Caddy binary that has the module plugged in.
On FreeBSD the command:
pkg install caddy
does not just copy in the file, it also creates a service. So once I upgraded /usr/local/etc/caddy, I could not just do:
caddy reload
I had to do
service caddy stop
service caddy onestart
My bad. Live and learn.
Your support has been fantastic.
I will also say that your whole concept of automatic HTTPS is brilliant. If you take a look at the nginx configuration files generated by the Let's Encrypt certbot, there is a ton of repetitive junk in there, repeated for each domain I configure.
The top part of my configuration file is quite useful:

server {
    server_name demo.forestwiki.com;

    location / {
        include cache.config;
        include uwsgi_params;
        proxy_pass http://localhost:8085;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

And then the rest of the file is just repetitive junk that certbot puts in every configuration file for every domain I host. And there are quite a few. DRY (do not repeat yourself) is a common theme in programming.
    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/demo.forestwiki.com/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/demo.forestwiki.com/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}

server {
    if ($host = demo.forestwiki.com) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    listen 80;
    server_name demo.forestwiki.com;
    return 404; # managed by Certbot
}
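For comparison, the whole site in a Caddyfile is roughly this (a sketch; it leaves out the cache.config and uwsgi_params includes, and relies on Caddy's defaults for HTTPS, the HTTP-to-HTTPS redirect, and the forwarded headers):

demo.forestwiki.com {
    reverse_proxy localhost:8085
}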
So it would be good to add something like this to your marketing materials, where you compare Caddy to nginx. I fear that my users will want to use nginx. I would like a page to point them to. Maybe I will write it myself.
Thanks for all of the help.
I am so very very glad to be back on board with Caddy.
Of course I will probably still have some troubles. But getting much much closer.
Using Souin, you can now use only the cache directive to get the default values (the caching duration is 2m by default):

{
    order cache before [something]
    cache
}
I was in the same boat as PythonLinks; without this thread I would have given up. Please put a simple, complete example in the README. And please put some comments into it; e.g. the Caddyfile in this repo has exactly zero comments, so one is left to guess what each section is intended to demonstrate.
You will find it much easier to configure cdp-cache.
From my cdp-cache pull request:
This is an HTTP cache plugin for Caddy 2. The difference between this and cache-handler https://github.com/caddyserver/cache-handler is that this one is much easier to understand and configure. But it does not support a distributed cache, and it needs to be compiled with Go 1.17 and Caddy v2.4.6.
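For anyone following along, building it looks roughly like this (a sketch using xcaddy; the version pin reflects the requirement above, so double-check the cdp-cache README):

# compile cdp-cache into a Caddy v2.4.6 binary (needs Go 1.17)
xcaddy build v2.4.6 --with github.com/sillygod/cdp-cache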
Here is my cdp-cache pull request.
https://github.com/sillygod/cdp-cache/pull/55
I am running it on Caddy in Bastille containers on FreeBSD. If you choose to go down that route, I would be happy to advise.
I found messing with Go installations was trivial within Bastille. Every time I did it wrong, I just blew away the container and repeated.
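For the curious, the throwaway workflow looks roughly like this (a sketch from memory; the jail name, release, and IP are illustrative, so check the Bastille docs for exact syntax):

# create a fresh jail to experiment in
bastille create caddyjail 13.1-RELEASE 10.17.89.50

# get a shell inside it and install/build Caddy there
bastille console caddyjail

# if the Go or Caddy setup goes sideways, throw the jail away and start over
bastille destroy caddyjail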
Please check the README in the Souin repository. The cache-handler will be updated tomorrow to add the new features and let you set the following directive in the Caddyfile:

{
    order cache before rewrite
    cache
}

I never got an issue while trying to store more than 1MB, but if you need more examples, I can only ask you to wait one day until I write some others.
"add the new features"
Which new features?
I am a beginner at configuring Caddy. I have never tried Souin.
You have one example that tosses everything in. It is way too complex for a beginner to figure out. I feel like giving up.
It would be great to have a simple proxy cache server example, then a more complex one caching based on path or cookies, then a few other simple examples. I could read each example by itself, and soon I would understand what is going on. Right now there are just too many new things to learn to figure it out quickly.