09:58 < junderwood> "1350120904.623 0 127.0.0.1 NONE/417 4000 PUT http://habitat.habhub.org/habitat/e4e19e5d77a73852f2f9e23836628804 - NONE/- text/html"
This document ID has absolutely no mention in the logs. More bizarre still, that timestamp is 09:35:04 exactly, and we have:
[Sat, 13 Oct 2012 09:35:04 GMT] [info] [<0.12645.8>] 86.176.187.22 - - GET /_uuids?count=100 200
[Sat, 13 Oct 2012 09:35:04 GMT] [info] [<0.12645.8>] 86.176.187.22 - - PUT /habitat/e4e19e5d77a73852f2f9e23836627b5f 201
I discovered a blog article from 2010: "Many applications rely on using a special HTTP/1.1 header (Expect: 100-continue) when doing a POST, which is not happily supported by Squid." (curl uses 100-continue.)
Edited to add: "Squid-3.2 claims HTTP/1.1 support. Squid v3.1 claims HTTP/1.1 support but only in sent requests (from Squid to servers). Earlier Squid versions do not claim HTTP/1.1 support by default because they cannot fully handle Expect:100-continue, 1xx responses, and/or chunked messages."
I think we can close this as not our fault. It's only a concern on old versions of Squid that don't handle Expect: properly.
Very odd! The squid error page says
[It could be] "HTTP/1.1 Expect: feature is being asked from an HTTP/1.0 software."
And at the top squid gives the request it thinks it got... PUT /habitat/3fed393000f9e14d5a6cb06454f1761e HTTP/1.0
HTTP/1.0 and an expect header! So maybe it is curl's fault...
However, I then downloaded and looked at the tcpdumps (thank you, by the way; these are perfect). It turns out curl actually sent PUT /habitat/3fed393000f9e14d5a6cb06454f1761e HTTP/1.1
How bizarre!
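Putting the pieces together (a reconstruction from the log entries and captures above, not a verbatim trace), the exchange looks roughly like this:

```
Client (curl in dl-fldigi) -> Squid:
    PUT /habitat/3fed393000f9e14d5a6cb06454f1761e HTTP/1.1
    Host: habitat.habhub.org
    Expect: 100-continue

Squid -> client (never reaches CouchDB; cf. the NONE/417 access log entry):
    HTTP/1.0 417 Expectation Failed
```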
I think it's probably a squid problem then... but I'm curious as to why this only breaks dl-fldigi (maybe other software is extra tolerant? I don't know.) If I find time I might spin up a virtual machine and test this stuff with that exact squid version myself.
I found some mention in the Squid documentation of a configuration option (ignore_expect_100) to ignore Expect: 100-continue. Does that help? Otherwise a bypass for habitat.habhub.org is probably fine. I will try to remember to let you know if the IP address is going to change, but hopefully it won't be too difficult to work out what's going on and update your rule if it does change unexpectedly.
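If I'm reading the docs right, the directive would look something like this in squid.conf (a sketch; some builds may require the --enable-http-violations configure flag for it to be available):

```
# squid.conf: silently drop the Expect: 100-continue header from client
# requests instead of failing them with 417 Expectation Failed
ignore_expect_100 on
```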
I'll add a note to the wiki.
Thank you for your help,
Daniel
On 18 October 2012 21:07, John Underwood <john@jcu.me.uk> wrote:
OK. Finally got round to checking.
Squid does in fact serve an error page to dl-fldigi, which I obtained via Wireshark (attached). It does appear to be the HTTP/1.1 Expect feature that is causing the problem. I have added a couple of tcpdumps for your amusement.
I tried the same thing in the office with the same result. Again, we use a squid proxy. Both are running on CentOS 6, where squid 3.1.10 is the stock version. Presumably CentOS 5 uses an older version. Even the bleeding-edge squid 3.2 doesn't appear to support HTTP/1.1 fully.
My guess is that the majority of squid installations out there will be incompatible with dl-fldigi. Maybe I'm the only person using squid and dl-fldigi. I can work around the problem by bypassing squid for habhub.org (providing you don't change the IP address). However, it's probably worth putting a warning somewhere on the Wiki just in case someone else tries it (or adding a slightly more helpful error to dl-fldigi). I'm guessing that any work-around would mean not using the "Expect" feature, which I guess will cause you grief.
John
[ Email from John ]
Daniel,
Good call on the squid configuration option. That seems to have sorted it out. I will try modifying the config on my work web cache as well.
Thanks
John
John provided me with a couple of tcpdumps with the ignore_expect_100 squid option on. Notes:
I vote we switch it off: it's quite easy; just add a custom HTTP header "Expect:" with an empty value and curl will not send it.
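A minimal libcurl sketch of that trick (the URL, document ID, and body below are placeholders, not dl-fldigi's actual upload code):

```cpp
// Minimal sketch: suppress curl's automatic "Expect: 100-continue" on a PUT
// by overriding the header with an empty value.
#include <curl/curl.h>
#include <cstring>

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl)
        return 1;

    // Passing a header name with no value tells libcurl to omit that header
    // entirely, so the body is sent immediately and a proxy that mishandles
    // 100-continue (e.g. older Squid) never sees an Expect header.
    struct curl_slist *headers = curl_slist_append(NULL, "Expect:");
    headers = curl_slist_append(headers, "Content-Type: application/json");

    const char *body = "{\"example\": \"payload\"}";
    curl_easy_setopt(curl, CURLOPT_URL,
                     "http://habitat.habhub.org/habitat/some_doc_id");
    curl_easy_setopt(curl, CURLOPT_CUSTOMREQUEST, "PUT");
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDSIZE, (long)strlen(body));
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);

    CURLcode res = curl_easy_perform(curl);

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return (res == CURLE_OK) ? 0 : 1;
}
```

The command-line equivalent is passing -H 'Expect:' to curl.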
+1 on turning it off
According to junderwood on #highaltitude, uploading through his squid proxy causes HTTP 417 errors.