gmenier closed this issue 2 years ago
Workaround for me (if you only want plain text): in net_services_curl.inc,
patch curl_easy_setopt(handle, CURLOPT_ACCEPT_ENCODING, "deflate, gzip"); to curl_easy_setopt(handle, CURLOPT_ACCEPT_ENCODING, "text");
Definitely a problem with deflate, I suppose. Regards, G.
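To see why that patch helps: advertising an Accept-Encoding value the server cannot honour makes it skip compression entirely, so the (apparently broken) deflate/gzip decoding path is never exercised. A minimal sketch of that negotiation, using Python's standard library against a local test server instead of Gambas/libcurl (the server, URL, and header values here are illustrative assumptions, not the reporter's setup):

```python
import gzip
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

BODY = b"hello from the test server\n"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Compress only if the client advertised gzip support,
        # mimicking the server side of Content-Encoding negotiation.
        accept = self.headers.get("Accept-Encoding", "")
        if "gzip" in accept:
            payload = gzip.compress(BODY)
            self.send_response(200)
            self.send_header("Content-Encoding", "gzip")
        else:
            payload = BODY
            self.send_response(200)
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

# Advertise gzip: the body comes back compressed and must be decoded.
req = urllib.request.Request(url, headers={"Accept-Encoding": "deflate, gzip"})
with urllib.request.urlopen(req) as resp:
    assert resp.headers.get("Content-Encoding") == "gzip"
    compressed_ok = gzip.decompress(resp.read()) == BODY

# Advertise something the server will not compress to ("text", as in the
# patch): the body arrives as plain text and no decoder is involved at all.
req = urllib.request.Request(url, headers={"Accept-Encoding": "text"})
with urllib.request.urlopen(req) as resp:
    plain_ok = (resp.headers.get("Content-Encoding") is None
                and resp.read() == BODY)

server.shutdown()
print(compressed_ok, plain_ok)
```

So the patch does not fix the decompression bug; it just routes around it by never receiving compressed content.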
You could try calling garbage collection explicitly; if all memory has been used up, decompression can fail.
Hi,
Trying to use net.curl.
It is OK for some calls (get + cleanup), then:
....
Accept: */*
Accept-Encoding: deflate, gzip
< HTTP/1.1 200 OK
< X-Powered-By: PHP/7.3.11-1+0~20191026.48+debian10~1.gbpf71ca0
< Expires: Thu, 14 May 2020 07:54:11 GMT
< Last-Modified: Thu, 14 May 2020 07:54:11 GMT
< Cache-Control: proxy-revalidate, no-store
< Pragma: no-cache
< Content-Encoding: gzip
< Vary: Accept-Encoding
< Content-type: text/html; charset=UTF-8
< Content-Length: 1590
< Date: Thu, 14 May 2020 07:54:11 GMT
< Server: lighttpd/1.4.53
<
I also get:
curl_easy_perform failed: Unrecognized or bad HTTP Content or Transfer-Encoding
After this error, all subsequent gets behave the same (i.e. badly ;-) ).
Tried with different web sites, with no success.
I suspect zlib.
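For what it's worth, "deflate" itself is a frequent culprit in this kind of failure: some servers send raw DEFLATE (RFC 1951) where clients expect a zlib-wrapped stream (RFC 1950), and a strict decoder then rejects the data. A small illustration of that ambiguity with Python's zlib (a sketch of the format mismatch, not of libcurl's actual decoder):

```python
import zlib

data = b"hello world" * 10

# zlib-wrapped "deflate" (RFC 1950) -- what most HTTP clients expect.
wrapped = zlib.compress(data)

# Raw deflate (RFC 1951) -- what some servers actually send.
comp = zlib.compressobj(wbits=-15)
raw = comp.compress(data) + comp.flush()

assert zlib.decompress(wrapped) == data  # the wrapped stream decodes fine

# The raw stream fails the zlib header check with default settings...
try:
    zlib.decompress(raw)
    raw_rejected = False
except zlib.error:
    raw_rejected = True

# ...and only decodes when the decoder is told to expect raw deflate.
raw_decoded = zlib.decompress(raw, wbits=-15)
print(raw_rejected, raw_decoded == data)
```

A decoder that does not try both framings will report the stream as bad even though the bytes are perfectly valid deflate data.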
Any ideas?
Regards Gildas