Closed Koc closed 11 years ago
100 MB+ is a storage problem only on $5 shared hosting. Ancient browsers do not support gzip encoding. Let HTTP servers solve such problems.
Lyosha, what do browsers have to do with this? Since when do we make sitemaps for people? I'm saying we have 120 subdomains, each with 100+ sitemaps. Why take up so much space if we can compress it all?
gni?! (Don't worry, I used Google Translate. Please don't forget to use English.)
@Koc stick to English, this is an international resource. Let me explain my last comment: you cannot guarantee gzip support for any client: neither browsers, nor the Google bot, nor any other crawler. Gzip support is also a big problem for any 3rd-party library: there's the zlib library, but it's not enabled by default. It's possible to shell out to the unix gzip command, but what about Windows users? To sum up:
In the end it's open source: if you need it, you fork it and implement it. If you think it's useful for the community, contribute it upstream via a pull request.
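On the portability point above: generating a `.gz` sitemap does not actually require the unix `gzip` command, since most runtimes ship a zlib binding in their standard library. A minimal sketch in Python (used here purely for illustration; the library under discussion may well be in another language, and the payload below is hypothetical):

```python
import gzip

# Hypothetical sitemap payload, just for illustration.
xml = b'<?xml version="1.0" encoding="UTF-8"?>\n<urlset></urlset>\n'

# gzip is part of the standard library, so this behaves the same
# on Windows and Unix -- no external `gzip` binary needed.
compressed = gzip.compress(xml)

# A .gz sitemap must round-trip back to the original XML.
assert gzip.decompress(compressed) == xml
```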
P.S. Костя, don't worry, be happy
@esion I wonder what language you used?
@mente I speak and think in French. Obviously it's better to communicate in English here :)
Well, about gzip I'm not sure what should be done; we don't have sitemaps as large as yours. If I remember correctly, the HTTP request headers give info about what encodings are allowed, so we may provide a gzipped answer... or not. And as you said @mente, maybe it's the job of the HTTP server.
For now I can't implement any solution.
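The header-based negotiation mentioned above could look roughly like this: check the client's `Accept-Encoding` request header and only gzip the response when gzip is advertised. A sketch in Python with a hypothetical helper name (`negotiate_body`); note the substring check is naive and ignores q-values, which a real implementation should parse:

```python
import gzip

def negotiate_body(accept_encoding, body):
    """Return (body, content_encoding) based on the client's
    Accept-Encoding header. Falls back to the identity encoding
    when gzip is not advertised (or the header is missing)."""
    if "gzip" in (accept_encoding or "").lower():
        return gzip.compress(body), "gzip"
    return body, None

body, enc = negotiate_body("gzip, deflate", b"<urlset></urlset>")
assert enc == "gzip"

body, enc = negotiate_body(None, b"<urlset></urlset>")
assert enc is None
```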
Of course this issue is low priority. A --gzip option would be optional. I will open a PR after #28 is merged, to avoid conflicts.
Also, gzipping is described in the official protocol specification: http://www.sitemaps.org/protocol.html
Great! Can I peek into your code before you send a PR for it? I'm wondering how you solved the problems I mentioned before.
Not sure about Apache, but nginx supports gzip compression of content on the fly, even for FastCGI backends. So what's the point of this issue?
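For reference, the on-the-fly compression mentioned here is only a few lines of nginx configuration (a typical snippet, not this project's config; adjust types and thresholds for your setup). Note it compresses responses in transit but does not shrink the files stored on disk, which was the original storage concern:

```nginx
gzip            on;
gzip_types      application/xml text/xml;  # sitemaps are XML
gzip_min_length 1024;                      # skip tiny responses
gzip_proxied    any;                       # also compress proxied/FastCGI responses
```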