-
### Describe the issue
I saw #103 asked a similar question, but I'm not sure I understand how this works with respect to Equation 5 from the first LLMLingua paper. If I have a query with `condition_i…
-
## Expected behavior
Timber's WebP "lossless quality" setting produces genuinely lossless output.
## Actual behavior
Seems like the `towebp` filter with quality 100 doesn't produce lossless output, compared to ImageMagick …
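For what it's worth, WebP's lossy mode at quality 100 is still lossy; true lossless WebP is a separate bitstream (`VP8L`). One way to check what a file actually contains is to read the chunk fourcc at byte offset 12 of the RIFF container (the helper below is a sketch, not part of Timber):

```python
def webp_mode(path: str) -> str:
    """Classify a WebP file by its first chunk fourcc.

    WebP files are RIFF containers: bytes 0-3 are 'RIFF', bytes 8-11 are
    'WEBP', and bytes 12-15 name the first chunk: 'VP8 ' (lossy),
    'VP8L' (lossless), or 'VP8X' (extended, which may wrap either).
    """
    with open(path, "rb") as f:
        header = f.read(16)
    if header[:4] != b"RIFF" or header[8:12] != b"WEBP":
        raise ValueError("not a WebP file")
    return {b"VP8 ": "lossy", b"VP8L": "lossless", b"VP8X": "extended"}.get(
        header[12:16], "unknown"
    )
```

If this reports `lossy` for a quality-100 file, the encoder was never asked for the lossless codec in the first place.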
-
**Describe the bug**
In my experience, adaptive compression sometimes increases the compression level while processing low-entropy data and afterwards fails to decrease it, even though CPU becomes the bottlenec…
-
# Summary
It would be great if all of the endpoints queried by Argo CD UI supported compression.
https://github.com/argoproj/argo-cd/issues/4226 covered most of the endpoints; however, that chan…
-
Currently kitsune's upload backend just takes the file and (if it's an allowed filetype) places it in the specified storage backend. This behavior is of course correct and acceptable, but it'd be nice …
-
Followup to #4464. Re-uploading a NAR with a different compression method invalidates the (cached) .narinfo, so it's better if we get rid of the `Compression` field and autodetect the compression meth…
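Autodetection by magic bytes is cheap, since each common format has a distinctive prefix. A sketch of the idea (function name hypothetical, covering formats typically used for NARs):

```python
# Leading magic bytes for compression formats commonly applied to NARs.
MAGIC = [
    (b"\xfd7zXZ\x00", "xz"),
    (b"BZh", "bzip2"),
    (b"\x28\xb5\x2f\xfd", "zstd"),
    (b"\x1f\x8b", "gzip"),
]


def detect_compression(data: bytes) -> str:
    """Return the compression method implied by the file's leading bytes."""
    for magic, name in MAGIC:
        if data.startswith(magic):
            return name
    return "none"  # assume an uncompressed NAR otherwise
```

With this, the stored `Compression` field becomes redundant: the method can be re-derived from the first few bytes of the uploaded object whenever the .narinfo is generated.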
-
We have an API layer written in Python that queries Pinot via the Swagger API. We found the results below for response time and response file size. We wanted to reduce the re…
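As a rough illustration of what transport-level gzip buys on repetitive JSON (the payload below is made up, not actual Pinot output):

```python
import gzip
import json

# Hypothetical, repetitive result set standing in for a query response.
rows = [{"dim": "us-east", "metric": i, "ts": 1700000000 + i} for i in range(1000)]
raw = json.dumps(rows).encode()
packed = gzip.compress(raw)

# Tabular JSON repeats its keys on every row, so gzip usually shrinks it
# severalfold; the exact ratio depends on the data.
print(f"raw={len(raw)} bytes, gzipped={len(packed)} bytes")
```

The same reduction applies on the wire if the client sends `Accept-Encoding: gzip` and the server honors it, with only a modest CPU cost on both ends.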
-
Please bear with my non-technical nature & understanding of nostr. In a former professional life the ability to compress certain data types made a lasting impression on me.
Per my limited understa…
-
### What is the problem this feature would solve?
Currently all HTTP responses are uncompressed.
### What is the feature you are proposing to solve the problem?
Would be good to enable gzip compres…
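Whatever the server framework, the negotiation itself is small; a framework-agnostic sketch (function name and return shape are illustrative):

```python
import gzip


def negotiate_gzip(body: bytes, accept_encoding: str) -> tuple[bytes, dict]:
    """Gzip the response body when the client advertises support.

    Returns the (possibly compressed) body and the headers to merge into
    the response. A real implementation would also skip small bodies and
    already-compressed Content-Types.
    """
    tokens = [t.split(";")[0].strip() for t in accept_encoding.split(",")]
    if "gzip" not in tokens:
        return body, {}
    return gzip.compress(body), {
        "Content-Encoding": "gzip",
        "Vary": "Accept-Encoding",
    }
```

The `Vary: Accept-Encoding` header matters once caches are involved, so compressed and uncompressed variants are stored separately.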
-
I'm unsure about this one, mostly a question. We could in theory add under-the-hood snappy compression, but maybe that would break the bucket-sorting a bit, at least for blob txs. If the originally st…