tvdeyen opened this issue 10 years ago
For now I've monkey-patched dragonfly-s3_data_store to escape UTF-8 in the JSON, which does the trick:
```ruby
module Dragonfly
  class S3DataStore
    def meta_to_headers(meta)
      {'x-amz-meta-json' => JSON.generate(meta, ascii_only: true)}
    end
  end
end
```
The default implementation uses `MultiJson.encode(meta)`, which does not escape UTF-8.
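The effect of the `ascii_only` option can be seen with a small example (the metadata hash here is just an illustration):

```ruby
require 'json'

meta = { 'name' => 'München.jpg' }

# Default: the umlaut is emitted as raw UTF-8 bytes, which trips up
# header signing because HTTP headers are expected to be ASCII.
default_json = JSON.generate(meta)

# With ascii_only: true every non-ASCII character becomes a \uXXXX
# escape, so the header value is pure ASCII.
ascii_json = JSON.generate(meta, ascii_only: true)
```

Both strings parse back to the same hash; only the byte-level representation differs.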
I've opened an issue for fog at fog/fog#2942.
Hi guys, just chiming in because I ran into this problem as well. IMHO this really needs to be fixed in fog (see https://github.com/fog/fog/issues/2942), but just for reference I want to post our solution here, since `JSON.generate(meta, ascii_only: true)` did not work for us.
We decided to monkey-patch this in our Rails 3 app (running on Ruby 2.1.2) with a module:
```ruby
module Dragonfly
  module S3DataStoreUmlauts
    def meta_to_headers(meta)
      { 'x-amz-meta-json' => ActiveSupport::JSON.encode(meta) }
    end
  end
end
```
which we then use to extend our datastore like so:

```ruby
Dragonfly.app.datastore.extend Dragonfly::S3DataStoreUmlauts
```
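Since Dragonfly itself isn't needed to see why this works, here's a self-contained sketch of the pattern. `FakeS3DataStore` is a hypothetical stand-in for the real class, and the override uses the json gem's `ascii_only` option instead of ActiveSupport so the sketch runs on plain Ruby:

```ruby
require 'json'

# Hypothetical stand-in for Dragonfly's S3DataStore, which really
# lives in the dragonfly-s3_data_store gem.
class FakeS3DataStore
  def meta_to_headers(meta)
    { 'x-amz-meta-json' => JSON.generate(meta) }
  end
end

# Override module in the same shape as S3DataStoreUmlauts above,
# but escaping via the json gem rather than ActiveSupport::JSON.
module S3DataStoreUmlauts
  def meta_to_headers(meta)
    { 'x-amz-meta-json' => JSON.generate(meta, ascii_only: true) }
  end
end

store = FakeS3DataStore.new
store.extend(S3DataStoreUmlauts) # patches only this one instance

headers = store.meta_to_headers({ 'name' => 'Müller.jpg' })
# headers['x-amz-meta-json'] is now pure ASCII; other instances of
# the class keep the original, unescaped behaviour.
```

The nice property of `extend` over reopening the class is exactly that: only the one datastore instance is patched, so nothing else that happens to use the same class is affected.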
PS: This won't work with Rails 4; see https://github.com/rails/rails/commit/8f8397e0a4ea2bbc27d4bba60088286217314807 and https://github.com/rails/rails/issues/3727
Due to another signing error when the filename contains multiple spaces, I'm now using this:
```ruby
module Dragonfly
  class S3DataStore
    def meta_to_headers(meta)
      {'x-amz-meta-json' => JSON.generate(meta, ascii_only: true).gsub(' ', '\u0020')}
    end
  end
end
```
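For reference, here's what that `gsub` produces on a name with consecutive spaces. Note that in a single-quoted Ruby string `'\u0020'` is a literal six-character sequence (backslash, `u`, `0020`), i.e. a valid JSON escape for the space character, so the replacement leaves no raw spaces in the header value:

```ruby
require 'json'

meta = { 'name' => 'my  photo.jpg' } # two consecutive spaces

json = JSON.generate(meta, ascii_only: true)

# '\u0020' here is NOT a space: single quotes keep it as the literal
# escape sequence, which a JSON parser later turns back into a space.
escaped = json.gsub(' ', '\u0020')
```

A JSON parser round-trips the escaped string back to the original metadata, so nothing is lost on read.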
@markevans Maybe dragonfly should simply Base64-encode the JSON? There are so many cases where the generated headers can break either RFC 2616 Section 4.2 or S3's buggy interpretation of those rules.
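A minimal sketch of that idea (hypothetical, not what dragonfly currently does): Base64 output uses only `A-Z`, `a-z`, `0-9`, `+`, `/` and `=`, so the header value can never collide with HTTP token or whitespace rules, at the cost of needing a matching decode when the metadata is read back:

```ruby
require 'json'
require 'base64'

meta = { 'name' => 'Müller photo.jpg' }

# strict_encode64 produces a single line with no trailing newline,
# which is what you want inside a header value.
encoded = Base64.strict_encode64(JSON.generate(meta))
header = { 'x-amz-meta-json-b64' => encoded } # hypothetical header name

# Reading it back is symmetric:
decoded = JSON.parse(Base64.strict_decode64(encoded))
```

The downside is that existing objects stored with the plain-JSON header would need a migration or a fallback read path.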
Oops, deleted my last comment, which should've landed on the fog/fog-aws#160 ticket.
Uploading files to S3 fails if the file has an umlaut in its name. URI-encoding the filename before storing works:
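The original snippet isn't shown, but a minimal sketch of one way to URI-encode a filename, using `ERB::Util.url_encode` from the standard library (an assumption; any percent-encoder that covers non-ASCII bytes and spaces would do):

```ruby
require 'erb'

filename = 'Müller photo.jpg'

# Percent-encodes every byte outside the unreserved set, so both the
# umlaut (as UTF-8 bytes) and the space come out as %XX sequences.
safe = ERB::Util.url_encode(filename)
# => "M%C3%BCller%20photo.jpg"
```

Unlike `CGI.escape`, this encodes spaces as `%20` rather than `+`, which avoids ambiguity in S3 object keys.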