I haven't added decoding yet because I didn't have that code already written. This is a decent proof of concept: it generates 100MB of binary data, writes it to a file on disk, then base64-encodes it as JSON to another file. The program pauses before exiting so I could check its memory usage; it uses only 2.1MB of RAM for all of this.
```crystal
require "base64"
require "json"

# Generate the giant binary file data
puts "Generating 100MB of binary data..."
File.open "foo.data", "w" do |f|
  (100 << 20).times do
    f.write_byte rand(UInt8)
  end
end

pp upload = FileUpload.new("output", Path["foo.data"])

puts "Writing JSON..."
File.open "output", "w" do |file|
  upload.to_json file
end
puts "Done."

# Pause before exiting to be able to check the amount of RAM used.
gets

struct FileUpload
  include JSON::Serializable

  getter filename : String

  @[JSON::Field(key: "data", converter: FileUpload::Base64Converter)]
  getter path : Path

  def initialize(@filename, @path)
  end

  module Base64Converter
    extend self

    # Output the file contents as base64 directly to the `JSON::Builder`'s `IO`
    def to_json(path : Path, json : JSON::Builder)
      File.open path do |file|
        json.string do |io|
          Base64.encode file, io
        end
      end
    end
  end
end
```
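For the decoding direction (not implemented in this PR), a converter could grow a `from_json` alongside `to_json`. A minimal sketch is below; note it buffers the whole base64 string in memory via `pull.read_string`, so it loses the streaming benefit the encoding side has. The temp-file destination is an assumption made purely for illustration, not part of the original design:

```crystal
# Hypothetical decoding counterpart, NOT part of this PR.
# Reads the base64 payload into memory, decodes it into a
# temporary file, and returns that file's Path.
module Base64Converter
  def self.from_json(pull : JSON::PullParser) : Path
    # Assumption: decoded data lands in a temp file chosen here.
    path = Path[File.tempname("upload", ".data")]
    File.open path, "w" do |file|
      # Base64.decode writes the decoded bytes to the given IO.
      Base64.decode pull.read_string, file
    end
    path
  end
end
```

A truly streaming decoder would need a way to read the JSON string value incrementally rather than as one `String`, which is presumably part of why decoding was left out here.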
Closing in favor of #14611. It's a more accurate and delightfully faster implementation. This PR was based on code I cranked out very quickly to handle a very specific scenario in one of my apps.
Fixes #14603