Closed: theSoenke closed this pull request 1 year ago
Hi @theSoenke, thanks for the PR! I think it'd be most efficient to move reading the file to C++ (where it can be read directly into a std::vector<char>).
@ankane That sounds like a better idea. Loading these chunks in Ruby didn't feel quite right, so I adapted the PR to use C++.
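For context, a minimal sketch of reading a file directly into a std::vector<char> on the C++ side might look like this (the function name and error handling are illustrative, not the actual PR code):

```cpp
#include <fstream>
#include <stdexcept>
#include <string>
#include <vector>

// Read the entire file into a std::vector<char> in one call,
// avoiding Ruby-side chunked reads and intermediate string copies.
std::vector<char> read_file(const std::string& path) {
  // Open in binary mode, positioned at the end so tellg() gives the size
  std::ifstream f(path, std::ios::binary | std::ios::ate);
  if (!f) {
    throw std::runtime_error("Failed to open file: " + path);
  }
  std::streamsize size = f.tellg();
  f.seekg(0, std::ios::beg);

  // Allocate the full buffer up front and read the file in one pass
  std::vector<char> buffer(static_cast<size_t>(size));
  if (size > 0 && !f.read(buffer.data(), size)) {
    throw std::runtime_error("Failed to read file: " + path);
  }
  return buffer;
}
```

Since the buffer size comes from tellg() on the opened stream, this isn't limited by Ruby's read path; the only constraint is available memory.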
Benchmark:
require "benchmark"
require "torch"

puts Benchmark.measure {
  10.times do
    Torch.load("./model.pth")
  end
}
cpp:
 84.285722  36.134938 120.420660 (132.447201)
ruby:
 54.531321  67.399236 121.930557 (222.239866)
(columns: user CPU, system CPU, total CPU, real time in seconds)
CPU time was roughly the same in my tests, but it made a significant difference in real (wall-clock) time.
Awesome, thanks @theSoenke!
Loading large models (> 2.5 GB) in Ruby with File.binread fails with an error. This change should allow loading arbitrarily large models as long as enough memory is available.