copumpkin opened this issue 8 years ago
Can you try dtrace or whatever the darwin strace equivalent is?
93142/0x349fdeb: open("/private/tmp/test\0", 0x0, 0x0) = 7 0
93142/0x349fdeb: fstat64(0x7, 0x7FFF5D991A90, 0x0) = 0 0
93142/0x349fdeb: read(0x7, "\0", 0x140000000) = -1 Err#22
93142/0x349fdeb: close(0x7) = 0 0
93142/0x349fdeb: ioctl(0x2, 0x4004667A, 0x7FFF5D99213C) = 0 0
93142/0x349fdeb: write(0x2, "\033[31;1merror:\033[0m reading from file: Invalid argument\n\0", 0x36) = 54 0
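An aside on that trace (my own reading, not from the thread): the nbyte argument in the failing read() call is 0x140000000, i.e. 5 GiB, which exceeds INT_MAX; Darwin's read(2) is documented to fail with EINVAL when nbyte is out of range, which would explain the Err#22 above. The arithmetic:

```python
# nbyte from the failing read() call in the dtrace output above
nbyte = 0x140000000

print(nbyte)              # 5368709120 bytes
print(nbyte // 2**30)     # 5 (GiB)
print(nbyte > 2**31 - 1)  # True: larger than INT_MAX, a plausible EINVAL trigger
```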
Why is nix passing such a huge number for the buffer size? One sec...
Ah, right, readFull. This is another example of the problem with Nix having to read an entire file into memory at once before writing it to the store. @copumpkin any idea of the max you can read from a file in a single go?
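Nix's readFull is C++, but the shape of a fix is simple: cap each read() at a size every platform accepts and loop until the requested amount arrives. A hypothetical Python sketch (the function name and chunk size are mine, not Nix's):

```python
import os

CHUNK = 1 << 20  # 1 MiB per syscall; any value well below INT_MAX avoids the Darwin EINVAL

def read_full(fd, size):
    """Read exactly `size` bytes from fd using bounded reads (hypothetical sketch)."""
    parts = []
    remaining = size
    while remaining > 0:
        buf = os.read(fd, min(remaining, CHUNK))
        if not buf:
            raise EOFError("unexpected end of file")
        parts.append(buf)
        remaining -= len(buf)
    return b"".join(parts)
```

Note this only avoids the EINVAL; the whole file still ends up in memory, which is the separate problem raised above.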
Could these problems be easily side-stepped by mmap-ing the whole file instead of reading it? (It would only really help on 64-bit systems, though.)
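As a sketch of the mmap idea (hypothetical Python, not Nix code): map the file read-only and feed the mapping to the hasher, so the kernel pages data in on demand instead of one giant read().

```python
import hashlib
import mmap
import os

def hash_file_mmap(path):
    """Hash a file through a read-only mapping (sketch of the mmap suggestion)."""
    with open(path, "rb") as f:
        size = os.fstat(f.fileno()).st_size
        if size == 0:
            # mmap(2) cannot map a zero-length file
            return hashlib.sha256(b"").hexdigest()
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
            return hashlib.sha256(m).hexdigest()
```

Mapping a multi-gigabyte file needs a correspondingly large address space, hence the caveat that this only helps on 64-bit systems.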
@shlevy no idea, sorry.
@vcunat perhaps!
Alternatively, could we do incremental reads for hashing? By copying to a temporary location on the same filesystem, then moving into the desired location once we know the final hash?
There's no reason we need to read the whole thing in, for sure.
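The incremental approach from the previous comment could look roughly like this (hypothetical Python sketch; add_to_store, the chunk size, and the naming scheme are all made up for illustration): hash in bounded chunks while copying to a temporary file on the store's filesystem, then rename into place once the hash is known.

```python
import hashlib
import os
import tempfile

def add_to_store(src, store_dir, chunk=1 << 20):
    """Hash `src` incrementally while copying it onto the store's filesystem,
    then rename into place once the final hash is known (hypothetical sketch)."""
    h = hashlib.sha256()
    fd, tmp = tempfile.mkstemp(dir=store_dir)  # same filesystem, so rename is cheap
    try:
        with open(src, "rb") as f, os.fdopen(fd, "wb") as out:
            while True:
                buf = f.read(chunk)
                if not buf:
                    break
                h.update(buf)   # hash and copy in one constant-memory pass
                out.write(buf)
        dest = os.path.join(store_dir, h.hexdigest() + "-" + os.path.basename(src))
        os.rename(tmp, dest)    # atomic on the same filesystem
        return dest
    except BaseException:
        os.unlink(tmp)
        raise
```

This trades temporary extra disk space (and I/O) for constant memory, which is exactly the trade-off debated further down in the thread.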
Might be worth trying https://github.com/NixOS/nix/pull/619
I just ran into this problem on Darwin trying to follow these instructions:
***
Unfortunately, we cannot download file xcode_5.1.dmg automatically.
Please go to https://developer.apple.com/downloads/ to download it yourself, and add it to the Nix store
using either
nix-store --add-fixed sha256 xcode_5.1.dmg
or
nix-prefetch-url --type sha256 file:///path/to/xcode_5.1.dmg
So I downloaded the file (2.1 GB), but following the instructions failed:
$ nix-store --add-fixed sha256 ~/Downloads/xcode_5.1.dmg
warning: dumping very large path (> 256 MiB); this may run out of memory
error: writing to file: Invalid argument
So I guess what I want to know is: what is the plan to fix this problem? The linked pull request seems to be over two years old.
Someone needs to bring the PR up to date and test it. Happy to help with the latter :)
Will the pull request be merged if somebody brings it up to date?
cc @edolstra
/cc @copumpkin @edolstra
Is there any fix in the works for this? I hit this pretty consistently when working with the new XCode requireFile stuff.
I think @edolstra's "controversial" fix for another large-file issue would help here too. It's controversial because it would result in more I/O and temporarily twice as much disk space (i.e., move stuff into a temporary location while hashing it, then move it into place in the store), but it would run in constant memory.
Edit: see https://github.com/NixOS/nix/pull/2206#issuecomment-396064576; I also think he has a commit in a fork, which I can't find right now, that implements something similar to that PR.
I marked this as stale due to inactivity.
FWIW nix-store --add-fixed --recursive sha256 dirName
successfully finishes for a ~9GB directory for me, but only if I have enough free disk space (since it uses a lot of memory and therefore starts swapping to disk). I'm using Nix 2.3.7.
It uses up all available memory (just below 16G), starts swapping to disk, and finishes in about an hour or so.
I marked this as stale due to inactivity.
This should be fixed in Nix 2.11.1, if not earlier.
This uses constant memory:
nix-store --add-fixed sha256 file.large
This seems to work in constant memory (internally it uses Store::addToStoreSlow):
nix store prefetch-file file://$PWD/file.large
whereas this will use lots of memory:
nix store add-file ./file.large
This distinction may need to be documented or clarified. My assumption is that nix store add-file is optimized for quick, small adds that fit in memory, while nix store prefetch-file is optimized for streaming from the network in constant memory.
This is what happens (see the dtrace output near the top of the thread):
I assume one of the file APIs Nix is using is hitting some sort of 32-bit size limit on Darwin?
cc @shlevy @edolstra