chourroutm opened 1 month ago
Huh, I don't have access to a Windows machine right now, but the first thing that jumps out at me is that Windows paths are `\` paths. I'm a little surprised this works:

`first_image = tiff.imread("annotated_data\data_labeled_chunk_44_31_13.tif")`

as the `\d` is not escaping the backslash (`\\d`).

See if this works?

`"file://D:\\Matthieu\\data_ngprec"`
The same code works on a Linux machine, so it seems to be related to paths. I can try to investigate that, and keep the issue open in the meantime.
This did not work, but Windows understands `/` as a separator in a POSIX URI (from the line `output_dir = output_dir.absolute().as_uri()`).

It turns out the files were written to a strange path: `D:\D\Matthieu\...` instead of `D:\Matthieu\...`.
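For context, here is what `as_uri()` yields for that directory (using `PureWindowsPath` so the snippet runs on any OS); the note at the end about the duplicated drive letter is only a guess on my part:

```python
from pathlib import PureWindowsPath

# as_uri() produces a proper file URI: three slashes after "file:" and
# forward-slash separators.
print(PureWindowsPath(r"D:\Matthieu\data_ngprec").as_uri())
# -> file:///D:/Matthieu/data_ngprec

# Stripping the "file://" prefix from this URI leaves "/D:/...", i.e. a
# leading slash in front of the drive letter. Depending on how
# cloudvolume/paths.py parses that, it is one plausible way an extra "D"
# component could creep into the on-disk path (D:\D\Matthieu\...).
```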
For full support on Windows, it might be interesting to rewrite `cloudvolume/paths.py` with `pathlib.Path` instead of `posixpath`. Would you be interested in a PR for this change?
Hi! That is pretty weird. I would appreciate more contributions for Windows support! Bear in mind that most CloudVolume usage is for e.g. `gs://` or `s3://`, which are POSIX-style paths, so using the OS path type for everything would be detrimental.
Ah yes, good point! I'll only tweak the handling of the `local://` paths then.
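Something like the sketch below is what I have in mind, just to illustrate the split; the names are hypothetical and this is not CloudVolume's actual `paths.py`:

```python
import posixpath
from pathlib import Path

# Hypothetical sketch of the dispatch discussed above -- not CloudVolume's
# actual paths.py. Cloud protocols keep POSIX ("/") joining; only local
# filesystem paths go through the OS-native path type.
CLOUD_PROTOCOLS = ("gs://", "s3://", "http://", "https://")

def join_path(base: str, *parts: str) -> str:
    if base.startswith(CLOUD_PROTOCOLS):
        # Bucket keys are always forward-slash separated, regardless of OS.
        return posixpath.join(base, *parts)
    if base.startswith("file://"):
        # Strip the protocol, join with pathlib so Windows drive letters and
        # separators are handled natively, then re-attach the protocol.
        local = Path(base[len("file://"):]).joinpath(*parts)
        return "file://" + str(local)
    # Bare local path with no protocol prefix.
    return str(Path(base).joinpath(*parts))

# Example usage (the second line prints with backslashes on Windows,
# forward slashes elsewhere):
print(join_path("gs://my-bucket/dataset", "layer"))   # gs://my-bucket/dataset/layer
print(join_path("file://D:/Matthieu/data_ngprec", "seg"))
```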
Hi, I have come up with a script to write chunks of 256^3 voxels into a precomputed segmentation, but no data is actually written to the disk (which is not full), not even the JSON info file. I am wondering whether it is related to using a Windows (W11) workstation (although they got it working in https://github.com/seung-lab/cloud-volume/issues/618).
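For context, the script follows the usual CloudVolume pattern of creating an info file and then writing chunks by slice assignment; a minimal sketch of that pattern (with placeholder resolution and sizes, not my actual script) looks like this:

```python
import numpy as np
from cloudvolume import CloudVolume

# Illustrative sketch only -- placeholder sizes/resolution, not the real script.
info = CloudVolume.create_new_info(
    num_channels = 1,
    layer_type   = "segmentation",
    data_type    = "uint32",
    encoding     = "raw",
    resolution   = [1, 1, 1],           # placeholder voxel size
    voxel_offset = [0, 0, 0],
    chunk_size   = [256, 256, 256],
    volume_size  = [1024, 1024, 1024],  # placeholder extent
)

vol = CloudVolume("file://D:/Matthieu/data_ngprec", info=info)
vol.commit_info()  # this step should write the JSON info file to disk

# Write one 256^3 chunk at the origin.
chunk = np.zeros((256, 256, 256), dtype=np.uint32)
vol[0:256, 0:256, 0:256] = chunk
```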
This is the script I have:
This is the output, which confirms the files were found: