Closed nedelceo closed 12 months ago
Besides an error about not being able to remove one of the files in your workspace/cache, I don't see anything wrong in the output. I'll be honest and say that I haven't tested the LEO resampling functionality for a long long time since EUMETSAT's team did most of the development on it. Maybe @ameraner has some ideas. I can't think of a reason why the 1km resolution target area would work but the 500m resolution target would not.
Just for the record, I am getting this error (`...Can't delete numpy memmap cache file...`) also with 1km resolution, but in that case the image is displayed correctly.
Yeah, sorry, I was just pointing out that there weren't any other errors. This error is unrelated.
Indeed, in principle there is no reason why the 500m grid should not work compared to the 1km grid - I just tried and indeed it works for me...
It's a long shot, and it may be wrong, but hardware could be the problem - we have seen in the past that some graphics cards cannot handle high-res grids. When we observed it, it was throwing explicit errors, so this is still strange. Could you check the output of `glxinfo -l | grep MAX_TEXTURE_SIZE` on your system? If it is less than 22272, that could be an indication.
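To make the comparison concrete, here is a minimal sketch of the check being described: whether a full-disk grid fits in a single GPU texture. The grid widths are assumptions based only on the numbers quoted in this thread (22272 px for the FDSS 500m grid, half that for 1km):

```python
# Rough check: does a full-disk grid fit in one GPU texture?
# Grid widths below are assumptions taken from the numbers in this thread.
GRIDS = {
    "MTG FCI FDSS 500m": 22272,
    "MTG FCI FDSS 1km": 11136,
}

def fits_in_texture(grid_px, max_texture_size):
    """True if a grid_px x grid_px image fits in a single texture."""
    return grid_px <= max_texture_size

for name, px in GRIDS.items():
    # 16384 is the MAX_TEXTURE_SIZE reported later in this thread
    status = "fits" if fits_in_texture(px, 16384) else "too large"
    print(f"{name}: {status} for MAX_TEXTURE_SIZE=16384")
```

With a 16384-px limit, the 1km grid fits but the 500m grid does not, which matches the behavior reported here.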
As another test, to exclude issues with LEO, you could maybe try to resample a SEVIRI granule onto the `MTG FCI FDSS 500m` area - for that, remember to select e.g. `Nearest Neighbour` in the Open File Wizard when you select a dataset (otherwise, with the current default of `None`, the SEVIRI data will only be reprojected to the MTG projection, but the pixel resolution will remain the same... I know, confusing).
I did not find a way to check the max texture size on the Windows machine I am using. But I tried to resample SEVIRI data to the `MTG FCI FDSS 500m` area and no image was displayed (same as for LEO data). So you are probably right, @ameraner - the problem might be my machine.
@nedelceo If you can run a command in the command line, you should be able to use the Python inside the SIFT bundle and do `SIFT_directory/bin/python -c "import vispy; print(vispy.sys_info())"`. We should really add this information to the SIFT startup process.
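If that startup check were added, it could be as simple as pulling the value out of the `vispy.sys_info()` text. A minimal sketch, assuming the output contains a line of the form `MAX_TEXTURE_SIZE: 16384` (the format quoted later in this thread):

```python
import re

def parse_max_texture_size(sys_info_text):
    """Extract MAX_TEXTURE_SIZE from vispy.sys_info() output.

    The exact "MAX_TEXTURE_SIZE: <int>" line format is an assumption
    based on the value quoted in this thread; returns None if absent.
    """
    match = re.search(r"MAX_TEXTURE_SIZE:\s*(\d+)", sys_info_text)
    return int(match.group(1)) if match else None

sample = "GL version: ...\nMAX_TEXTURE_SIZE: 16384\n"
print(parse_max_texture_size(sample))  # 16384
```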
Thanks for the hint @djhoese. `vispy.sys_info` returned `MAX_TEXTURE_SIZE: 16384`, which is less than the 22272 mentioned above.
What type of system do you have (Mac, Windows PC, Linux)? How old is it? One of the reasons I gave to the EUMETSAT team when tiled imagery was changed to entire-image-in-GPU was that it should save GPU memory and use an overall smaller texture. There wasn't necessarily a concern about texture size for "modern", decently powerful machines. So if there is a class of machine in use by users that doesn't meet these expectations, we need to collect a list of them.
I use a Lenovo ThinkStation 330 with Windows 10 Pro, bought probably in 2020.
A ThinkStation P330 desktop/tower? Do you know what GPU you have? Some information should have been included in vispy's sys_info output (at least about the drivers). When I googled I saw a model with 16GB of RAM and an Nvidia GPU. This assumes you haven't (on purpose or by accident) been using any Intel integrated GPUs.
@ameraner This makes me think maybe we shouldn't be loading entire images into GPU memory. A 2020 desktop is not a "light" machine.
@nedelceo does it help/change something if, in your configs, you set

```yaml
display:
  image_mode: tiled_geolocated
```

(see https://sift.readthedocs.io/en/latest/configuration/display.html)? With this mode only part of the array is sent to the GPU, which might help. You should see `Tiling` appear in the progress bar when zooming in/out or loading new data.
If it helps, how is the rendering performance for you in this mode?
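The idea behind a tiled mode is to stay under the GPU's texture limit by uploading only fixed-size chunks of the full array. A minimal NumPy sketch of that chunking step (illustrative only - SIFT's `tiled_geolocated` mode is more involved, with level-of-detail selection, per-tile geolocation, and caching):

```python
import numpy as np

def split_into_tiles(image, tile_size=512):
    """Yield (row, col, tile) chunks, each at most tile_size per side.

    Illustrative sketch, not SIFT's actual implementation: each tile
    is small enough to upload as its own GPU texture.
    """
    rows, cols = image.shape[:2]
    for r in range(0, rows, tile_size):
        for c in range(0, cols, tile_size):
            yield r, c, image[r:r + tile_size, c:c + tile_size]

# A 1100x900 image splits into 3 row bands x 2 column bands = 6 tiles
image = np.zeros((1100, 900), dtype=np.float32)
tiles = list(split_into_tiles(image))
print(len(tiles))  # 6
```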
@djhoese you are right. I have a machine with 16 GB of RAM and an NVIDIA GPU. As you assumed, I haven't been using the NVIDIA GPU. When I set `.../SIFT_2.0.0b0/python.exe` to be run with the NVIDIA GPU, it solved the issue.
@ameraner setting `image_mode` to `tiled_geolocated` also solved the issue.
Thanks to both of you - each piece of advice solves the problem.
Oh great. So maybe we can close this as your system (with the Nvidia GPU enabled) generally meets our minimum and expected specs for a modern system. Integrated GPUs are always more limited so while I'd love to support them more easily out of the box (by default), I think that would hurt the performance and experience seen by users with better machines.
@ameraner feel free to reopen if you want to use this as a reminder to compare performance on different machines.
Related to https://github.com/vispy/vispy/issues/758
No image is displayed when I try to plot VIIRS or MODIS data using the `MTG FCI FDSS 500m` area. With the `MTG FCI FDSS 1km` area the image is displayed correctly.

Plotting VIIRS data at 500m resolution: