rr- opened this issue 7 years ago
Such is life. I'm surprised FFmpeg still accepts this dimension - it must be close to the limit (which is about 10000*10000 pixels).
Best we could do is "pre-scaling" the image with libswscale.
Will that work with zooming and panning though? (e.g. if I zoom in, won't I be seeing downscaled-and-then-upscaled texture?)
You can try how that would work by using --vf=scale=...
(vf_scale's options are awkward: there's no scale factor, so you must pass an explicit new size.)
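Since vf_scale takes an absolute size rather than a factor, the caller has to compute the target dimensions itself. A minimal sketch of that arithmetic (the `fit_within` helper, the 4096-pixel limit, and the example dimensions are all illustrative, not mpv code):

```python
# Hypothetical helper: compute an explicit WxH to pass to --vf=scale=W:H,
# since vf_scale takes absolute dimensions rather than a scale factor.
def fit_within(width, height, max_dim=4096):
    """Downscale (width, height) to fit within max_dim, keeping aspect ratio."""
    scale = min(1.0, max_dim / max(width, height))
    return max(1, round(width * scale)), max(1, round(height * scale))

# e.g. a panorama much wider than a 4096-px texture limit (numbers illustrative):
print(fit_within(23623, 2298))   # → (4096, 398)
print(fit_within(100, 100))      # already fits → (100, 100)
```

The longer side is clamped to the limit and the shorter side is scaled proportionally, which is exactly the pair of numbers one would substitute into --vf=scale=W:H.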
I think current hardware has texture limits from 4K x 4K to 32K x 32K (depending on the GPU).
Unfortunately it looks pretty bad, especially for highly detailed images...
It would be cool to be able to fit the texture to the viewport rather than to the source, and use sws to fit and translate the image into that viewport texture. (I think that's what already happens with vo=x11 - of course with mmap()ed or similar pixels, rather than GL textures?)
Also, if you zoom a large image close enough on vo=x11, the image gets stretched in a weird way. (I think the image size overflows?) The behavior seems to produce different artifacts for large image width and for large image height.
Well sure, with vo_x11 this will work better, because it involves cropping the image before libswscale, and then scaling the cropped image to screen size.
I don't know what's overflowing, or whether it's a mpv or libswscale bug.
Because x11 seems to have no problem with images of such size, I'd use x11 by default, but x11 flickers a lot when switching between images, and xv (which seems to work similarly to x11) has suboptimal OSD scaling.
For this reason I think the optimal solution (good looks and presumably low effort) is to make -vo=opengl,... fall back to the next VO when opengl is unable to create the texture - that way, supplying -vo=opengl,x11 would trigger x11 only for extreme images.
(The other solution I have in mind is to make a backend / wrapper for OpenGL that uses SWS to crop and scale the image into a texture of reasonable size.)
The “proper” fix would be to use tiled textures for images too large to fit inside a single texture, but unfortunately getting this to work properly would be pretty complicated.
Too complicated, except maybe as an explicit downsample pass (which would disregard some correctness issues).
Out of curiosity, what would be the complications? I've been planning to write an image viewer and this issue had occurred to me. I figured the tiles would have to overlap by the radius of the scaler. What else would have to be done?
> I figured the tiles would have to overlap by the radius of the scaler.
I guess this might work too. Still pretty complicated...
They would have to overlap, yes. But working that logic into vo_opengl would still be stupidly complicated, given that vo_opengl has more than enough things going on.
That said, I think you could completely abstract it away and hide it behind a unified pseudotexture that works the same way as they do currently, with the only cost being the need to draw multiple tris with updated bindings.
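The tiling-with-overlap idea discussed above can be sketched in a few lines. This is not mpv code; it is a hypothetical layout computation under the assumption that a convolution-based scaler needs `radius` extra source pixels on each shared tile edge so it never samples past a boundary:

```python
# Sketch (not mpv code): split an oversized axis into source spans that each
# fit the GPU's texture limit, overlapping neighbours by the scaler's filter
# radius so the scaler never reads past a tile edge.
def tile_spans(length, max_tile, radius):
    """Return (offset, size) source spans along one axis. Each tile
    contributes `payload` useful pixels plus `radius` pixels of margin
    on every shared edge."""
    payload = max_tile - 2 * radius          # usable pixels per tile
    assert payload > 0
    spans = []
    pos = 0
    while pos < length:
        start = max(0, pos - radius)
        end = min(length, pos + payload + radius)
        spans.append((start, end - start))
        pos += payload
    return spans

# e.g. a 10000-px axis, a 4096-px texture limit, a Lanczos3-style radius of 3:
print(tile_spans(10000, 4096, 3))
# → [(0, 4093), (4087, 4096), (8177, 1823)]
```

Adjacent spans overlap by 2*radius pixels, so each tile can be scaled independently and the rendered payloads still butt together seamlessly; the remaining work (doing this in two dimensions and drawing one quad per tile with updated bindings) is the part the comment above calls complicated.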
Did you ever find a workaround for this? I'm running into the same issue when trying to use a playlist with mixed videos and images. Images larger than GL_MAX_TEXTURE_SIZE fail to load unless I am using x11, but the video performance on x11 is poor. Currently I am using the command interface to dynamically change vo when a large image needs to be displayed, but that can cause some glitching during the transition.
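The "switch vo through the command interface" workaround can be scripted over mpv's JSON IPC, assuming mpv was started with --input-ipc-server. A minimal sketch; `pick_vo`, the 4096-pixel threshold, and the example dimensions are illustrative choices, not mpv API:

```python
import json

# Sketch of the runtime-vo-switch workaround described above, assuming mpv
# was started with something like --input-ipc-server=/tmp/mpvsock.
# pick_vo and the 4096 threshold are hypothetical policy, not mpv API.
def pick_vo(width, height, max_texture=4096):
    """Choose x11 for images the GPU likely can't fit in one GL texture."""
    return "x11" if max(width, height) > max_texture else "opengl"

def set_vo_command(vo):
    """JSON IPC payload that sets the vo property at runtime."""
    return json.dumps({"command": ["set_property", "vo", vo]})

print(set_vo_command(pick_vo(23623, 2298)))
# One would write this line, newline-terminated, to the IPC socket.
```

Driving this per playlist entry still causes the transition glitch mentioned above, since the VO is torn down and recreated on each switch.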
mpv version and platform
Reproduction steps
Try to open a large file:
Expected behavior
I should see planet Mars
Actual behavior
I see black window instead
I can see the image fine on vo=x11; however, vo=opengl,x11 doesn't fall through onto x11. Suspected cause: my GPU can't handle GL textures this big, and the error happens too late to be caught by the code that would fall back to x11.
Log file
Sample files
https://upload.wikimedia.org/wikipedia/commons/2/28/'Calypso'_Panorama_of_Spirit's_View_from_'Troy'.jpg