libsdl-org / SDL

Simple DirectMedia Layer
https://libsdl.org
zlib License

SDL_GetRenderTarget returning a different value than the one set with SDL_SetRenderTarget #9176

mystborn closed this issue 7 months ago

mystborn commented 7 months ago

I'm working on a camera system, and throughout my draw calls I check whether the renderer is currently using the camera texture. However, SDL_GetRenderTarget does not return the pointer that was passed to SDL_SetRenderTarget.

I am on Windows 10, using the x64 debug version of SDL.

SDL_RendererInfo info;
SDL_GetRendererInfo(renderer, &info);
SDL_SetRenderTarget(renderer, camera->render_target);

/* These two pointers should match, but they don't. */
printf("%p == %p\n", (void*)camera->render_target, (void*)SDL_GetRenderTarget(renderer));
printf("%s\n", info.name);

// Output on my system: 
000001ECE78C59C0 == 000001ECE78C5900
direct3d12

Maybe a d3d12 bug?

EDIT:

Digging into the source, it looks like SDL_GetRenderTarget returns the renderer's internal native texture rather than the texture that was set, which feels like unintended behaviour.
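
For anyone hitting the same thing, one possible workaround is to track the current target yourself instead of comparing against SDL_GetRenderTarget(). A minimal sketch of that idea; the helper names here are made up for illustration and aren't from SDL or my project:

static SDL_Texture* current_target = NULL;

/* Wrap SDL_SetRenderTarget so the requested texture is remembered,
   sidestepping whatever SDL_GetRenderTarget returns. */
static void set_render_target_tracked(SDL_Renderer* renderer, SDL_Texture* texture)
{
    SDL_SetRenderTarget(renderer, texture);
    current_target = texture;
}

static SDL_bool is_render_target(SDL_Texture* texture)
{
    return current_target == texture ? SDL_TRUE : SDL_FALSE;
}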

slouken commented 7 months ago

Can you create a more complete minimal example? I suspect it has to do with creating a render target in a texture format that isn't supported, which hasn't really been tested much.
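
As a rough, untested sketch of what I mean by checking the supported formats, using the same SDL_RendererInfo struct you're already querying:

SDL_RendererInfo info;
SDL_GetRendererInfo(renderer, &info);

/* If RGBA8888 isn't in this list, SDL creates a hidden "native"
   texture in a supported format behind the render target. */
SDL_bool rgba8888_supported = SDL_FALSE;
for (Uint32 i = 0; i < info.num_texture_formats; i++) {
    printf("format %u: %s\n", i, SDL_GetPixelFormatName(info.texture_formats[i]));
    if (info.texture_formats[i] == SDL_PIXELFORMAT_RGBA8888) {
        rgba8888_supported = SDL_TRUE;
    }
}
printf("RGBA8888 supported natively: %s\n", rgba8888_supported ? "yes" : "no");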

mystborn commented 7 months ago

Here is a complete example. Is there a texture format that you'd recommend?

#include <SDL3/SDL.h>

#include <stdio.h>
#include <stdlib.h>

#define WIDTH 800
#define HEIGHT 450

int main(int argc, char** argv) {
    if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_EVENTS | SDL_INIT_GAMEPAD) < 0) {
        printf("SDL could not initialize! SDL_Error: %s\n", SDL_GetError());
        return EXIT_FAILURE;
    }

    SDL_Window* window = SDL_CreateWindow(
        "Minimal example",
        WIDTH,
        HEIGHT,
        SDL_WINDOW_OPENGL | SDL_WINDOW_RESIZABLE);

    if (!window) {
        printf("Window could not be created! SDL_Error: %s\n", SDL_GetError());
        return EXIT_FAILURE;
    }

    SDL_Renderer* renderer = SDL_CreateRenderer(window, NULL, SDL_RENDERER_ACCELERATED);

    if (!renderer) {
        printf("Renderer could not be created! SDL_Error: %s\n", SDL_GetError());
        return EXIT_FAILURE;
    }

    SDL_RendererInfo info;
    SDL_GetRendererInfo(renderer, &info);

    printf("Renderer: %s\n", info.name);

    /* Create a small render-target texture and make it the active target. */
    SDL_Texture* texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888, SDL_TEXTUREACCESS_TARGET, 400, 225);
    SDL_SetRenderTarget(renderer, texture);

    /* These two pointers should be identical, but they are not. */
    printf("%p == %p\n", (void*)texture, (void*)SDL_GetRenderTarget(renderer));

    SDL_DestroyTexture(texture);
    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();

    return EXIT_SUCCESS;
}

// OUTPUT:
// Renderer: direct3d11
// 000002E9FF407980 == 000002E9FF407760

P.S. Not sure why the renderer here is d3d11 when in my actual project it's d3d12, but the behaviour is the same. The same thing happens when the renderer is set to opengl as well.
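
If it helps to reproduce against a specific backend, the driver can be pinned before creating the renderer. A sketch, assuming the standard SDL_HINT_RENDER_DRIVER hint behaves the same in SDL3:

/* Pin the render driver so the standalone repro also uses D3D12.
   Alternatively, pass "direct3d12" as the name argument to SDL_CreateRenderer. */
SDL_SetHint(SDL_HINT_RENDER_DRIVER, "direct3d12");
SDL_Renderer* renderer = SDL_CreateRenderer(window, NULL, SDL_RENDERER_ACCELERATED);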

slouken commented 7 months ago

Typically unless you need to lock the texture and access the pixels in a specific format, you'll want to use the first format in the list provided by SDL_GetRendererInfo(). For many renderers this is SDL_PIXELFORMAT_ARGB8888.
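
Something along these lines, as a sketch (sizes copied from the example above):

SDL_RendererInfo info;
SDL_GetRendererInfo(renderer, &info);

/* Create the render target in the renderer's preferred (first-listed)
   format so SDL doesn't wrap it in a separate native texture. */
SDL_Texture* target = SDL_CreateTexture(renderer,
                                        info.texture_formats[0],
                                        SDL_TEXTUREACCESS_TARGET,
                                        400, 225);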