Indeed, inside libplacebo, pl_renderer is tied to a single video source. (It maintains frame mixing state, tone mapping state, and so on - having separate video streams would break this abstraction quite badly.)

I can see some potential paths for this feature to exist, but I would like to know a bit more about your use case - are you using libplacebo inside your own application (C), inside ffmpeg, or inside mpv? Handling multiple input streams will require some support from the wrapping application, and if it's your own code, it might be easier to just handle it in C using pl_shader_custom.
Thank you very much for your answer. I am still in the research phase; if I use it, it will be in my own C program.
I have the same question for my own C program. It seems that a custom shader only supports one texture input - how can I use two textures?
Is there any way to render an FFmpeg frame to a custom OpenGL texture? Then I could do it like:
ffmpeg_frame -> libplacebo -> rgba_texture -> custom_shader -> custom_render
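For the ffmpeg_frame -> libplacebo step, here is a minimal sketch of how an AVFrame could be brought into libplacebo using the helpers from <libplacebo/utils/libav.h>. This is not code from the thread - the variable names and the surrounding setup (gpu, decoded AVFrame) are assumptions:

#include <libavutil/frame.h>
#include <libplacebo/utils/libav.h>

pl_gpu gpu;       // assumed to exist (e.g. from pl_opengl_create() or pl_vulkan_create())
AVFrame *avframe; // assumed to be a decoded frame from your FFmpeg code

struct pl_frame image;
pl_tex planes[4] = {0}; // plane textures, reused by libplacebo across frames

// Upload (or map, for hardware frames) the AVFrame and fill in the
// pl_frame metadata (colorspace, chroma location, etc.)
if (!pl_map_avframe(gpu, &image, planes, avframe)) {
    // handle failure
}

// ... use &image as the source frame for pl_render_image() ...

pl_unmap_avframe(gpu, &image); // release the mapping once rendered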
I'm not sure I understood your question, but from C, you can use a pl_renderer per video stream to render to two pl_tex targets, and then you can use pl_shader_custom() to blend them together. For example:
pl_gpu gpu;
pl_dispatch dp;
pl_renderer r1, r2;
pl_tex tex1, tex2, dst_tex;
...
const float vs[4][2] = {
    { 0, 0 },
    { 1, 0 },
    { 0, 1 },
    { 1, 1 },
};
struct pl_frame frame_rgba = {
    .num_planes = 1,
    .planes = {{
        .components = 4,
        .component_mapping = {0, 1, 2, 3},
    }},
};
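// Render each input stream into its own intermediate texture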
frame_rgba.planes[0].texture = tex1;
pl_render_image(r1, &img1, &frame_rgba, &render_params);
frame_rgba.planes[0].texture = tex2;
pl_render_image(r2, &img2, &frame_rgba, &render_params);
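// Blend the two intermediate textures into dst_tex with a custom shader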
pl_shader sh = pl_dispatch_begin(dp);
pl_shader_custom(sh, &(struct pl_custom_shader) {
    .body = "vec4 c1 = texture(tex1, pos); \n"
            "vec4 c2 = texture(tex2, pos); \n"
            "color = mix(c1, c2, 0.5); \n",
    .input = PL_SHADER_SIG_NONE,
    .output = PL_SHADER_SIG_COLOR,
    .num_descriptors = 2,
    .descriptors = {{
        .desc.name = "tex1",
        .desc.type = PL_DESC_SAMPLED_TEX,
        .binding.object = tex1,
    }, {
        .desc.name = "tex2",
        .desc.type = PL_DESC_SAMPLED_TEX,
        .binding.object = tex2,
    }},
    .num_vertex_attribs = 1,
    .vertex_attribs = {{
        .attr.name = "pos",
        .attr.fmt = pl_find_vertex_fmt(gpu, PL_FMT_FLOAT, 2),
        .data = { vs[0], vs[1], vs[2], vs[3] },
    }},
});
bool success = pl_dispatch_finish(dp, pl_dispatch_params(
    .shader = &sh,
    .target = dst_tex,
));
Replace the shader body with your own code for however you want to combine the two images.
So I can do it like: ffmpeg_frame -> libplacebo -> rgba_texture -> custom_shader -> custom_render,
and use pl_opengl_wrap() to wrap rgba_texture as a pl_tex.
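For reference, wrapping an existing OpenGL texture as a libplacebo render target could look roughly like the sketch below. The names (gl_tex, width, height) are placeholders, the GL enums come from whatever GL loader the application already uses, and the pl_opengl_wrap_params fields should be checked against <libplacebo/opengl.h>:

#include <libplacebo/opengl.h>

pl_gpu gpu;      // e.g. pl_opengl_create(...)->gpu
unsigned gl_tex; // existing OpenGL texture object (GL_TEXTURE_2D, GL_RGBA8)
int width, height;

// Wrap the GL texture so libplacebo can render into it
pl_tex rgba_texture = pl_opengl_wrap(gpu, &(struct pl_opengl_wrap_params) {
    .texture = gl_tex,
    .target  = GL_TEXTURE_2D,
    .iformat = GL_RGBA8,
    .width   = width,
    .height  = height,
});

// Use it as the target plane of a pl_frame, as in the example above
struct pl_frame target = {
    .num_planes = 1,
    .planes = {{
        .components = 4,
        .component_mapping = {0, 1, 2, 3},
        .texture = rgba_texture,
    }},
};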
I would like to ask whether libplacebo's custom shaders support this scenario: blending two video streams with a custom shader to achieve blend effects, such as transitions, which requires manipulating the textures of both sources. It seems that the stages and hooks defined for custom shaders only apply to a single input source? I am not sure whether my understanding is correct.
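On the transition question: building on the example above, the blend factor does not have to be a constant - it could be passed in as a shader variable and updated every frame. A rough sketch, reusing the tex1/tex2/pos setup from the earlier snippet; the progress variable, its GLSL name, and the .variables fields are assumptions to check against <libplacebo/shaders/custom.h>:

float progress = 0.3f; // 0.0 = only stream 1, 1.0 = only stream 2, updated per frame

pl_shader sh = pl_dispatch_begin(dp);
pl_shader_custom(sh, &(struct pl_custom_shader) {
    .body = "vec4 c1 = texture(tex1, pos); \n"
            "vec4 c2 = texture(tex2, pos); \n"
            "color = mix(c1, c2, progress); \n",
    .input = PL_SHADER_SIG_NONE,
    .output = PL_SHADER_SIG_COLOR,
    .num_variables = 1,
    .variables = {{
        .var = pl_var_float("progress"),
        .data = &progress,
        .dynamic = true, // the value changes every frame
    }},
    // descriptors and vertex_attribs exactly as in the example above
    ...
});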