Closed brandonros closed 3 months ago
Maybe I misunderstand, but couldn't you implement YUVSource
for your type, and then invoke Encoder::encode
with that type? That shouldn't require any conversion or copying.
Could you share a bit more about what you're trying to do?
That said, YUVBuffer
and the RGB / YUV conversion infrastructure might need a bit of cleanup, both API- and performance-wise.
I was under the impression I couldn't implement a trait pulled in from another crate/dependency (your code) on a type in my own code. It looks like I can. Apologies for the confusion, and thank you.
But yes, if by default there was something like
struct NonRgbYuvSource<'a> {
    width: i32,
    height: i32,
    y: &'a [u8],
    u: &'a [u8],
    v: &'a [u8],
    y_stride: i32,
    u_stride: i32,
    v_stride: i32,
}

impl<'a> NonRgbYuvSource<'a> {
    fn new(width: i32, height: i32, y: &'a [u8], u: &'a [u8], v: &'a [u8], y_stride: i32, u_stride: i32, v_stride: i32) -> NonRgbYuvSource<'a> {
        NonRgbYuvSource { width, height, y, u, v, y_stride, u_stride, v_stride }
    }
}

impl<'a> openh264::formats::YUVSource for NonRgbYuvSource<'a> {
    fn width(&self) -> i32 { self.width }
    fn height(&self) -> i32 { self.height }
    fn y(&self) -> &[u8] { self.y }
    fn u(&self) -> &[u8] { self.u }
    fn v(&self) -> &[u8] { self.v }
    fn y_stride(&self) -> i32 { self.y_stride }
    fn u_stride(&self) -> i32 { self.u_stride }
    fn v_stride(&self) -> i32 { self.v_stride }
}
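If the YUV data arrives as one contiguous I420 buffer rather than three separate slices, the plane slices for a source like the one above can be carved out of it by offset arithmetic. A minimal sketch (the helper name `split_i420` is mine, and it assumes tightly packed planes, i.e. stride equals width, with no row padding):

```rust
/// Split a tightly packed I420 (YUV 4:2:0 planar) buffer into its
/// three plane slices. Hypothetical helper, not part of openh264-rs.
/// Assumes stride == width for Y and stride == width / 2 for U and V.
fn split_i420(buf: &[u8], width: usize, height: usize) -> (&[u8], &[u8], &[u8]) {
    let y_size = width * height; // full-resolution luma plane
    let c_size = y_size / 4;     // each chroma plane is subsampled 2x2
    assert_eq!(buf.len(), y_size + 2 * c_size, "buffer size mismatch");
    let (y, rest) = buf.split_at(y_size);
    let (u, v) = rest.split_at(c_size);
    (y, u, v)
}

fn main() {
    // A 4x2 frame: 8 luma bytes plus 2 bytes each for U and V.
    let frame = [0u8; 8 + 2 + 2];
    let (y, u, v) = split_i420(&frame, 4, 2);
    assert_eq!((y.len(), u.len(), v.len()), (8, 2, 2));
}
```

The three returned slices can then be handed to a borrowing source such as NonRgbYuvSource above without copying.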
I think something like that would help.
Thank you for all that you do on this package, really appreciate it.
FWIW, the constraint is that you can't implement an external trait on an external type. If either the trait or the type is defined in your crate (not counting type aliases), you can implement it.
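As a concrete illustration of that rule (my own minimal example, using std::fmt::Display as the external trait rather than YUVSource):

```rust
use std::fmt;

// A local type defined in this crate...
struct Resolution {
    width: i32,
    height: i32,
}

// ...so implementing the external trait `std::fmt::Display` on it is
// allowed: one of the two (the type) is local. What the orphan rule
// forbids is implementing a trait from crate A on a type from crate B
// when both are external to your crate.
impl fmt::Display for Resolution {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{}x{}", self.width, self.height)
    }
}

fn main() {
    let r = Resolution { width: 1920, height: 1080 };
    assert_eq!(r.to_string(), "1920x1080");
}
```

The same logic is why implementing openh264's YUVSource on your own struct works: the trait is external, but the struct is yours.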
I added YUVBuffer::from_vec(yuv: Vec<u8>, width: usize, height: usize)
now, which is practically zero-cost if you already have YUV data, and a YUVSlices
type similar to the one above (mostly for feedback purposes for the next release).
I have some data that is already in YUV format. It would be nice if we could pass it to YUVBuffer in a way that doesn't require converting it from YUV to RGB and then back to YUV again. Given the code here https://github.com/ralfbiedert/openh264-rs/blob/4aa93296d78b9e8bd91f5856eaf1a540231bb5da/openh264/src/formats/mod.rs#L5 I do not see any way to provide a YUVSource at the moment that doesn't go through an intermediate RGB layer.