bytedeco / javacv

Java interface to OpenCV, FFmpeg, and more

Error bad src image pointers when trying to push streaming video to server #2096

Open lushangkan opened 1 year ago

lushangkan commented 1 year ago

I'm trying to develop a Minecraft mod that captures OpenGL video and OpenAL audio directly and pushes them to a server, without going through screen capture.

When I use FFmpegFrameRecorder to push the audio and video data to an RTMP server, this error occurs: [STDERR]: Error: [swscaler @ 0000021ef0eb8e00] bad src image pointers.

After the error, I opened the stream URL in playback software and noticed that the picture was solid green, although the sound played fine.

Some Google searching turned up a Stack Overflow post suggesting that some function might be corrupting the buffer's integrity, so I removed all the calls that free the buffer's memory, but the error persisted. I then switched the video buffer allocation from MemoryUtil (LWJGL's off-heap allocator) to Java's own ByteBuffer.allocateDirect (ByteBuffer.allocate is not an option because OpenGL requires direct buffers), but still to no avail.

I need help, and I sincerely thank everyone who offers it: my code repository

Here's my code snippet:

Streamer.java

public void update(int height, int width, ByteBuffer video, ShortBuffer audio) {
    if (isRecording) {
        try (Frame frame = new Frame()) {
            // Audio
            if (audio != null) {
                frame.sampleRate = recorder.getSampleRate();
                frame.samples = new Buffer[]{audio};
                frame.audioChannels = recorder.getAudioChannels();
            }
            // Video
            frame.imageWidth = width;
            frame.imageHeight = height;
            frame.imageChannels = 1;
            frame.imageDepth = Frame.DEPTH_BYTE;
            frame.image = new Buffer[]{video};

            recorder.record(frame);
        } catch (Exception e) {
            log.error("Could not record frame!", e);
            ChatUtils.sendMessageToPlayer(Text.Serializer.fromJson("{\"text\":\"Could not record frame!\",\"color\":\"red\",\"bold\":true,\"italic\":false,\"underlined\":false,\"strikethrough\":false,\"obfuscated\":false}"));
            e.printStackTrace();
        }
    }
}
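As a side note for anyone hitting the same error: swscaler's "bad src image pointers" typically fires when the image plane handed to FFmpeg does not match what the declared width, height, and channel count imply. A minimal, stdlib-only sketch of a sanity check that could run before recorder.record(frame) (the 1920x1080 BGRA numbers are just an illustrative assumption):

```java
import java.nio.ByteBuffer;

public class FrameSizeCheck {
    // Expected bytes for a tightly packed 8-bit image: width * height * channels.
    static int expectedCapacity(int width, int height, int channels) {
        return width * height * channels;
    }

    public static void main(String[] args) {
        int width = 1920, height = 1080, channels = 4; // e.g. a BGRA capture
        ByteBuffer video = ByteBuffer.allocateDirect(expectedCapacity(width, height, channels));
        // Both conditions should hold before handing the buffer to a recorder:
        System.out.println(video.isDirect());
        System.out.println(video.capacity() == expectedCapacity(width, height, channels));
    }
}
```

If either check fails, the recorder ends up reading past (or short of) the real buffer, which matches the green-frame symptom described above.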

OpenGLUtils.java

/**
 * Get a screenshot of the game
 * @param framebuffer the framebuffer of the game
 * @return the screenshot of the game
 */
public static ByteBuffer screenShot(Framebuffer framebuffer) {
    int height = framebuffer.textureHeight;
    int width = framebuffer.textureWidth;
    ByteBuffer imageBuffer = ByteBuffer.allocateDirect(height * width * 4);
    RenderSystem.bindTexture(framebuffer.getColorAttachment());
    RenderSystem.assertOnRenderThread();
    GL11C.glGetTexImage(GL11C.GL_TEXTURE_2D, 0, GL_BGRA_EXT, GL11C.GL_BYTE, imageBuffer);
    return imageBuffer;
}
saudet commented 1 year ago
               frame.imageChannels = 1;

That probably needs to be 3 or 4?

               frame.image = new Buffer[]{video};

How did you make sure this buffer is actually a direct NIO buffer?

lushangkan commented 1 year ago
               frame.imageChannels = 1;

That probably needs to be 3 or 4?

               frame.image = new Buffer[]{video};

How did you make sure this buffer is actually a direct NIO buffer?

Why should imageChannels be 3 or 4? Does imageChannels mean the number of images in a single stream, or the number of color channels in the image?

The buffer is created by ByteBuffer.allocateDirect and buffer.isDirect() returns true, so it should be a direct NIO buffer.

lushangkan commented 1 year ago

There is no error after using Frame's own method of creating a buffer:

Pointer pointer = new BytePointer(imageHeight * imageStride * pixelSize(depth));
ByteBuffer buffer = pointer.asByteBuffer();
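A plausible explanation for why this works: if I read the JavaCV sources correctly, a Frame allocated through its own constructor sizes its buffer from imageStride, which can be rounded up so that each row is aligned, rather than from the tightly packed width * channels row size. A tightly packed external buffer can then be smaller than what the recorder expects. A stdlib-only sketch of that kind of row-alignment arithmetic (the 8-byte boundary here is an illustrative assumption, not a claim about JavaCV's exact formula):

```java
public class StrideAlignment {
    // Round 'bytes' up to the next multiple of 'align' (align must be a power of two).
    static int alignUp(int bytes, int align) {
        return (bytes + align - 1) & ~(align - 1);
    }

    public static void main(String[] args) {
        // A BGRA row: already a multiple of 8, so alignment changes nothing.
        System.out.println(alignUp(1366 * 4, 8));
        // A 3-channel row of odd width: 1365 * 3 = 4095 bytes, padded to 4096.
        System.out.println(alignUp(1365 * 3, 8));
    }
}
```

When the aligned stride differs from the packed row size, copying into (or allocating through) the Frame's own buffer is the safe path, which matches the workaround above.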

or

for (int i = 0; i < video.capacity(); i++) {
    ((ByteBuffer) frame.image[0]).put(i, video.get(i));
}
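The per-index loop above copies one byte at a time; java.nio also supports a bulk transfer that does the same thing in one call. A small stdlib-only sketch (the 16-byte buffers are just placeholders for the video buffer and frame.image[0]):

```java
import java.nio.ByteBuffer;

public class BulkCopy {
    public static void main(String[] args) {
        ByteBuffer src = ByteBuffer.allocateDirect(16);
        for (int i = 0; i < 16; i++) src.put(i, (byte) i);

        ByteBuffer dst = ByteBuffer.allocateDirect(16);
        // Copy via duplicate() so the positions of src and dst stay untouched,
        // mirroring the absolute-index loop in a single bulk operation.
        dst.duplicate().put(src.duplicate());

        System.out.println(dst.get(5));
        System.out.println(dst.get(15));
    }
}
```

Bulk put avoids per-element bounds checks and keeps the original buffers' positions unchanged, which matters when the same capture buffer is reused across frames.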