Hi,
I am working on an app that should allow sending various video and audio sources to an RTMP server. Additionally, a preview via OpenGlView should be possible alongside the stream to the RTMP server.
GenericStream or RtmpStream seem well suited for this, since I can flexibly switch sources using genericStream.changeVideoSource.
However, I now have a source that provides the video and audio data in H264 format as a byte array. Is this possible with RootEncoder, and is there an example for this?
Hello,
If you already have the data encoded in H264, you can try using a decoder to render to the surface provided by the start method. It is similar to VideoFileSource, but you replace MediaExtractor with your H264 buffers: https://github.com/pedroSG94/RootEncoder/blob/master/encoder/src/main/java/com/pedro/encoder/input/sources/video/VideoFileSource.kt
You need to create a new class similar to VideoDecoder: https://github.com/pedroSG94/RootEncoder/blob/master/encoder/src/main/java/com/pedro/encoder/input/decoder/VideoDecoder.java
I recommend you copy that class and replace the MediaExtractor with a LinkedBlockingQueue that you fill with video buffers as you receive them, and read the data from the queue in the decoder thread instead of reading it from the MediaExtractor.
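A minimal sketch of that queue idea, assuming a hypothetical receive callback (onVideoFrameReceived) and a 1-second poll timeout:

```kotlin
import java.util.concurrent.LinkedBlockingQueue
import java.util.concurrent.TimeUnit

// The queue replaces MediaExtractor as the data source for the decoder thread.
val frameQueue = LinkedBlockingQueue<ByteArray>(200)

// Called from your receive path (hypothetical callback name).
fun onVideoFrameReceived(frame: ByteArray) {
    frameQueue.offer(frame) // drops the frame if the queue is full
}

// Called inside the decoder thread in place of extractor.readSampleData(...).
fun nextFrameOrNull(): ByteArray? = frameQueue.poll(1, TimeUnit.SECONDS)
```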
I have adjusted the code as you suggested. Here is a simplified version of it: https://gist.github.com/ErAzOr2k/1bce697bc84d190b3bcdba2d96151afa
I have been facing a problem for days (or weeks?) that I can't solve. I realize it might be out of scope, but maybe you can still give me the crucial hint.
In the decode function, outIndex is always -1, no matter what I do, even though the payloads were added correctly to the inputBuffer beforehand.
For reference, here are the unprocessed incoming byte arrays (in shortened form) as I receive them in the processReceivedData function:
Do you have any idea what it could be or if something is missing?
Hello,
I'm not able to see the code.
Oh sorry, something went wrong with the link. Can you try again, please?
To rule out errors, for now, try to create your codec manually instead of building it from the received buffers, and send all data to the queue. Also, leave the buffers unmodified. You should send the sequence header buffers (SPS/PPS) to the queue too.
If everything works with this, you can then check the sequence header buffers and extract the info needed to generate the codec.
I can try to do a code example using buffers from a video.
Try this way:
```kotlin
import android.graphics.SurfaceTexture
import android.media.MediaCodec
import android.media.MediaFormat
import android.os.Build
import android.view.Surface
import androidx.annotation.RequiresApi
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.Job
import kotlinx.coroutines.launch
import kotlinx.coroutines.runInterruptible
import java.util.concurrent.LinkedBlockingQueue
import java.util.concurrent.TimeUnit
import java.util.concurrent.atomic.AtomicBoolean
// RootEncoder imports (exact packages may differ slightly between library versions)
import com.pedro.common.VideoCodec
import com.pedro.encoder.input.sources.video.VideoSource
import com.pedro.encoder.utils.CodecUtil

@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
class BufferDecoder {

    private val running = AtomicBoolean(false)
    private val queue = LinkedBlockingQueue<ByteArray>(200)
    private var videoCodec: VideoCodec = VideoCodec.H264
    private var width: Int = 640
    private var height: Int = 480
    private var fps: Int = 30
    private var rotation: Int = 0
    private val scope = CoroutineScope(Dispatchers.IO)
    private var job: Job? = null
    private var codec: MediaCodec? = null
    private val bufferInfo = MediaCodec.BufferInfo()

    fun prepare(codec: VideoCodec, width: Int, height: Int, fps: Int, rotation: Int) {
        this.videoCodec = codec
        this.width = width
        this.height = height
        this.fps = fps
        this.rotation = rotation
    }

    fun start(surfaceTexture: SurfaceTexture) {
        val type = when (videoCodec) {
            VideoCodec.H264 -> CodecUtil.H264_MIME
            VideoCodec.H265 -> CodecUtil.H265_MIME
            VideoCodec.AV1 -> CodecUtil.AV1_MIME
        }
        val codec = MediaCodec.createDecoderByType(type)
        val format = MediaFormat()
        format.setString(MediaFormat.KEY_MIME, type)
        if (rotation == 0 || rotation == 180) {
            format.setInteger(MediaFormat.KEY_WIDTH, width)
            format.setInteger(MediaFormat.KEY_HEIGHT, height)
        } else {
            format.setInteger(MediaFormat.KEY_WIDTH, height)
            format.setInteger(MediaFormat.KEY_HEIGHT, width)
        }
        format.setInteger(MediaFormat.KEY_FRAME_RATE, fps)
        codec.configure(format, Surface(surfaceTexture), null, 0)
        codec.start()
        this.codec = codec // keep the reference so decode() and stop() can use it
        running.set(true)
        job = scope.launch { decode() }
    }

    fun stop() {
        running.set(false)
        job?.cancel()
        job = null
        codec?.stop()
        codec?.release()
        codec = null
    }

    fun sendBuffer(bytes: ByteArray) {
        // offer() drops the frame if the queue is full instead of blocking the caller
        queue.offer(bytes)
    }

    fun isRunning() = running.get()

    private suspend fun decode() {
        val startTs = System.nanoTime() / 1000
        while (running.get()) {
            // Wait up to 1 second for the next frame; loop again if nothing arrived
            val frame = runInterruptible { queue.poll(1, TimeUnit.SECONDS) } ?: continue
            val codec = codec ?: continue
            val inIndex = codec.dequeueInputBuffer(10000)
            if (inIndex >= 0) {
                codec.getInputBuffer(inIndex)?.put(frame)
                val ts = System.nanoTime() / 1000 - startTs
                codec.queueInputBuffer(inIndex, 0, frame.size, ts, 0)
            }
            val outIndex = codec.dequeueOutputBuffer(bufferInfo, 10000)
            if (outIndex >= 0) {
                // Render directly to the provided surface
                codec.releaseOutputBuffer(outIndex, bufferInfo.size != 0)
            }
        }
    }
}

@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
class EncoderBufferVideoSource : VideoSource() {

    private val bufferDecoder = BufferDecoder()

    override fun create(width: Int, height: Int, fps: Int, rotation: Int): Boolean {
        bufferDecoder.prepare(VideoCodec.H264, width, height, fps, rotation)
        return true
    }

    override fun start(surfaceTexture: SurfaceTexture) {
        bufferDecoder.start(surfaceTexture)
    }

    override fun stop() {
        bufferDecoder.stop()
    }

    override fun release() { }

    override fun isRunning(): Boolean = bufferDecoder.isRunning()
}
```
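As a usage note on the sketch above (this part is an assumption, not something the example already exposes): the decoder queue has to be reachable from wherever you receive the H264 buffers, for example with a small pass-through on EncoderBufferVideoSource, and the source is then attached with changeVideoSource as mentioned at the start of this thread.

```kotlin
// Hypothetical addition inside EncoderBufferVideoSource (not in the example above):
fun sendBuffer(bytes: ByteArray) = bufferDecoder.sendBuffer(bytes)

// App-side wiring (genericStream is your prepared GenericStream/RtmpStream):
//   val source = EncoderBufferVideoSource()
//   genericStream.changeVideoSource(source)
// Then forward each received H264 frame from your receive callback:
//   source.sendBuffer(frame)
```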
First of all, thank you very much! It is very strange. The problem persists :( But I will continue to experiment.
Maybe the input buffer is not valid. Are you using H264 with AVCC or Annex B headers? (Android normally uses Annex B.) Did you try using the other one?
The input buffer comes from an RTMP client and is received by my own RTMP server implementation, so it should be Annex B. I have no way of changing that, at least I wouldn't know how.
Hello,
If you are using RTMP, then the data could be packetized in FLV. If not, then it is probably using AVCC headers (Annex B starts with 0, 0, 0, 1): https://github.com/pedroSG94/RootEncoder/blob/master/rtmp/src/main/java/com/pedro/rtmp/flv/video/packet/H264Packet.kt#L126
As a reference, on iOS the encoder returns H264 with AVCC headers and I convert it to Annex B to keep the RTMP client code equal to the Android version: https://github.com/pedroSG94/RootEncoder-iOS/blob/master/RootEncoder/Sources/RootEncoder/encoder/video/VideoEncoder.swift#L189 https://github.com/pedroSG94/RootEncoder-iOS/blob/master/RootEncoder/Sources/RootEncoder/encoder/video/VideoEncoder.swift#L257
In short: remove the first 4 bytes and convert them to a uint32 to get the length of the frame. Take that number of bytes from the buffer and prepend the 4-byte Annex B header (0, 0, 0, 1). Repeat until the buffer is empty. Remember that you need to send these modified buffers one by one to the Android decoder.
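A minimal Kotlin sketch of that conversion, assuming each incoming byte array is a plain AVCC payload (a 4-byte big-endian length before every NAL unit); sendToDecoder is a hypothetical callback, e.g. BufferDecoder.sendBuffer:

```kotlin
// Convert an AVCC buffer (length-prefixed NAL units) to Annex B NAL units
// (0x00 0x00 0x00 0x01 start code) and hand them to the decoder one by one.
fun avccToAnnexB(buffer: ByteArray, sendToDecoder: (ByteArray) -> Unit) {
    val startCode = byteArrayOf(0, 0, 0, 1)
    var offset = 0
    while (offset + 4 <= buffer.size) {
        // Read the 4-byte length prefix as an unsigned 32-bit big-endian value.
        val nalLength = ((buffer[offset].toInt() and 0xFF) shl 24) or
            ((buffer[offset + 1].toInt() and 0xFF) shl 16) or
            ((buffer[offset + 2].toInt() and 0xFF) shl 8) or
            (buffer[offset + 3].toInt() and 0xFF)
        offset += 4
        if (nalLength <= 0 || offset + nalLength > buffer.size) break // malformed data
        // Replace the length prefix with the Annex B start code.
        sendToDecoder(startCode + buffer.copyOfRange(offset, offset + nalLength))
        offset += nalLength
    }
}

// Usage: avccToAnnexB(receivedPayload) { nal -> bufferDecoder.sendBuffer(nal) }
```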
With a code example to reproduce it, I would be able to help you more. For now, this is all I can do with this info.
That’s it! Thank you so much! After I don't know how many days, I finally have an output on the surface :) A thousand thanks!!!