tungsk opened this issue 8 years ago
Hello Joride, I am using your code with my Pi and it is great fun. We can send messages from the iOS device to the Pi, and the Pi receives them. However, is it possible for the Pi to send a message to the iOS device? As far as I can tell, the streaming part of the Xcode project only collects the H.264 video data. Would it be possible to push some bytes of data while it is streaming the video? Thank you :+1:
Ken
Hi Ken.
Sure, it is possible for the Pi to send a message to the device. As you can see in the code, the iOS device sends a certain sequence of bits to the Pi, and the Pi responds by starting a stream; from that point on, the stream is dedicated to sending .h264 data.
Setting up another socket dedicated to non-stream-related data would be the most straightforward way. As a matter of fact, this is still on my to-do list, but I haven't gotten around to it. Feel free to implement it yourself and create a pull request 😃.
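For anyone picking this up: below is a minimal sketch of what such a second socket could look like on the iOS side, using the same Foundation stream API the video connection uses. The class name, host, and port here are assumptions for illustration; the repository only defines the video socket.

```swift
import Foundation

// Hypothetical control channel alongside the video stream. Host and
// port are illustrative, not part of the repository.
final class ControlChannel: NSObject, StreamDelegate {
    private var inputStream: InputStream?
    private var outputStream: OutputStream?

    func connect(toHost host: String, port: Int) {
        var input: InputStream?
        var output: OutputStream?
        Stream.getStreamsToHost(withName: host, port: port,
                                inputStream: &input, outputStream: &output)
        inputStream = input
        outputStream = output
        for stream in [inputStream, outputStream].compactMap({ $0 }) {
            stream.delegate = self
            stream.schedule(in: .current, forMode: .default)
            stream.open()
        }
    }

    // Pi -> iOS: the Pi can write on this socket at any time without
    // interfering with the dedicated .h264 stream.
    func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
        guard eventCode.contains(.hasBytesAvailable),
              let input = aStream as? InputStream else { return }
        var buffer = [UInt8](repeating: 0, count: 1024)
        let count = input.read(&buffer, maxLength: buffer.count)
        if count > 0 {
            print("Received \(count) bytes from the Pi")
        }
    }

    // iOS -> Pi: send arbitrary bytes, e.g. commands.
    func send(_ data: Data) {
        data.withUnsafeBytes { raw in
            guard let base = raw.bindMemory(to: UInt8.self).baseAddress else { return }
            _ = outputStream?.write(base, maxLength: data.count)
        }
    }
}
```

On the Pi side this just means listening on a second port and writing to whichever client connects; the video socket stays untouched.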
Hi Joride,
I would also like to know where I can find a book or some reference material on the H.264 video streaming protocol you used to build the video stream, as I want to understand how "processBytesFromStream" works.
Hi Ken,
For understanding how processBytesFromStream works, you could watch WWDC 2014 session 513, "Direct Access to Video Encoding and Decoding": https://developer.apple.com/videos/play/wwdc2014-513/.
That session is not very detailed, so here are some links to useful sites:
http://gentlelogic.blogspot.nl/2011/11/exploring-h264-part-2-h264-bitstream.html
https://cardinalpeak.com/blog/the-h-264-sequence-parameter-set/
The official specification can be found here: http://www.itu.int/ITU-T/recommendations/rec.aspx?id=12063&lang=en
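To make the links above concrete: an H.264 elementary stream in Annex B format is a sequence of NAL units, each preceded by a 0x000001 or 0x00000001 start code, and processBytesFromStream essentially has to find those boundaries. This is not the repository's actual implementation, just a sketch of that scanning:

```swift
import Foundation

// Sketch only: split an Annex B byte stream into NAL unit payloads by
// locating the 0x000001 / 0x00000001 start codes described in the
// articles linked above.
func nalUnits(in data: Data) -> [Data] {
    let bytes = [UInt8](data)
    var marks: [(payload: Int, codeStart: Int)] = []
    var i = 0
    while i + 2 < bytes.count {
        if bytes[i] == 0, bytes[i + 1] == 0, bytes[i + 2] == 1 {
            // A 4-byte start code is a 3-byte one preceded by a 0x00.
            let codeStart = (i > 0 && bytes[i - 1] == 0) ? i - 1 : i
            marks.append((payload: i + 3, codeStart: codeStart))
            i += 3
        } else {
            i += 1
        }
    }
    // Each payload runs up to the start of the next start code.
    return marks.enumerated().map { n, mark in
        let end = n + 1 < marks.count ? marks[n + 1].codeStart : bytes.count
        return Data(bytes[mark.payload..<end])
    }
}
```

The low five bits of the first payload byte are nal_unit_type (7 = SPS, 8 = PPS, 5 = IDR slice), which is how decoder setup picks the parameter sets out of the stream.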
Is there any way to convert the video to a UIImage? I have tried converting the CMSampleBufferRef to a UIImage, but it always gives nil when I convert the CMSampleBufferRef to a CVImageBufferRef, and I don't know why.
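The usual cause of that nil: CMSampleBufferGetImageBuffer only returns a CVImageBuffer when the sample buffer contains decoded pixels; for a buffer that still holds compressed H.264 data it returns nil, so the frame has to go through a VTDecompressionSession first. A sketch of the conversion for a decoded buffer:

```swift
import UIKit
import CoreImage
import CoreMedia

// Sketch: CMSampleBuffer -> UIImage for a *decoded* sample buffer.
// For still-compressed H.264 buffers, CMSampleBufferGetImageBuffer
// returns nil, which matches the behavior described above.
func image(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return nil  // still compressed: decode it first
    }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```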