zajako opened this issue 7 years ago
Unfortunately, there are no immediate plans to support additional platforms. A player component would require a TCP network stack and a way of decoding the H.264 frames received from the network. The network protocol is simple enough: each network message has a small header that describes the packet type along with a size. For each type, convert the payload to the appropriate structure to get the data.
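For illustration, here is a rough C++ sketch of what the receiving side of such a protocol could look like using plain POSIX sockets. The `PacketHeader` layout (two 32-bit fields: type, then payload size) is an assumption for the example; the real header is whatever the plugin's network protocol actually defines.

```cpp
// Rough sketch: receive length-prefixed messages over TCP with POSIX sockets.
// The header layout below is hypothetical and must match the sender's format.
#include <cstddef>
#include <cstdint>
#include <vector>
#include <sys/socket.h>

// Hypothetical wire header: packet type followed by payload size in bytes.
struct PacketHeader {
    uint32_t type;
    uint32_t size;
};

// Read exactly `len` bytes from the socket, looping over short reads.
static bool ReadExact(int sock, void* dst, size_t len) {
    auto* p = static_cast<uint8_t*>(dst);
    while (len > 0) {
        ssize_t n = recv(sock, p, len, 0);
        if (n <= 0) return false;            // connection closed or error
        p += n;
        len -= static_cast<size_t>(n);
    }
    return true;
}

// Read one message: the fixed-size header first, then the payload it describes.
bool ReadMessage(int sock, PacketHeader& header, std::vector<uint8_t>& payload) {
    if (!ReadExact(sock, &header, sizeof(header))) return false;
    payload.resize(header.size);
    return header.size == 0 || ReadExact(sock, payload.data(), payload.size());
}
```

After a message is read, a switch on `header.type` would dispatch each payload to the appropriate handler; byte order and any versioning are glossed over here.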
The harder part would be decoding each of the H.264 frames once streaming begins. Each frame is a serialized IMFSample, which has both attributes and buffers. Depending on your H.264 decoder of choice, you may need to convert that into a structure specific to your platform's decoder.
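As a hedged sketch of that step, the C++ below assumes a hypothetical serialized layout of timestamp, duration, and then the raw frame bytes (the real layout mirrors whatever IMFSample attributes and buffers the sender writes), and hands the bitstream to an abstract decoder interface that you would back with the platform's decoder, e.g. VideoToolbox on OS X/iOS.

```cpp
// Rough sketch: unpack one received "sample" payload and pass the compressed
// frame to a platform decoder. The serialized layout here is an assumption.
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <vector>

struct VideoSample {
    int64_t timestamp = 0;            // hypothetical: presentation time
    int64_t duration = 0;             // hypothetical: sample duration
    std::vector<uint8_t> bitstream;   // the H.264 frame data itself
};

// Parse a payload of the hypothetical form: [timestamp][duration][frame bytes].
bool ParseVideoSample(const std::vector<uint8_t>& payload, VideoSample& out) {
    const size_t headerBytes = sizeof(out.timestamp) + sizeof(out.duration);
    if (payload.size() < headerBytes) return false;
    std::memcpy(&out.timestamp, payload.data(), sizeof(out.timestamp));
    std::memcpy(&out.duration, payload.data() + sizeof(out.timestamp),
                sizeof(out.duration));
    out.bitstream.assign(payload.begin() + headerBytes, payload.end());
    return true;
}

// Abstract decoder interface: implement with VTDecompressionSession on Apple
// platforms, MediaCodec on Android, libavcodec elsewhere, etc.
class IH264Decoder {
public:
    virtual ~IH264Decoder() = default;
    virtual bool DecodeFrame(const uint8_t* data, size_t size, int64_t pts) = 0;
};

// Called for each payload whose header identified it as a video sample.
void OnSampleMessage(const std::vector<uint8_t>& payload, IH264Decoder& decoder) {
    VideoSample sample;
    if (ParseVideoSample(payload, sample)) {
        decoder.DecodeFrame(sample.bitstream.data(), sample.bitstream.size(),
                            sample.timestamp);
    }
}
```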
Thanks for the reply. I'm pretty sure there are H.264 decoders out there that support OS X and iOS, so that part shouldn't be too terribly hard. But developing libraries like this is outside my current scope, so I'm reliant on finding someone who can build the necessary DLLs to support this for Unity apps on these other platforms.
Hello, I'm trying to set up a Unity app to stream from the HoloLens to multiple platforms. From my understanding, I just need to compile the DLLs for the other platforms, but I have no clue whether that is achievable in any sort of easy way.
To be able to use this, I need at least OS X and iOS.
I really just need some way to play the stream on these other platforms.