microsoft / MixedReality-WebRTC

MixedReality-WebRTC is a collection of components to help mixed reality app developers integrate audio and video real-time communication into their application and improve their collaborative experience.
https://microsoft.github.io/MixedReality-WebRTC/
MIT License
908 stars · 282 forks

How to send pictures or files #249

Closed · suxinren closed 4 years ago

suxinren commented 4 years ago

Does anyone know how to send pictures or files in the MixedReality-WebRTC project? I can use a data channel to send simple messages, but files that I send are never received.

djee-ms commented 4 years ago

Hi @suxinren,

Can you please describe what the actual issue is? What do you mean by "sending files can not be received"? Are you using Unity or C#? Do you see any error in the Unity console or in the Visual Studio Output window? What is the typical size of files you are trying to send, is it a few KB only, or very large 100+ MB files?

suxinren commented 4 years ago

@djee-ms Thank you very much for your attention.

1. I use Unity.
2. No error; the data is sent from the Unity Editor.
3. The file size is 103 KB.

If I send a simple message like "test message" it works: the DataChannel.MessageReceived handler receives the message. But if I send a file, DataChannel.MessageReceived never fires, and there is no error.

The main code is as follows:

// Send
using (FileStream fs = new FileStream(filePath, FileMode.Open))
{
    byte[] bytes = new byte[fs.Length];
    fs.Read(bytes, 0, bytes.Length);
    fs.Flush();
    peerConnection.SendFile(bytes);
}

public void SendFile(byte[] data)
{
    try
    {
        if (dataChannel.State == DataChannel.ChannelState.Open)
        {
            dataChannel.SendMessage(data);
        }
        else
        {
            outputMsg = "DataChannel State:" + dataChannel.State.ToString();
        }
    }
    catch (Exception ex)
    {
        outputMsg = "Send File Exception:" + ex.Message;
    }
}

// Receive
private void Peer_DataChannelAdded(DataChannel channel)
{
    try
    {
        channel.MessageReceived += (byte[] message) =>
        {
            _mainThreadWorkQueue.Enqueue(() =>
            {
                OnMessageReceived?.Invoke(message);
            });
        };
    }
    catch (Exception ex)
    {
        outputMsg = "Peer_DataChannelAdded Exception:" + ex.Message;
    }
}

public delegate void DataChannelMessage(byte[] arrMsg);
public event DataChannelMessage OnMessageReceived;

peerConnection.OnMessageReceived += PeerConnection_OnMessageReceived;

private void PeerConnection_OnMessageReceived(byte[] arrMsg)
{
    try
    {
        //outputMsg = "got a message";
        outputMsg = "File received";
    }
    catch (Exception ex)
    {
        outputMsg = "Exception:" + ex.Message;
    }
}

djee-ms commented 4 years ago

My guess is that you are hitting the internal buffering limit of data channels, and you need to manually split the file into smaller chunks. See the DataChannel.BufferingChanged event. 103 KB is definitely above the internal buffer size.
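For illustration, here is a minimal sketch of that chunking approach. It reuses the DataChannel.SendMessage(byte[]) call already shown in this thread; the 16 KB chunk size and the SendFileInChunks helper name are arbitrary choices, and the receiver still has to reassemble the chunks itself:

// Assumes: using System; using Microsoft.MixedReality.WebRTC;
const int ChunkSize = 16 * 1024; // arbitrary; keep well below the channel's internal buffer

void SendFileInChunks(DataChannel dataChannel, byte[] fileBytes)
{
    for (int offset = 0; offset < fileBytes.Length; offset += ChunkSize)
    {
        // Copy the next slice of the file into its own message.
        int length = Math.Min(ChunkSize, fileBytes.Length - offset);
        byte[] chunk = new byte[length];
        Buffer.BlockCopy(fileBytes, offset, chunk, 0, length);
        dataChannel.SendMessage(chunk);
        // Optionally watch DataChannel.BufferingChanged and pause here when the
        // reported buffering approaches its limit.
    }
}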

suxinren commented 4 years ago

@djee-ms I will try splitting the file into smaller blocks to send. Thank you very much for your help.

astaikos316 commented 4 years ago

As a best practice, I created a method to "packetize" large files and send them over a data channel. It has helped me send files and spatial mesh data from the HoloLens that can reach upwards of 40 MB.

djee-ms commented 4 years ago

@astaikos316 that sounds like a very nice feature! I think adding a utility like that would benefit many users. Would you be interested in contributing your change?

suxinren commented 4 years ago

Hi @djee-ms, does MixedReality-WebRTC support setting the resolution of the transmitted video, e.g. 720p or 540p? If so, where is it set?

djee-ms commented 4 years ago

@suxinren if you mean the resolution of the video capture from the webcam then that's in LocalVideoTrackSettings. If you mean for a custom video track (ExternalVideoTrackSource) then just set the size of the frame in I420AVideoFrame before you call CompleteRequest(). There is no per-track constant resolution; the resolution is set per frame and transmitted with each frame.
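As a rough example of the first case, here is a sketch of overriding the webcam capture resolution. It assumes the 1.x C# API, where LocalVideoTrackSettings exposes width/height/framerate fields and is passed to PeerConnection.AddLocalVideoTrackAsync; the factory call differs in later releases, so check the docs for yours:

// Assumes: using System.Threading.Tasks; using Microsoft.MixedReality.WebRTC;
async Task StartWebcamAsync(PeerConnection peerConnection)
{
    var settings = new LocalVideoTrackSettings
    {
        width = 1280,   // requested capture width, in pixels
        height = 720,   // requested capture height, in pixels
        framerate = 30  // requested capture framerate, in frames per second
    };
    // If the device has no capture format matching these constraints,
    // opening the track can fail (see the HoloLens 1 discussion below).
    await peerConnection.AddLocalVideoTrackAsync(settings);
}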

astaikos316 commented 4 years ago

I don't have it set up in a separate class or anything at the moment; I've been focusing on functionality since I'm doing this as part of a research effort. The routine is fairly simple, as shown in the example below, where numberOfChunks is how many 1 KB chunks of the file exist, maxBufferSize is the desired maximum size of any single message (I used 1024), and the routine accounts for the remaining chunk that is not evenly divisible by maxBufferSize:

for (int i = 0; i < numberOfChunks; i++)
{
    wrapper = maxBufferSize * i;
    if (wrapper + maxBufferSize <= imageBytes.Length)
    {
        for (int j = 0; j < maxBufferSize; j++)
        {
            imageBufferArray[j] = imageBytes[wrapper + j];
        }
    }
    else if (wrapper + maxBufferSize > imageBytes.Length)
    {
        finalBufferLength = imageBytes.Length - wrapper;
        imageBufferArray = new byte[finalBufferLength];
        maxBufferSize = finalBufferLength;
        for (int j = 0; j < maxBufferSize; j++)
        {
            imageBufferArray[j] = imageBytes[wrapper + j];
        }
    }
    peerConnection._fileDataChannel.SendMessage(imageBufferArray);
}
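On the receiving side the chunks have to be put back together. The loop above does not define any framing, so the sketch below assumes one extra convention the sender would have to add: a first message carrying the total file size as a 4-byte little-endian integer, followed by the chunks in order (a reliable, ordered data channel preserves message order):

// Assumes: using System; using System.IO;
private MemoryStream _incoming;
private int _expectedBytes = -1;

private void OnFileMessageReceived(byte[] message)
{
    if (_expectedBytes < 0)
    {
        // First message: the total size announced by the sender (assumed convention).
        _expectedBytes = BitConverter.ToInt32(message, 0);
        _incoming = new MemoryStream(_expectedBytes);
        return;
    }

    _incoming.Write(message, 0, message.Length);
    if (_incoming.Length >= _expectedBytes)
    {
        byte[] fileBytes = _incoming.ToArray();
        _incoming.Dispose();
        _incoming = null;
        _expectedBytes = -1;
        // fileBytes now holds the complete file; hand it to the app here.
    }
}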

suxinren commented 4 years ago

Hi @djee-ms, my goal is to reduce the resolution of the video before it is transmitted, so that the stream stays smooth on a low-bandwidth network. Does setting LocalVideoTrackSettings help with that?

suxinren commented 4 years ago

Hi @djee-ms, I found that in the LocalVideoSource class the video resolution is 896 x 504, which is not a common size. If I want to set it a little smaller, what should I use: 640 x 480 or 480 x 360?

djee-ms commented 4 years ago

The default resolution is 896x504 only if you're on HoloLens 1. In any case you can override that with LocalVideoTrackSettings. I can't tell you what you should use; that depends on your use case, so you'd have to try.
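One way to avoid guessing is to enumerate what the capture device actually supports before picking a resolution. A sketch, assuming the GetVideoCaptureDevicesAsync and GetVideoCaptureFormatsAsync helpers exposed on PeerConnection in the 1.x C# library (the format query lives elsewhere in later versions, so verify against your release):

// Assumes: using System.Threading.Tasks; using Microsoft.MixedReality.WebRTC; using UnityEngine;
async Task ListCaptureFormatsAsync()
{
    var devices = await PeerConnection.GetVideoCaptureDevicesAsync();
    foreach (var device in devices)
    {
        var formats = await PeerConnection.GetVideoCaptureFormatsAsync(device.id);
        foreach (var format in formats)
        {
            // Log every resolution/framerate combination the device reports.
            Debug.Log($"{device.name}: {format.width}x{format.height} @ {format.framerate} FPS");
        }
    }
}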

suxinren commented 4 years ago

@djee-ms Thank you very much for your help.

suxinren commented 4 years ago

Hi @djee-ms, this morning I tried to change the resolution to 640 x 480 or 480 x 360 and the app crashed. I remember that when I studied face recognition with HoloLens 1 photography in 2018, I found the resolutions HoloLens 1 supports are: 896 x 504, 1280 x 720, 1344 x 756, 1408 x 792, and 2048 x 1152. So I think 896 x 504 is the smallest resolution HoloLens 1 supports. This should help other developers with the same goal as me.

suxinren commented 4 years ago

Hi @astaikos316, your code is very helpful. I have managed to send pictures, and the HoloLens side successfully receives and displays them. Thank you very much.

HyperLethalVector commented 4 years ago

Hi there, it's not so clear how to receive events or send data over a data channel using the Unity library; could someone point me in the right direction?

djee-ms commented 4 years ago

Sorry, just saw this @HyperLethalVector. There is nothing specific in the Unity library for data channels; you have to use the C# library directly for that. You can access the PeerConnection C# class from the PeerConnection Unity component via its Peer property, e.g.:

var peer = MyPeerConnectionUnityComponent.Peer;
var dataChannel = await peer.AddDataChannelAsync(...);
dataChannel.MessageReceived += ...;
dataChannel.SendMessage(...);

Gunther2689 commented 3 years ago

> My guess is that you are hitting the internal buffering limit of data channels, and you need to manually split the file into smaller chunks. See the DataChannel.BufferingChanged event. 103 KB is definitely above the internal buffer size.

Can I change the internal buffer size?