philnash / twilio-video-react-hooks

A video chat application built with Twilio Video and React Hooks
MIT License

how to add twilio screen sharing in this repo? #21

Open Phyxable opened 4 years ago

Phyxable commented 4 years ago

I need to put that in my production environment.

philnash commented 4 years ago

I don't have an example of adding screen sharing to this application in React, but there is good documentation on screen sharing with Twilio Video and I wrote a post describing how to add it to an application a while ago. I also have an example application you can take a look at. None of those are React specific though.

If you want a more fully featured React example, which includes screen capture, check out this React Twilio Video collaboration app built by our Video team.

jiftuq commented 4 years ago
import React, { useState, useEffect, useRef } from "react";

const Participant = ({ participant }) => {
  const [videoTracks, setVideoTracks] = useState([]);
  const [audioTracks, setAudioTracks] = useState([]);
  const [screenTracks, setScreenTracks] = useState([]);

  const videoRef = useRef();
  const audioRef = useRef();
  const screenRef = useRef();

  const trackpubsToTracks = (trackMap) =>
    Array.from(trackMap.values())
      .map((publication) => publication.track)
      .filter((track) => track !== null);

  useEffect(() => {
    setVideoTracks(trackpubsToTracks(participant.videoTracks));
    setAudioTracks(trackpubsToTracks(participant.audioTracks));
    setScreenTracks(trackpubsToTracks(participant.screenTracks));

    const trackSubscribed = (track) => {
      if (track.kind === "video") {
        setVideoTracks((videoTracks) => [...videoTracks, track]);
      } else if (track.kind === "audio") {
        setAudioTracks((audioTracks) => [...audioTracks, track]);
      } else if (track.kind === "screen") {
        setScreenTracks((screenTracks) => [...screenTracks, track]);
      }
    };

    const trackUnsubscribed = (track) => {
      if (track.kind === "video") {
        setVideoTracks((videoTracks) => videoTracks.filter((v) => v !== track));
      } else if (track.kind === "audio") {
        setAudioTracks((audioTracks) => audioTracks.filter((a) => a !== track));
      } else if (track.kind === "screen") {
        setScreenTracks((screenTracks) => screenTracks.filter((a) => a !== track));
      }
    };

    participant.on("trackSubscribed", trackSubscribed);
    participant.on("trackUnsubscribed", trackUnsubscribed);

    return () => {
      setVideoTracks([]);
      setAudioTracks([]);
      setScreenTracks([]);
      participant.removeAllListeners();
    };
  }, [participant]);

  useEffect(() => {
    const videoTrack = videoTracks[0];
    if (videoTrack) {
      videoTrack.attach(videoRef.current);
      return () => {
        videoTrack.detach();
      };
    }
  }, [videoTracks]);

  useEffect(() => {
    const audioTrack = audioTracks[0];
    if (audioTrack) {
      audioTrack.attach(audioRef.current);
      return () => {
        audioTrack.detach();
      };
    }
  }, [audioTracks]);

  useEffect(() => {
    const screenTrack = screenTracks[0];
    if (screenTrack) {
      screenTrack.attach(screenRef.current);
      return () => {
        screenTrack.detach();
      };
    }
  }, [screenTracks]);

  return (
    <div className="participant">
      <video ref={videoRef} autoPlay={true} />
      <audio ref={audioRef} autoPlay={true} muted={true} />
      <video ref={screenRef} autoPlay={true} muted={true} />
    </div>
  );
};

export default Participant;
philnash commented 4 years ago

I don't believe there is a track.kind of "screen". Screen share tracks are video tracks. I'd recommend naming your tracks when you create them, that way you can separate them when a participant subscribes to them.

jiftuq commented 4 years ago

Which track.kind should I use?

philnash commented 4 years ago

I would recommend that you name your tracks when you create them, like this for camera and microphone:

const tracks = await Video.createLocalTracks({
  video: {
    name: "user-camera",
  },
  audio: {
    name: "user-audio",
  },
});

and this for the screen:

const screenStream = await navigator.mediaDevices.getDisplayMedia();
const track = screenStream.getTracks()[0];
const screenTrack = new LocalVideoTrack(track, {
  name: "user-screen",
});

Then you can look at track.name to distinguish between tracks.
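
A minimal sketch of how those named tracks could be wired up when joining the room, assuming you pass them to Video.connect via its tracks option (connectWithNamedTracks is just an illustrative helper name):

import Video from "twilio-video";

// Create named camera/microphone tracks up front so a screen share track
// published later (named "user-screen") can be told apart by track.name.
const connectWithNamedTracks = async (token, roomName) => {
  const tracks = await Video.createLocalTracks({
    video: { name: "user-camera" },
    audio: { name: "user-audio" },
  });
  return Video.connect(token, { name: roomName, tracks });
};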

philnash commented 3 years ago

Hi @numericks-yann, what's the error and what is the code you are using for this?

jiftuq commented 3 years ago

In the Participant.js file, would importing twilio-video be necessary with your solution?

philnash commented 3 years ago

If you use LocalVideoTrack in Participant.js then you would need to import { LocalVideoTrack } from 'twilio-video';.

jiftuq commented 3 years ago

const Participant is async?

philnash commented 3 years ago

The Participant is a React component in this case. You can see it is used here.

jiftuq commented 3 years ago

I mean, in Participant.js should it be const Participant or async function Participant in order to use const screenStream = await navigator.mediaDevices.getDisplayMedia();?

philnash commented 3 years ago

navigator.mediaDevices.getDisplayMedia() is an asynchronous call that returns a promise. It is asynchronous because it triggers a permissions dialog with the user, asking them which screen, application or tab they want to share, and can only resolve or reject once the user has made a choice or dismissed the dialog. Check out the documentation for getDisplayMedia on MDN for more.

You can use getDisplayMedia as a promise, with then and catch functions, or via async/await. However, you cannot make a React component, such as Participant an async function. You can use async functions within a React component though. Check out this article for how to use async/await within React functional components.
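
For example, a sketch of the promise form inside a component (startScreenCapture is a hypothetical handler name):

// Call getDisplayMedia via a promise chain from inside the component,
// rather than marking the component function itself as async.
const startScreenCapture = () => {
  navigator.mediaDevices
    .getDisplayMedia()
    .then((screenStream) => {
      const track = screenStream.getVideoTracks()[0];
      // wrap the track in a LocalVideoTrack and publish it from here
      console.log("Got screen track:", track.label);
    })
    .catch((error) => {
      // the user dismissed or blocked the screen picker
      console.warn("No screen selected:", error);
    });
};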

jiftuq commented 3 years ago

So this code must be outside of the "const Participant" function?

philnash commented 3 years ago

When you call on getDisplayMedia it actually has to be in response to a user action, that is a click on a button or link. That callback function can be an async function and it can be within the Participant component.

If I were you I would keep trying different things. If you get stuck, share the code you are trying and what has gone wrong/the error you are getting, and I might be able to help.

jiftuq commented 3 years ago

I get this error and I'm stuck:

SyntaxError: C:\Users\christophe\Desktop\Nouveau dossier - Copy\twilio-video-react-hooks\src\Participant.js: 'await' is only allowed within async functions and at the top levels of modules (8:23)

   6 | const [audioTracks, setAudioTracks] = useState([]);
   7 |
   8 | const screenStream = await navigator.mediaDevices.getDisplayMedia();
     |                      ^
   9 | const track = screenStream.getTracks()[0];
  10 | screenTrack = new LocalVideoTrack(track, {
  11 |   name: "user-screen",

philnash commented 3 years ago

Ok, so you have a syntax error, but I suspect there are other issues with this too. What exactly are you trying to do with this code? Can you explain in detail what the user will experience with your feature?

jiftuq commented 3 years ago

I want the video, audio and screen of each participant to be shared. You gave the code snippet for screen sharing, but it has an await in it. Maybe I should put it into a useEffect, since that accepts async/await functions?

jiftuq commented 3 years ago

When I use this: https://github.com/jiftuq/twilio-video-react-hooks/blob/master/src/Participant.js, I get this error: TypeError: screenStream.getTracks is not a function

jiftuq commented 3 years ago

Apart from that it compiled successfully; even the prompt for screen sharing works.

philnash commented 3 years ago

The issue there is that navigator.mediaDevices.getDisplayMedia() returns a promise, so screenStream is a promise and therefore doesn't have a getTracks function.

Further, using getDisplayMedia is a side effect, so should be carried out in a useEffect hook or as a result of a user action.

Another issue is that the Participant component is used for all participants, local and remote. Doing this would trigger the getDisplayMedia permissions for a user every time another participant joined the room.

So, what I would suggest is that in the Room component you add a button that a user can click to start sharing their screen. That button would need a function to handle the click. That function might look like this:

const handleScreenShareClick = async () => {
  const screenStream = await navigator.mediaDevices.getDisplayMedia();
  const track = screenStream.getTracks()[0];
  const screenTrack = new LocalVideoTrack(track, {
    name: "user-screen",
  });
  room.localParticipant.publishTrack(screenTrack);
}

That would handle getting the screen and publishing it to the room. You then need to do some work to handle the screen share tracks in the Participant component when a participant receives a trackSubscribed event. You would also want to handle if the user declines to share their screen. I'll leave that up to you though.
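
A sketch of what handling a declined share could look like, wrapping the same handler in try/catch (assumes LocalVideoTrack is imported from twilio-video and room is in scope, as above):

const handleScreenShareClick = async () => {
  try {
    const screenStream = await navigator.mediaDevices.getDisplayMedia();
    const screenTrack = new LocalVideoTrack(screenStream.getTracks()[0], {
      name: "user-screen",
    });
    // publishTrack returns a promise, so it can be awaited too
    await room.localParticipant.publishTrack(screenTrack);
  } catch (error) {
    // getDisplayMedia rejects if the user dismisses or blocks the prompt
    console.warn("Screen share was not started:", error);
  }
};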

jiftuq commented 3 years ago

Is there a way to directly start sharing the screen given that there is a prompt anyway for the authorization?

philnash commented 3 years ago

With screen sharing, the browser mandates that the user is prompted every time. So you can only start sharing the screen once the user chooses the display, tab or application.

jiftuq commented 3 years ago

I meant skip the button part.

jiftuq commented 3 years ago

Just like the audio/video prompt, the screen share should just prompt when you hit login.

philnash commented 3 years ago

Sure, you could do that, though it would be, in my opinion, an awkward experience. It is your application and your user experience though.

If you wanted to do that, I would suggest you do so within a useEffect within the Room component.
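
A sketch of that approach, with the caveat above that browsers may refuse getDisplayMedia calls that are not triggered by a user gesture (assumes LocalVideoTrack is imported from twilio-video):

// Prompt for a screen as soon as the room is connected.
useEffect(() => {
  if (!room) return;
  let screenTrack;
  navigator.mediaDevices
    .getDisplayMedia()
    .then((screenStream) => {
      screenTrack = new LocalVideoTrack(screenStream.getTracks()[0], {
        name: "user-screen",
      });
      return room.localParticipant.publishTrack(screenTrack);
    })
    .catch(() => {
      // the user declined; carry on with just camera and microphone
    });
  return () => {
    if (screenTrack) {
      room.localParticipant.unpublishTrack(screenTrack);
      screenTrack.stop();
    }
  };
}, [room]);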

jiftuq commented 3 years ago

Is it not possible in the Participant component? That way it's easier to integrate with the video and audio part.

philnash commented 3 years ago

You don't ask for the video and audio tracks in the Participant component either. The screen track will be one of the video tracks.

And once again, the Participant component is used for every participant, local and remote. So if you asked for permission to get the screen share inside the participant component, then every time another participant joined the chat the local participant would be asked for their screen share again. The Participant component is not the right place to ask for the permission, the Room component is.

One thing you should add though, is listening to the trackPublished event inside the Participant component. Currently it listens to trackSubscribed events, which remote participants emit. But when your local participant publishes a new track, as they will with the screen share track, they will emit the trackPublished event and not a trackSubscribed event.
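
A sketch of that, inside the same useEffect in Participant.js (the local participant's trackPublished event hands you a LocalTrackPublication, whose track is available immediately):

const trackPublished = (publication) => {
  // for the local participant the track is already on the publication
  if (publication.track && publication.trackName === "user-screen") {
    setScreenTracks((screenTracks) => [...screenTracks, publication.track]);
  }
};

participant.on("trackPublished", trackPublished);
// the existing removeAllListeners() in the cleanup will detach this too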

jiftuq commented 3 years ago

Thanks a lot, I'm gonna try to implement it.

Yateux commented 3 years ago

Hey!

With this solution, how can I attach a RemoteVideoTrackPublication to the camera ref?

const trackPublished = track => {
  if (track.kind === "video") {
    setScreenTracks(videoTracks => [...videoTracks, track]);
  }
};

useEffect(() => {
  const videoTrackSharingScreen = screenTracks[0];
  if (videoTrackSharingScreen) {
    videoTrackSharingScreen.attach(screenRef.current);
    return () => {
      videoTrackSharingScreen.detach();
    };
  }
}, [screenTracks]);

<video ref={screenRef} autoPlay muted />
<video ref={videoRef} autoPlay />
<audio ref={audioRef} autoPlay />

Because I get:

TypeError: videoTrackSharingScreen.attach is not a function. (In 'videoTrackSharingScreen.attach(screenRef.current)', 'videoTrackSharingScreen.attach' is undefined)

philnash commented 3 years ago

@Chateux In the trackPublished callback, the object you actually receive is a TrackPublication, not a Track, so it won't have the attach method. I would listen for the trackSubscribed event instead (as I do for the other tracks in the Participant component).

Yateux commented 3 years ago

It seems to me that I tested this last year, but there is no user-screen track in trackpubsToTracks, so is the only way to use a published track and add it as I did in my previous post?

What should be done is:

const trackSubscribed = (track) => {
  if (track.kind === "video") {
    if (track.name === "user-screen") {
      setScreenShareTracks((screenShareTrack) => [...screenShareTrack, track]);
    } else {
      setVideoTracks((videoTracks) => [...videoTracks, track]);
    }
  } else if (track.kind === "audio") {
    setAudioTracks((audioTracks) => [...audioTracks, track]);
  } else if (track.kind === "screen") {
    setScreenTracks((screenTracks) => [...screenTracks, track]);
  }
};

and then you add your useEffect with screenShareTracks.

philnash commented 3 years ago

To get a track with the name "user-screen" you need to create the track with that name, for example

const screenStream = await navigator.mediaDevices.getDisplayMedia();
const track = screenStream.getTracks()[0];
const screenTrack = new LocalVideoTrack(track, {
  name: "user-screen",
});
Sampreeth2002 commented 3 years ago

Hey, I am getting an undefined error for the trackMap value. This is the block where I am getting the error:

const trackpubsToTracks = (trackMap) =>
    Array.from(trackMap.values())
      .map((publication) => publication.track)
      .filter((track) => track !== null);

Here is the complete useEffect code:

const [videoTracks, setVideoTracks] = useState([]);
  const [audioTracks, setAudioTracks] = useState([]);
  const [screenTracks, setScreenTracks] = useState([]);

  const handleScreenShareClick = async () => {
    const screenStream = await navigator.mediaDevices.getDisplayMedia();
    const track = screenStream.getTracks()[0];
    var screenTrack = new LocalVideoTrack(track, {
      name: "user-screen",
    });
    participant.publishTrack(screenTrack);
    console.log(screenTrack);
  };

  const videoRef = useRef();
  const audioRef = useRef();
  const screenRef = useRef();

  const trackpubsToTracks = (trackMap) =>
    Array.from(trackMap.values())
      .map((publication) => publication.track)
      .filter((track) => track !== null);

  useEffect(() => {
    handleScreenShareClick().then(() => {
      setVideoTracks(trackpubsToTracks(participant.videoTracks));
      setAudioTracks(trackpubsToTracks(participant.audioTracks));
      setScreenTracks(trackpubsToTracks(participant.screenTracks));
    });

    const trackSubscribed = (track) => {
      if (track.kind === "video") {
        setVideoTracks((videoTracks) => [...videoTracks, track]);
      } else if (track.kind === "audio") {
        setAudioTracks((audioTracks) => [...audioTracks, track]);
      } else if (track.kind === "user-screen") {
        setScreenTracks((screenTracks) => [...screenTracks, track]);
      }
    };

    const trackUnsubscribed = (track) => {
      if (track.kind === "video") {
        setVideoTracks((videoTracks) => videoTracks.filter((v) => v !== track));
      } else if (track.kind === "audio") {
        setAudioTracks((audioTracks) => audioTracks.filter((a) => a !== track));
      } else if (track.kind === "user-screen") {
        setScreenTracks((screenTracks) =>
          screenTracks.filter((s) => s !== track)
        );
      }
    };

    participant.on("trackSubscribed", trackSubscribed);
    participant.on("trackUnsubscribed", trackUnsubscribed);

    return () => {
      setVideoTracks([]);
      setAudioTracks([]);
      setScreenTracks([]);
      participant.removeAllListeners();
    };
  }, [participant]);
philnash commented 3 years ago

@Sampreeth2002 You are getting that error because participant.screenTracks is not a property that exists. A screen share track is just another video track, which is why I have recommended giving the screen share track a name and then filtering the video tracks.

setVideoTracks(trackpubsToTracks(participant.videoTracks).filter(track => track.name !== 'user-screen'));
setScreenTracks(trackpubsToTracks(participant.videoTracks).filter(track => track.name === 'user-screen'));
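
In context, those two lines could slot into the existing useEffect in Participant.js in place of the original setVideoTracks call (a sketch):

useEffect(() => {
  // split the participant's video publications into camera vs screen by name
  setVideoTracks(trackpubsToTracks(participant.videoTracks).filter(track => track.name !== 'user-screen'));
  setScreenTracks(trackpubsToTracks(participant.videoTracks).filter(track => track.name === 'user-screen'));
  setAudioTracks(trackpubsToTracks(participant.audioTracks));

  // ...the trackSubscribed/trackUnsubscribed handlers follow as before
}, [participant]);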
Sampreeth2002 commented 3 years ago

Hey, after making the above changes the user was able to screen share. I created a button for screen share; when clicked it calls the handleScreenShare function, which calls the ScreenShare function. The user gets an option to select which screen to share, but the screen cannot be seen by other users.

Here is the Participant.js code and the Room.js code.

Participant.js Code

import React, { useState, useEffect, useRef } from "react";
import { LocalVideoTrack } from "twilio-video";
import Controls from "./Controls/index";

const Participant = ({
  participant,
  handleCallDisconnect,
  handleAudioToggle,
  handleVideoToggle,
  toggleAudio,
  toggleVideo,
  isLocal,
}) => {
  const [videoTracks, setVideoTracks] = useState([]);
  const [audioTracks, setAudioTracks] = useState([]);
  const [screenTracks, setScreenTracks] = useState([]);

  const ScreenShare = async () => {
    const screenStream = await navigator.mediaDevices.getDisplayMedia();
    const track = screenStream.getTracks()[0];
    var screenTrack = new LocalVideoTrack(track, {
      name: "user-screen",
    });
    participant.publishTrack(screenTrack);
  };

  const handleScreenShare = () => {
    ScreenShare().then(() => {
      setScreenTracks(
        trackpubsToTracks(participant.videoTracks).filter(
          (track) => track.name === "user-screen"
        )
      );
    });
  };

  const videoRef = useRef();
  const audioRef = useRef();
  const screenRef = useRef();

  const trackpubsToTracks = (trackMap) =>
    Array.from(trackMap.values())
      .map((publication) => publication.track)
      .filter((track) => track !== null);

  useEffect(() => {
    setVideoTracks(
      trackpubsToTracks(participant.videoTracks).filter(
        (track) => track.name !== "user-screen"
      )
    );
    setAudioTracks(trackpubsToTracks(participant.audioTracks));

    const trackSubscribed = (track) => {
      if (track.kind === "video") {
        setVideoTracks((videoTracks) => [...videoTracks, track]);
      } else if (track.kind === "audio") {
        setAudioTracks((audioTracks) => [...audioTracks, track]);
      } else if (track.kind === "user-screen") {
        setScreenTracks((screenTracks) => [...screenTracks, track]);
      }
    };

    const trackUnsubscribed = (track) => {
      if (track.kind === "video") {
        setVideoTracks((videoTracks) => videoTracks.filter((v) => v !== track));
      } else if (track.kind === "audio") {
        setAudioTracks((audioTracks) => audioTracks.filter((a) => a !== track));
      } else if (track.kind === "user-screen") {
        setScreenTracks((screenTracks) =>
          screenTracks.filter((s) => s !== track)
        );
      }
    };

    participant.on("trackSubscribed", trackSubscribed);
    participant.on("trackUnsubscribed", trackUnsubscribed);

    return () => {
      setVideoTracks([]);
      setAudioTracks([]);
      setScreenTracks([]);
      participant.removeAllListeners();
    };
  }, [participant]);

  useEffect(() => {
    const videoTrack = videoTracks[0];
    if (videoTrack) {
      videoTrack.attach(videoRef.current);
      return () => {
        videoTrack.detach();
      };
    }
  }, [videoTracks]);

  useEffect(() => {
    const audioTrack = audioTracks[0];
    if (audioTrack) {
      audioTrack.attach(audioRef.current);
      return () => {
        audioTrack.detach();
      };
    }
  }, [audioTracks]);

  useEffect(() => {
    const screenTrack = screenTracks[0];
    if (screenTrack) {
      screenTrack.attach(screenRef.current);
      return () => {
        screenTrack.detach();
      };
    }
  }, [screenTracks]);

  return (
    <div className="participant" style={{ position: "relative" }}>
      <h3>{participant.identity}</h3>
      {/* {videoOn ? <video ref={videoRef} autoPlay={true} /> : "VideoStoped"} */}
      <video ref={videoRef} autoPlay={true} />
      <audio ref={audioRef} autoPlay={true} />
      <video ref={screenRef} autoPlay={true} />
      {/* <button
        onClick={() => {
          setVideoOn(!videoOn);
        }}
      >
        Video Stop
      </button> */}
      <button onClick={() => handleScreenShare()}>ScreenShare</button>
      {isLocal && (
        <Controls
          handleCallDisconnect={handleCallDisconnect}
          handleAudioToggle={handleAudioToggle}
          handleVideoToggle={handleVideoToggle}
          audio={toggleAudio}
          video={toggleVideo}
        />
      )}
    </div>
  );
};

export default Participant;

Room.js Code

import React, { useState, useEffect } from "react";
import Video from "twilio-video";
import Participant from "./Participant";

const Room = ({ roomName, token, handleLogout }) => {
  const [room, setRoom] = useState(null);
  const [participants, setParticipants] = useState([]);
  const [toggleAudio, setToggleAudio] = useState(true);
  const [toggleVideo, setToggleVideo] = useState(true);

  useEffect(() => {
    const participantConnected = (participant) => {
      setParticipants((prevParticipants) => [...prevParticipants, participant]);
    };

    const participantDisconnected = (participant) => {
      setParticipants((prevParticipants) =>
        prevParticipants.filter((p) => p !== participant)
      );
    };

    Video.connect(token, {
      name: roomName,
    }).then((room) => {
      setRoom(room);
      room.on("participantConnected", participantConnected);
      room.on("participantDisconnected", participantDisconnected);
      room.participants.forEach(participantConnected);
    });

    return () => {
      setRoom((currentRoom) => {
        if (currentRoom && currentRoom.localParticipant.state === "connected") {
          currentRoom.localParticipant.tracks.forEach(function (
            trackPublication
          ) {
            trackPublication.track.stop();
          });
          currentRoom.disconnect();
          return null;
        } else {
          return currentRoom;
        }
      });
    };
  }, [roomName, token]);

  const handleCallDisconnect = () => {
    room.disconnect();
  };

  const handleAudioToggle = () => {
    room.localParticipant.audioTracks.forEach((track) => {
      if (track.track.isEnabled) {
        track.track.disable();
      } else {
        track.track.enable();
      }
      setToggleAudio(track.track.isEnabled);
    });
  };

  const handleVideoToggle = () => {
    room.localParticipant.videoTracks.forEach((track) => {
      if (track.track.isEnabled) {
        track.track.disable();
      } else {
        track.track.enable();
      }
      setToggleVideo(track.track.isEnabled);
    });
  };

  const remoteParticipants = participants.map((participant) => (
    <Participant
      key={participant.sid}
      participant={participant}
      isLocal={false}
    />
  ));

  // const copyToClipboard = (e) => {
  //   this.textArea.select();
  //   document.execCommand({ roomName });
  //   e.target.focus();
  //   // this.setState({ copySuccess: "Copied!" });
  // };

  return (
    <div className="room">
      <h2>Room: {roomName}</h2>
      {/* <button onClick={() => copyToClipboard}>Copy Link</button> */}
      <button onClick={handleLogout}>Log out</button>
      {/* <button ></button> */}
      <div className="local-participant">
        {room ? (
          <Participant
            key={room.localParticipant.sid}
            participant={room.localParticipant}
            handleAudioToggle={handleAudioToggle}
            handleVideoToggle={handleVideoToggle}
            handleCallDisconnect={handleCallDisconnect}
            toggleAudio={toggleAudio}
            toggleVideo={toggleVideo}
            isLocal={true}
          />
        ) : (
          ""
        )}
        {/* <button onClick={handleScreenShare}>Screen Share</button> */}
      </div>
      <h3>Remote Participants</h3>
      <div className="remote-participants">{remoteParticipants}</div>
    </div>
  );
};

export default Room;
philnash commented 3 years ago

@Sampreeth2002 in your trackSubscribed and trackUnsubscribed methods:

    const trackSubscribed = (track) => {
      if (track.kind === "video") {
        setVideoTracks((videoTracks) => [...videoTracks, track]);
      } else if (track.kind === "audio") {
        setAudioTracks((audioTracks) => [...audioTracks, track]);
      } else if (track.kind === "user-screen") {
        setScreenTracks((screenTracks) => [...screenTracks, track]);
      }
    };

    const trackUnsubscribed = (track) => {
      if (track.kind === "video") {
        setVideoTracks((videoTracks) => videoTracks.filter((v) => v !== track));
      } else if (track.kind === "audio") {
        setAudioTracks((audioTracks) => audioTracks.filter((a) => a !== track));
      } else if (track.kind === "user-screen") {
        setScreenTracks((screenTracks) =>
          screenTracks.filter((s) => s !== track)
        );
      }
    };

You are trying to filter screen tracks by the kind of user-screen. There are only three track kinds: video, audio, and data. A screen share track is a video track. So in your conditionals, the screen share track is being added/removed from the videoTracks state.

You should filter within the video tracks instead. Something like this:

if (track.kind === "video") {
  if (track.name === "user-screen") {
    setScreenTracks((screenTracks) => [...screenTracks, track]);
  } else {
    setVideoTracks((videoTracks) => [...videoTracks, track]);
  }
}

You may also want to consider whether treating screenTracks as an array is what you want, since typically a participant can only share one screen share at a time.
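
If you go that route, a sketch of keeping a single screen track in state instead of an array could look like this:

const [screenTrack, setScreenTrack] = useState(null);

const trackSubscribed = (track) => {
  if (track.kind === "video") {
    if (track.name === "user-screen") {
      setScreenTrack(track);
    } else {
      setVideoTracks((videoTracks) => [...videoTracks, track]);
    }
  } else if (track.kind === "audio") {
    setAudioTracks((audioTracks) => [...audioTracks, track]);
  }
};

const trackUnsubscribed = (track) => {
  if (track.kind === "video") {
    if (track.name === "user-screen") {
      setScreenTrack(null);
    } else {
      setVideoTracks((videoTracks) => videoTracks.filter((v) => v !== track));
    }
  } else if (track.kind === "audio") {
    setAudioTracks((audioTracks) => audioTracks.filter((a) => a !== track));
  }
};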

brayanalexgiraldos commented 2 years ago

Is there an example of this function in any repository?

philnash commented 2 years ago

@brayanalexgiraldos Sure, there's an example in this vanilla JS app: https://github.com/philnash/video-collaboration/tree/screen-share or in the Twilio Video team's reference application, which is written in React: https://github.com/twilio/twilio-video-app-react.