This example shows how you might stream video from a user's webcam over WebRTC, adding a WebRTC encoded transform to modify the outgoing streams. Note that this is part of a larger example in the guide topic Using WebRTC Encoded Transforms.
The code assumes that there is an RTCPeerConnection called peerConnection that is already connected to a remote peer. It first uses getUserMedia() to get a video MediaStream from a media device, and then the MediaStream.getTracks() method to get the first MediaStreamTrack in the stream. The track is added to the peer connection using addTrack(), which returns a new RTCRtpSender that will be used to send it.
const mediaStream = await navigator.mediaDevices.getUserMedia({ video: true });
const [track] = mediaStream.getTracks();
const videoSender = peerConnection.addTrack(track, mediaStream);
The code above sets up the connection and starts sending the track. To add a transform stream into the pipeline we need to construct an RTCRtpScriptTransform and assign it to the sender's transform property. Because the transform is constructed immediately after creation of the RTCRtpSender, it will receive the first frame generated by the sender's encoder, before it is sent.
const worker = new Worker("worker.js");
videoSender.transform = new RTCRtpScriptTransform(worker, {
name: "senderTransform",
});
Note that you can add the transform at any time. However, by adding it immediately after calling addTrack(), the transform will receive the first encoded frame that is sent.
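The worker script referenced above (worker.js) is not shown here. A minimal sketch of what it might contain is below: it listens for the rtctransform event, checks the name passed in the options object, and pipes encoded frames from transformer.readable through a TransformStream into transformer.writable. The xorFrameData() helper and the XOR key are hypothetical placeholders standing in for a real per-frame modification (such as encryption), and are not part of the original example.

```javascript
// worker.js — a minimal sketch of a sender-side encoded transform.
// The name "senderTransform" matches the options object passed to
// RTCRtpScriptTransform on the main thread.

// Hypothetical helper: XOR every byte of an encoded frame's payload
// with a single-byte key. A real transform would do something useful
// here, such as encrypting the payload.
function xorFrameData(buffer, key) {
  const bytes = new Uint8Array(buffer);
  for (let i = 0; i < bytes.length; i++) {
    bytes[i] ^= key;
  }
  return bytes.buffer;
}

// Register the handler only where the worker API exists
// (i.e., when this script runs in a browser worker context).
if (typeof self !== "undefined" && "RTCTransformEvent" in self) {
  self.addEventListener("rtctransform", (event) => {
    const transformer = event.transformer;
    if (transformer.options.name !== "senderTransform") return;

    // Pipe each encoded frame through a TransformStream that rewrites
    // its data before re-enqueueing it to be sent.
    transformer.readable
      .pipeThrough(
        new TransformStream({
          transform(encodedFrame, controller) {
            encodedFrame.data = xorFrameData(encodedFrame.data, 0x55);
            controller.enqueue(encodedFrame);
          },
        }),
      )
      .pipeTo(transformer.writable);
  });
}
```

Note that applying the same XOR on the receiving side would restore the original payload, which is why trivial transforms like this are often used to demonstrate the pipeline before swapping in real encryption.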