Supplement: reference manual: https://developer.mozilla.org/en-US/
I: The RTP/RTCP module in WebRTC
(1) Introduction to RTP/RTCP
The RTP/RTCP protocols are the cornerstone of streaming-media communication. RTP defines the packet format in which media data is transported over the Internet, while RTCP provides quality-of-service functions such as delivery feedback, flow control, and congestion control.
In the WebRTC project, the RTP/RTCP module is part of the transport layer. On the sending side it packetizes the captured media data and hands the packets to the network module for sending;
on the receiving side it depacketizes the packets received from the network module and delivers the payload to the decoding module.
The RTP/RTCP module therefore plays a very important role in WebRTC communication.
(2) RTP Media (the part of WebRTC that actually handles media transmission)
There are two important classes in RTP Media: Receiver and Sender.

Each media track corresponds to one Receiver object (when receiving) and one Sender object (when sending).
(3) RTCRtpReceiver and RTCRtpSender have the same properties (3 of them)
Taking RTCRtpReceiver as an example:

RTCRtpReceiver.track: returns the MediaStreamTrack associated with the current RTCRtpReceiver instance.
Through the track property you can read the kind of the current track, i.e. whether it is audio or video.
RTCRtpReceiver.transport: returns the RTCDtlsTransport instance over which the media for the receiver's track is received.
It holds the properties related to transporting the media data: the media stream is carried over this underlying transport. A transport can be multiplexed, so several media tracks may share one transport.
RTCRtpReceiver.rtcpTransport: returns the RTCDtlsTransport instance on which RTCP is sent and received.
This is where the RTCP-related information lives, such as jitter, packet loss, and delay. The receiver collects these statistics and reports them back to the sender; the sender uses them to estimate network quality and adjust how much traffic it sends. This is flow control. A short sketch of reading these properties follows.
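A minimal sketch of reading these three properties, assuming an already established RTCPeerConnection named pc:

// Minimal sketch, assuming `pc` is an established RTCPeerConnection.
pc.getReceivers().forEach((receiver) => {
    // The MediaStreamTrack this receiver feeds; kind is "audio" or "video".
    console.log("track kind:", receiver.track.kind);

    // The underlying DTLS transport; several tracks may be multiplexed over it.
    if (receiver.transport) {
        console.log("transport state:", receiver.transport.state);
    }

    // Non-null only when RTCP is not multiplexed with RTP; many browsers simply return null here.
    console.log("rtcpTransport:", receiver.rtcpTransport);
});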
(4) Methods of an RTCRtpReceiver instance (5 of them)

getParameters: returns an RTCRtpParameters object containing information about how the RTP data is decoded.
More precisely, the method returns an RTCRtpReceiveParameters object describing the encoding and transport configuration of the media on the receiving track.
For methods 2 and 3 below, first look at the RTP header format:

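The fixed RTP header (RFC 3550) carries exactly one SSRC and up to 15 CSRC identifiers, which is what the two methods below report on. The following helper is purely illustrative (it is not part of the WebRTC API) and shows where those fields sit in the header:

// Illustrative only: parse the fixed RTP header described in RFC 3550.
function parseRtpHeader(buffer) {
    const view = new DataView(buffer);
    const b0 = view.getUint8(0);
    const b1 = view.getUint8(1);
    const csrcCount = b0 & 0x0f;                 // CC: number of CSRC entries (0..15)
    const header = {
        version: b0 >> 6,                        // always 2
        padding: !!(b0 & 0x20),
        extension: !!(b0 & 0x10),
        marker: !!(b1 & 0x80),
        payloadType: b1 & 0x7f,
        sequenceNumber: view.getUint16(2),
        timestamp: view.getUint32(4),
        ssrc: view.getUint32(8),                 // synchronization source
        csrcs: []                                // contributing sources (added by mixers)
    };
    for (let i = 0; i < csrcCount; i++) {
        header.csrcs.push(view.getUint32(12 + 4 * i));
    }
    return header;
}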
getSynchronizationSources: returns an array of RTCRtpSynchronizationSource instances, one for each SSRC (synchronization source) identifier received by this RTCRtpReceiver during the last ten seconds.
Each instance describes one synchronization source that provided data to the incoming stream in the past ten seconds. It inherits the properties of RTCRtpContributingSource, including timestamp, source, and audioLevel, and adds a voiceActivityFlag property indicating whether the last RTP packet received from the source contained voice activity.
getContributingSources: returns an array of RTCRtpContributingSource instances, one for each CSRC (contributing source) identifier received by this RTCRtpReceiver during the last ten seconds.
Each instance describes one contributing source that provided data to the incoming stream in the past ten seconds.
getStats: asynchronously requests an RTCStatsReport object with statistics about the incoming traffic on the owning RTCPeerConnection, and returns a Promise that resolves once the report is available.
getCapabilities: a static method that returns an RTCRtpCapabilities object describing the codecs and features the device supports for receiving a given media kind. Similarly, the sender capabilities can be obtained by calling the static function RTCRtpSender.getCapabilities(). The sketch below exercises these methods.
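A minimal sketch exercising the five methods on a receiver taken from pc.getReceivers(); the pc variable and the "video" kind are assumptions:

// Minimal sketch, assuming `receiver` comes from pc.getReceivers().
async function inspectReceiver(receiver) {
    // Static: codecs and header extensions the browser can receive for a given kind.
    console.log(RTCRtpReceiver.getCapabilities("video"));

    // Decode/transport configuration of this receiver.
    console.log(receiver.getParameters());

    // SSRCs and CSRCs seen during roughly the last ten seconds.
    console.log(receiver.getSynchronizationSources());
    console.log(receiver.getContributingSources());

    // Inbound statistics on the owning RTCPeerConnection.
    const report = await receiver.getStats();
    report.forEach((stat) => {
        if (stat.type === "inbound-rtp") {
            console.log("packetsReceived:", stat.packetsReceived, "jitter:", stat.jitter);
        }
    });
}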
(5) Methods of an RTCRtpSender instance (5 of them)

getParameters (as on the receiver): returns an RTCRtpParameters object containing information about how the RTP data is encoded.
More precisely, the method returns an RTCRtpSendParameters object describing the encoding and transport configuration of the media on the sending track.
setParameters: used to change the configuration of the sender's track, i.e. the MediaStreamTrack the RTCRtpSender is responsible for. For example, the maximum bitrate and frame rate can be changed.
In other words, setParameters() updates the RTP transport configuration and the encoding configuration of a particular outgoing media track on the WebRTC connection.
getStats (as above): asynchronously requests an RTCStatsReport object with statistics about the outgoing traffic on the RTCPeerConnection that owns the sender, and returns a Promise that resolves once the report is available.
replaceTrack: replaces the media track currently used as the sender's source with a new MediaStreamTrack. The new track must be of the same media type (audio, video, ...), and switching tracks does not require renegotiation.
One use case of replaceTrack() is the common need to switch between a phone's rear and front cameras. With replaceTrack() you can keep one track object per camera and switch between them as needed, as sketched below.
getCapabilities (as above): returns an RTCRtpCapabilities object describing the codecs and features the device supports for sending. Similarly, the receiver capabilities can be obtained by calling the static function RTCRtpReceiver.getCapabilities().
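A minimal sketch of the camera-switch use case; the facingMode constraints and the pc variable are assumptions, and no renegotiation is triggered:

// Minimal sketch: swap the outgoing video source without renegotiating.
async function switchCamera(pc, useFrontCamera) {
    const stream = await navigator.mediaDevices.getUserMedia({
        video: { facingMode: useFrontCamera ? "user" : "environment" }
    });
    const newTrack = stream.getVideoTracks()[0];

    // Find the sender that currently carries video and replace its source track.
    const videoSender = pc.getSenders().find(
        (sender) => sender.track && sender.track.kind === "video");
    await videoSender.replaceTrack(newTrack);
}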
II: RTP Media data structures
(1) RTP Media data structures
The figure below lists all the structures used by the receiver and sender. The most important one is RTCRtpSendParameters, which inherits from RTCRtpParameters.

The RTCRtpParameters class contains three members:
RTCRtpHeaderExtensionParameters: header extensions, with id, uri, and encrypted (whether the extension is encrypted; the default is false, i.e. not encrypted).
RTCRtcpParameters: every RTP stream has a corresponding RTCP stream. Contains cname (a recognizable canonical name) and reducedSize (when bandwidth is tight, send reduced-size RTCP to lower the bandwidth it consumes).
RTCRtpCodecParameters: codec-related parameters, including payloadType, mimeType, clockRate, channels, sdpFmtpLine, ...
Besides the inherited fields above, RTCRtpSendParameters also contains:
transactionId: a transaction ID that uniquely identifies the parameter set. getParameters() returns it, and setParameters() must be called with the transactionId of the parameters being changed.
encodings: an array of RTCRtpEncodingParameters objects, one per encoding; this is where per-encoding settings such as the maximum bitrate live.
degradationPreference: an RTCDegradationPreference value.
priority: specifies the priority.
RTCRtpEncodingParameters: the structure describing one encoding.
RTCDegradationPreference: how to degrade the stream when bandwidth is insufficient: maintain the frame rate, maintain the resolution, or balance the two.
RTCRtpReceiveParameters is comparatively simple: the receiver only receives and reports its statistics back to the sender, and the sender does the overall control, so the sending side carries more parameters. The sketch below reads these send parameters.
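A minimal sketch of reading the send parameters described above, assuming pc already has a video sender; depending on the browser, degradationPreference may be absent:

// Minimal sketch, assuming `pc` already has a video sender.
const videoSender = pc.getSenders().find((s) => s.track && s.track.kind === "video");
const params = videoSender.getParameters();

console.log(params.transactionId);         // must be passed back unchanged to setParameters()
console.log(params.codecs);                // RTCRtpCodecParameters: payloadType, mimeType, clockRate, ...
console.log(params.headerExtensions);      // RTCRtpHeaderExtensionParameters: id, uri, encrypted
console.log(params.rtcp);                  // RTCRtcpParameters: cname, reducedSize
console.log(params.encodings);             // RTCRtpEncodingParameters: maxBitrate, maxFramerate, ...
console.log(params.degradationPreference); // may be undefined in some browsers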
(2) RTCRtpTransceiver
A transceiver is a sender/receiver pair: it wraps both and can handle sending and receiving at the same time, as sketched below.

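A minimal sketch, again assuming an established RTCPeerConnection named pc:

// Minimal sketch: every transceiver bundles one sender and one receiver.
pc.getTransceivers().forEach((transceiver) => {
    console.log(transceiver.mid);            // media line (m= section) this pair is bound to
    console.log(transceiver.direction);      // "sendrecv" | "sendonly" | "recvonly" | "inactive"
    console.log(transceiver.sender.track);   // outgoing track (or null)
    console.log(transceiver.receiver.track); // incoming track
});

// A transceiver can also be created explicitly, before any track is added.
const videoTransceiver = pc.addTransceiver("video", { direction: "sendrecv" });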
III: Implementing control of the transmission rate
Based on: WebRTC study notes (8): 1-to-1 real-time audio/video interactive live streaming system (2)
(1) Code
<html>
<head>
<title> WebRTC PeerConnection </title>
<link href="./css/main.css" rel="stylesheet" />
<script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/2.0.3/socket.io.js"></script>
</head>
<body>
<div>
<button id="connserver">Connect Signal Server</button>
<button id="leave" disabled>Leave</button>
</div>
<div>
<label>BandWidth:</label>
<select id="bandwidth" disabled> <!-- bandwidth limit; main2.js expects this select -->
<option value="unlimited" selected>unlimited</option>
<option value="125">125</option>
<option value="250">250</option>
<option value="500">500</option>
<option value="1000">1000</option>
<option value="2000">2000</option>
</select>
kbps
</div>
<div id="preview">
<div>
<h2>Local:</h2>
<video autoplay playsinline id="localvideo"></video>
</div>
<div>
<h2>Remote:</h2>
<video autoplay playsinline id="remotevideo"></video>
</div>
</div>
</body>
<script type="text/javascript" src="https://webrtc.github.io/adapter/adapter-latest.js"></script>
<script type="text/javascript" src="./js/main2.js"></script>
</html>
'use strict';

var localVideo = document.querySelector("video#localvideo");
var remoteVideo = document.querySelector("video#remotevideo");
var btnConn = document.querySelector("button#connserver");
var btnLeave = document.querySelector("button#leave");
var SltBW = document.querySelector("select#bandwidth");

var localStream = null;   // keep the local stream in a global variable
var socket = null;
var roomid = "111111";
var state = "init";       // client-side state machine
var pc = null;            // global peerconnection variable

function sendMessage(roomid, data){
    console.log("send SDP message", roomid, data);
    if(socket){
        socket.emit("message", roomid, data);
    }
}

function getOffer(desc){
    pc.setLocalDescription(desc);
    sendMessage(roomid, desc);      // send the SDP to the remote side
}

// Here our side is the callee: the remote offer has arrived and we must send our answer back.
function getAnswer(desc){           // the remote description was already set when the offer arrived
    pc.setLocalDescription(desc);   // only the local description is needed here
    sendMessage(roomid, desc);      // reply with the answer; negotiation is complete on this side
    SltBW.disabled = false;
}

// Media negotiation, called by the initiating side: create the offer
function call(){
    if(state === "joined_conn"){
        if(pc){
            var options = {
                offerToReceiveAudio: 1,
                offerToReceiveVideo: 1
            };
            pc.createOffer(options)
                .then(getOffer)
                .catch(handleError);
        }
    }
}

// Create the peerconnection and listen for candidate events: when a candidate arrives
// (returned by the TURN service) it is forwarded to the other side via the signaling server.
// Also add the local media stream to the peerconnection.
function createPeerConnection(){
    console.log("Create RTCPeerConnection!");
    if(!pc){
        // configure the ICE servers
        var pcConfig = {
            "iceServers": [{
                'urls': "turn:82.156.184.3:3478",
                'credential': "ssyfj",
                'username': "ssyfj"
            }]
        };
        pc = new RTCPeerConnection(pcConfig);
        pc.onicecandidate = (e)=>{  // handle candidates gathered after negotiation (returned by the TURN service)
            if(e.candidate){        // send the candidate message to the remote side
                console.log("find a new candidate", e.candidate);
                sendMessage(roomid, {
                    type: "candidate",
                    label: e.candidate.sdpMLineIndex,
                    id: e.candidate.sdpMid,
                    candidate: e.candidate.candidate
                });
            }
        };
        pc.ontrack = (e)=>{         // got the remote track, show it on the page
            remoteVideo.srcObject = e.streams[0];
        };
    }
    if(localStream){                // add the local tracks to the peerconnection
        localStream.getTracks().forEach((track)=>{
            pc.addTrack(track, localStream);
        });
    }
}

// release the local stream used by the current peerconnection
function closeLocalMedia(){
    if(localStream && localStream.getTracks()){
        localStream.getTracks().forEach((track)=>{
            track.stop();
        });
    }
    localStream = null;
}

// close the peerconnection
function closePeerConnection(){
    console.log("close RTCPeerConnection");
    if(pc){
        pc.close();
        pc = null;
    }
}

function conn(){
    socket = io.connect();  // connect to the signaling server; `io` is the global created by the socket.io script included in the page

    // register handlers for the signaling messages from the server
    socket.on("joined", (roomid, id)=>{
        console.log("receive joined message:", roomid, id);
        state = "joined";
        createPeerConnection();     // after joining, create the peerconnection and add the stream; negotiation starts once another peer joins
        btnConn.disabled = true;
        btnLeave.disabled = false;
        console.log("receive joined message:state=", state);
    });
    socket.on("otherjoin", (roomid, id)=>{
        console.log("receive otherjoin message:", roomid, id);
        // note: in the special joined_unbind state a new peerconnection must be created
        if(state === "joined_unbind"){
            createPeerConnection();
        }
        state = "joined_conn";      // was joined, now joined_conn
        call();                     // start media negotiation
        console.log("receive otherjoin message:state=", state);
    });
    socket.on("full", (roomid, id)=>{
        console.log("receive full message:", roomid, id);
        state = "leaved";
        console.log("receive full message:state=", state);
        socket.disconnect();        // we never joined the room, but the connection exists and must be closed
        alert("the room is full!");
        btnLeave.disabled = true;
        btnConn.disabled = false;
    });
    socket.on("leaved", (roomid, id)=>{     // resources were already released when the leave message was sent, matching the leave flow
        console.log("receive leaved message:", roomid, id);
        state = "leaved";                   // back to the initial state
        console.log("receive leaved message:state=", state);
        socket.disconnect();
        btnLeave.disabled = true;
        btnConn.disabled = false;
    });
    socket.on("bye", (roomid, id)=>{
        console.log("receive bye message:", roomid, id);
        state = "joined_unbind";
        console.log("receive bye message:state=", state);
        closePeerConnection();
    });
    socket.on("message", (roomid, data)=>{
        console.log("receive client message:", roomid, data);
        // handle the negotiation data relayed by the signaling server; the media itself goes peer-to-peer
        if(data){   // only the three message types below arrive here
            if(data.type === "offer"){      // we are the callee: store the remote offer and answer it
                pc.setRemoteDescription(new RTCSessionDescription(data));   // convert the received text into a description object
                pc.createAnswer()
                    .then(getAnswer)
                    .catch(handleError);
            }else if(data.type === "answer"){
                pc.setRemoteDescription(new RTCSessionDescription(data));   // the remote SDP arrived, negotiation is complete
                SltBW.disabled = false;
            }else if(data.type === "candidate"){
                // after both sides call setLocalDescription they exchange candidates;
                // each gathered candidate fires the peerconnection's onicecandidate event
                var candidate = new RTCIceCandidate({
                    sdpMLineIndex: data.label,  // index of the media line, e.g. m=video ...
                    candidate: data.candidate
                });
                pc.addIceCandidate(candidate);  // add the remote candidate (obtained via TURN/STUN) to the local pc
            }else{
                console.error("the message is invalid!", data);
            }
        }
    });

    // send the join message
    socket.emit("join", roomid);
    return;
}

function getMediaStream(stream){
    localStream = stream;               // keep it globally so it can be sent to the remote side
    localVideo.srcObject = localStream; // show the local video on the page
    // connect to the signaling server and start handling signaling messages
    conn();
}

function handleError(err){
    console.error(err.name + ":" + err.message);
}

// initialization: capture the local audio/video
function start(){
    if(!navigator.mediaDevices || !navigator.mediaDevices.getUserMedia){
        console.error("the getUserMedia is not support!");
        return;
    }else{
        var constraints = {
            video: true,
            audio: false
        };
        navigator.mediaDevices.getUserMedia(constraints)
            .then(getMediaStream)
            .catch(handleError);
    }
}

function connSignalServer(){
    // start the local video
    start();
    return true;
}

function leave(){
    if(socket){
        socket.emit("leave", roomid);
    }
    // release resources
    closePeerConnection();
    closeLocalMedia();
    btnConn.disabled = false;
    btnLeave.disabled = true;
}

function changeBW(){
    SltBW.disabled = true;
    var bw = SltBW.options[SltBW.selectedIndex].value;
    if(bw === "unlimited"){
        SltBW.disabled = false;     // re-enable the control; the previous cap is left in place
        return;
    }
    // get all senders and pick the video sender to throttle
    var senders = pc.getSenders();
    var vdsender = null;
    senders.forEach((sender)=>{
        if(sender && sender.track && sender.track.kind === "video"){
            vdsender = sender;      // found the sender of the video stream
        }
    });
    // read the current parameters and set the maximum bitrate on the first encoding
    var parameters = vdsender.getParameters();
    if(!parameters.encodings){
        return;
    }
    parameters.encodings[0].maxBitrate = bw * 1000;   // the select value is in kbps
    vdsender.setParameters(parameters)
        .then(()=>{
            SltBW.disabled = false;
            console.log("Success to set parameters");
        })
        .catch(handleError);
}

// wire up the UI events
btnConn.onclick = connSignalServer; // capture local media, show it, connect to the signaling server, register handlers and send join
btnLeave.onclick = leave;
SltBW.onchange = changeBW;
The key event handler:
function changeBW(){
    SltBW.disabled = true;
    var bw = SltBW.options[SltBW.selectedIndex].value;
    if(bw === "unlimited"){
        SltBW.disabled = false;     // re-enable the control; the previous cap is left in place
        return;
    }
    // get all senders and pick the video sender to throttle
    var senders = pc.getSenders();
    var vdsender = null;
    senders.forEach((sender)=>{
        if(sender && sender.track && sender.track.kind === "video"){
            vdsender = sender;      // found the sender of the video stream
        }
    });
    // read the current parameters and set the maximum bitrate on the first encoding
    var parameters = vdsender.getParameters();
    if(!parameters.encodings){
        return;
    }
    parameters.encodings[0].maxBitrate = bw * 1000;   // the select value is in kbps
    vdsender.setParameters(parameters)
        .then(()=>{
            SltBW.disabled = false;
            console.log("Success to set parameters");
        })
        .catch(handleError);
}
(2) Testing the result
1. Bitrate control on the sending side

2. Bitrate control on the receiving side

3. Inspecting with Chrome's debugging page chrome://webrtc-internals/
The second graph shows the sender's bitrate being held at roughly 2 Mbps, while the fourth graph shows the receiver's bitrate being limited to about 0.5 Mbps.

IV: Implementing statistics (bytes per second, packets per second)
(1) Code
/* * Copyright (c) 2015 The WebRTC project authors. All Rights Reserved. * * Use of this source code is governed by a BSD-style license * that can be found in the LICENSE file in the root of the source * tree. */ button { margin: 10px 20px 25px 0; vertical-align: top; width: 134px; } table { margin: 200px (50% - 100) 0 0; } textarea { color: #444; font-size: 0.9em; font-weight: 300; height: 20.0em; padding: 5px; width: calc(100% - 10px); } div#getUserMedia { padding: 0 0 8px 0; } div.input { display: inline-block; margin: 0 4px 0 0; vertical-align: top; width: 310px; } div.input > div { margin: 0 0 20px 0; vertical-align: top; } div.output { background-color: #eee; display: inline-block; font-family: 'Inconsolata', 'Courier New', monospace; font-size: 0.9em; padding: 10px 10px 10px 25px; position: relative; top: 10px; white-space: pre; width: 270px; } div.label { display: inline-block; font-weight: 400; width: 120px; } div.graph-container { background-color: #ccc; float: left; margin: 0.5em; width: calc(50%-1em); } div#preview { border-bottom: 1px solid #eee; margin: 0 0 1em 0; padding: 0 0 0.5em 0; } div#preview > div { display: inline-block; vertical-align: top; width: calc(50% - 12px); } section#statistics div { display: inline-block; font-family: 'Inconsolata', 'Courier New', monospace; vertical-align: top; width: 308px; } section#statistics div#senderStats { margin: 0 20px 0 0; } section#constraints > div { margin: 0 0 20px 0; } h2 { margin: 0 0 1em 0; } section#constraints label { display: inline-block; width: 156px; } section { margin: 0 0 20px 0; padding: 0 0 15px 0; } video { background: #222; margin: 0 0 0 0; --width: 100%; width: var(--width); height: 225px; } @media screen and (max-width: 720px) { button { font-weight: 500; height: 56px; line-height: 1.3em; width: 90px; } div#getUserMedia { padding: 0 0 40px 0; } section#statistics div { width: calc(50% - 14px); } }
/* * Copyright (c) 2015 The WebRTC project authors. All Rights Reserved. * * Use of this source code is governed by a BSD-style license * that can be found in the LICENSE file in the root of the source * tree. */ // taken from chrome://webrtc-internals with jshint adaptions 'use strict'; /* exported TimelineDataSeries, TimelineGraphView */ // The maximum number of data points bufferred for each stats. Old data points // will be shifted out when the buffer is full. const MAX_STATS_DATA_POINT_BUFFER_SIZE = 1000; const TimelineDataSeries = (function() { /** * @constructor */ function TimelineDataSeries() { // List of DataPoints in chronological order. this.dataPoints_ = []; // Default color. Should always be overridden prior to display. this.color_ = 'red'; // Whether or not the data series should be drawn. this.isVisible_ = true; this.cacheStartTime_ = null; this.cacheStepSize_ = 0; this.cacheValues_ = []; } TimelineDataSeries.prototype = { /** * @override */ toJSON: function() { if (this.dataPoints_.length < 1) { return {}; } let values = []; for (let i = 0; i < this.dataPoints_.length; ++i) { values.push(this.dataPoints_[i].value); } return { startTime: this.dataPoints_[0].time, endTime: this.dataPoints_[this.dataPoints_.length - 1].time, values: JSON.stringify(values), }; }, /** * Adds a DataPoint to |this| with the specified time and value. * DataPoints are assumed to be received in chronological order. */ addPoint: function(timeTicks, value) { let time = new Date(timeTicks); this.dataPoints_.push(new DataPoint(time, value)); if (this.dataPoints_.length > MAX_STATS_DATA_POINT_BUFFER_SIZE) { this.dataPoints_.shift(); } }, isVisible: function() { return this.isVisible_; }, show: function(isVisible) { this.isVisible_ = isVisible; }, getColor: function() { return this.color_; }, setColor: function(color) { this.color_ = color; }, getCount: function() { return this.dataPoints_.length; }, /** * Returns a list containing the values of the data series at |count| * points, starting at |startTime|, and |stepSize| milliseconds apart. * Caches values, so showing/hiding individual data series is fast. */ getValues: function(startTime, stepSize, count) { // Use cached values, if we can. if (this.cacheStartTime_ === startTime && this.cacheStepSize_ === stepSize && this.cacheValues_.length === count) { return this.cacheValues_; } // Do all the work. this.cacheValues_ = this.getValuesInternal_(startTime, stepSize, count); this.cacheStartTime_ = startTime; this.cacheStepSize_ = stepSize; return this.cacheValues_; }, /** * Returns the cached |values| in the specified time period. */ getValuesInternal_: function(startTime, stepSize, count) { let values = []; let nextPoint = 0; let currentValue = 0; let time = startTime; for (let i = 0; i < count; ++i) { while (nextPoint < this.dataPoints_.length && this.dataPoints_[nextPoint].time < time) { currentValue = this.dataPoints_[nextPoint].value; ++nextPoint; } values[i] = currentValue; time += stepSize; } return values; } }; /** * A single point in a data series. Each point has a time, in the form of * milliseconds since the Unix epoch, and a numeric value. * @constructor */ function DataPoint(time, value) { this.time = time; this.value = value; } return TimelineDataSeries; })(); const TimelineGraphView = (function() { // Maximum number of labels placed vertically along the sides of the graph. let MAX_VERTICAL_LABELS = 6; // Vertical spacing between labels and between the graph and labels. 
let LABEL_VERTICAL_SPACING = 4; // Horizontal spacing between vertically placed labels and the edges of the // graph. let LABEL_HORIZONTAL_SPACING = 3; // Horizintal spacing between two horitonally placed labels along the bottom // of the graph. // var LABEL_LABEL_HORIZONTAL_SPACING = 25; // Length of ticks, in pixels, next to y-axis labels. The x-axis only has // one set of labels, so it can use lines instead. let Y_AXIS_TICK_LENGTH = 10; let GRID_COLOR = '#CCC'; let TEXT_COLOR = '#000'; let BACKGROUND_COLOR = '#FFF'; let MAX_DECIMAL_PRECISION = 2; /** * @constructor */ function TimelineGraphView(divId, canvasId) { this.scrollbar_ = {position_: 0, range_: 0}; this.graphDiv_ = document.getElementById(divId); this.canvas_ = document.getElementById(canvasId); // Set the range and scale of the graph. Times are in milliseconds since // the Unix epoch. // All measurements we have must be after this time. this.startTime_ = 0; // The current rightmost position of the graph is always at most this. this.endTime_ = 1; this.graph_ = null; // Horizontal scale factor, in terms of milliseconds per pixel. this.scale_ = 1000; // Initialize the scrollbar. this.updateScrollbarRange_(true); } TimelineGraphView.prototype = { setScale: function(scale) { this.scale_ = scale; }, // Returns the total length of the graph, in pixels. getLength_: function() { let timeRange = this.endTime_ - this.startTime_; // Math.floor is used to ignore the last partial area, of length less // than this.scale_. return Math.floor(timeRange / this.scale_); }, /** * Returns true if the graph is scrolled all the way to the right. */ graphScrolledToRightEdge_: function() { return this.scrollbar_.position_ === this.scrollbar_.range_; }, /** * Update the range of the scrollbar. If |resetPosition| is true, also * sets the slider to point at the rightmost position and triggers a * repaint. */ updateScrollbarRange_: function(resetPosition) { let scrollbarRange = this.getLength_() - this.canvas_.width; if (scrollbarRange < 0) { scrollbarRange = 0; } // If we've decreased the range to less than the current scroll position, // we need to move the scroll position. if (this.scrollbar_.position_ > scrollbarRange) { resetPosition = true; } this.scrollbar_.range_ = scrollbarRange; if (resetPosition) { this.scrollbar_.position_ = scrollbarRange; this.repaint(); } }, /** * Sets the date range displayed on the graph, switches to the default * scale factor, and moves the scrollbar all the way to the right. */ setDateRange: function(startDate, endDate) { this.startTime_ = startDate.getTime(); this.endTime_ = endDate.getTime(); // Safety check. if (this.endTime_ <= this.startTime_) { this.startTime_ = this.endTime_ - 1; } this.updateScrollbarRange_(true); }, /** * Updates the end time at the right of the graph to be the current time. * Specifically, updates the scrollbar's range, and if the scrollbar is * all the way to the right, keeps it all the way to the right. Otherwise, * leaves the view as-is and doesn't redraw anything. */ updateEndDate: function(optDate) { this.endTime_ = optDate || (new Date()).getTime(); this.updateScrollbarRange_(this.graphScrolledToRightEdge_()); }, getStartDate: function() { return new Date(this.startTime_); }, /** * Replaces the current TimelineDataSeries with |dataSeries|. */ setDataSeries: function(dataSeries) { // Simply recreates the Graph. this.graph_ = new Graph(); for (let i = 0; i < dataSeries.length; ++i) { this.graph_.addDataSeries(dataSeries[i]); } this.repaint(); }, /** * Adds |dataSeries| to the current graph. 
*/ addDataSeries: function(dataSeries) { if (!this.graph_) { this.graph_ = new Graph(); } this.graph_.addDataSeries(dataSeries); this.repaint(); }, /** * Draws the graph on |canvas_|. */ repaint: function() { this.repaintTimerRunning_ = false; let width = this.canvas_.width; let height = this.canvas_.height; let context = this.canvas_.getContext('2d'); // Clear the canvas. context.fillStyle = BACKGROUND_COLOR; context.fillRect(0, 0, width, height); // Try to get font height in pixels. Needed for layout. let fontHeightString = context.font.match(/([0-9]+)px/)[1]; let fontHeight = parseInt(fontHeightString); // Safety check, to avoid drawing anything too ugly. if (fontHeightString.length === 0 || fontHeight <= 0 || fontHeight * 4 > height || width < 50) { return; } // Save current transformation matrix so we can restore it later. context.save(); // The center of an HTML canvas pixel is technically at (0.5, 0.5). This // makes near straight lines look bad, due to anti-aliasing. This // translation reduces the problem a little. context.translate(0.5, 0.5); // Figure out what time values to display. let position = this.scrollbar_.position_; // If the entire time range is being displayed, align the right edge of // the graph to the end of the time range. if (this.scrollbar_.range_ === 0) { position = this.getLength_() - this.canvas_.width; } let visibleStartTime = this.startTime_ + position * this.scale_; // Make space at the bottom of the graph for the time labels, and then // draw the labels. let textHeight = height; height -= fontHeight + LABEL_VERTICAL_SPACING; this.drawTimeLabels(context, width, height, textHeight, visibleStartTime); // Draw outline of the main graph area. context.strokeStyle = GRID_COLOR; context.strokeRect(0, 0, width - 1, height - 1); if (this.graph_) { // Layout graph and have them draw their tick marks. this.graph_.layout( width, height, fontHeight, visibleStartTime, this.scale_); this.graph_.drawTicks(context); // Draw the lines of all graphs, and then draw their labels. this.graph_.drawLines(context); this.graph_.drawLabels(context); } // Restore original transformation matrix. context.restore(); }, /** * Draw time labels below the graph. Takes in start time as an argument * since it may not be |startTime_|, when we're displaying the entire * time range. */ drawTimeLabels: function(context, width, height, textHeight, startTime) { // Draw the labels 1 minute apart. let timeStep = 1000 * 60; // Find the time for the first label. This time is a perfect multiple of // timeStep because of how UTC times work. let time = Math.ceil(startTime / timeStep) * timeStep; context.textBaseline = 'bottom'; context.textAlign = 'center'; context.fillStyle = TEXT_COLOR; context.strokeStyle = GRID_COLOR; // Draw labels and vertical grid lines. while (true) { let x = Math.round((time - startTime) / this.scale_); if (x >= width) { break; } let text = (new Date(time)).toLocaleTimeString(); context.fillText(text, x, textHeight); context.beginPath(); context.lineTo(x, 0); context.lineTo(x, height); context.stroke(); time += timeStep; } }, getDataSeriesCount: function() { if (this.graph_) { return this.graph_.dataSeries_.length; } return 0; }, hasDataSeries: function(dataSeries) { if (this.graph_) { return this.graph_.hasDataSeries(dataSeries); } return false; }, }; /** * A Graph is responsible for drawing all the TimelineDataSeries that have * the same data type. Graphs are responsible for scaling the values, laying * out labels, and drawing both labels and lines for its data series. 
*/ const Graph = (function() { /** * @constructor */ function Graph() { this.dataSeries_ = []; // Cached properties of the graph, set in layout. this.width_ = 0; this.height_ = 0; this.fontHeight_ = 0; this.startTime_ = 0; this.scale_ = 0; // The lowest/highest values adjusted by the vertical label step size // in the displayed range of the graph. Used for scaling and setting // labels. Set in layoutLabels. this.min_ = 0; this.max_ = 0; // Cached text of equally spaced labels. Set in layoutLabels. this.labels_ = []; } /** * A Label is the label at a particular position along the y-axis. * @constructor */ /* function Label(height, text) { this.height = height; this.text = text; } */ Graph.prototype = { addDataSeries: function(dataSeries) { this.dataSeries_.push(dataSeries); }, hasDataSeries: function(dataSeries) { for (let i = 0; i < this.dataSeries_.length; ++i) { if (this.dataSeries_[i] === dataSeries) { return true; } } return false; }, /** * Returns a list of all the values that should be displayed for a given * data series, using the current graph layout. */ getValues: function(dataSeries) { if (!dataSeries.isVisible()) { return null; } return dataSeries.getValues(this.startTime_, this.scale_, this.width_); }, /** * Updates the graph's layout. In particular, both the max value and * label positions are updated. Must be called before calling any of the * drawing functions. */ layout: function(width, height, fontHeight, startTime, scale) { this.width_ = width; this.height_ = height; this.fontHeight_ = fontHeight; this.startTime_ = startTime; this.scale_ = scale; // Find largest value. let max = 0; let min = 0; for (let i = 0; i < this.dataSeries_.length; ++i) { let values = this.getValues(this.dataSeries_[i]); if (!values) { continue; } for (let j = 0; j < values.length; ++j) { if (values[j] > max) { max = values[j]; } else if (values[j] < min) { min = values[j]; } } } this.layoutLabels_(min, max); }, /** * Lays out labels and sets |max_|/|min_|, taking the time units into * consideration. |maxValue| is the actual maximum value, and * |max_| will be set to the value of the largest label, which * will be at least |maxValue|. Similar for |min_|. */ layoutLabels_: function(minValue, maxValue) { if (maxValue - minValue < 1024) { this.layoutLabelsBasic_(minValue, maxValue, MAX_DECIMAL_PRECISION); return; } // Find appropriate units to use. let units = ['', 'k', 'M', 'G', 'T', 'P']; // Units to use for labels. 0 is '1', 1 is K, etc. // We start with 1, and work our way up. let unit = 1; minValue /= 1024; maxValue /= 1024; while (units[unit + 1] && maxValue - minValue >= 1024) { minValue /= 1024; maxValue /= 1024; ++unit; } // Calculate labels. this.layoutLabelsBasic_(minValue, maxValue, MAX_DECIMAL_PRECISION); // Append units to labels. for (let i = 0; i < this.labels_.length; ++i) { this.labels_[i] += ' ' + units[unit]; } // Convert |min_|/|max_| back to unit '1'. this.min_ *= Math.pow(1024, unit); this.max_ *= Math.pow(1024, unit); }, /** * Same as layoutLabels_, but ignores units. |maxDecimalDigits| is the * maximum number of decimal digits allowed. The minimum allowed * difference between two adjacent labels is 10^-|maxDecimalDigits|. */ layoutLabelsBasic_: function(minValue, maxValue, maxDecimalDigits) { this.labels_ = []; let range = maxValue - minValue; // No labels if the range is 0. if (range === 0) { this.min_ = this.max_ = maxValue; return; } // The maximum number of equally spaced labels allowed. 
|fontHeight_| // is doubled because the top two labels are both drawn in the same // gap. let minLabelSpacing = 2 * this.fontHeight_ + LABEL_VERTICAL_SPACING; // The + 1 is for the top label. let maxLabels = 1 + this.height_ / minLabelSpacing; if (maxLabels < 2) { maxLabels = 2; } else if (maxLabels > MAX_VERTICAL_LABELS) { maxLabels = MAX_VERTICAL_LABELS; } // Initial try for step size between conecutive labels. let stepSize = Math.pow(10, -maxDecimalDigits); // Number of digits to the right of the decimal of |stepSize|. // Used for formating label strings. let stepSizeDecimalDigits = maxDecimalDigits; // Pick a reasonable step size. while (true) { // If we use a step size of |stepSize| between labels, we'll need: // // Math.ceil(range / stepSize) + 1 // // labels. The + 1 is because we need labels at both at 0 and at // the top of the graph. // Check if we can use steps of size |stepSize|. if (Math.ceil(range / stepSize) + 1 <= maxLabels) { break; } // Check |stepSize| * 2. if (Math.ceil(range / (stepSize * 2)) + 1 <= maxLabels) { stepSize *= 2; break; } // Check |stepSize| * 5. if (Math.ceil(range / (stepSize * 5)) + 1 <= maxLabels) { stepSize *= 5; break; } stepSize *= 10; if (stepSizeDecimalDigits > 0) { --stepSizeDecimalDigits; } } // Set the min/max so it's an exact multiple of the chosen step size. this.max_ = Math.ceil(maxValue / stepSize) * stepSize; this.min_ = Math.floor(minValue / stepSize) * stepSize; // Create labels. for (let label = this.max_; label >= this.min_; label -= stepSize) { this.labels_.push(label.toFixed(stepSizeDecimalDigits)); } }, /** * Draws tick marks for each of the labels in |labels_|. */ drawTicks: function(context) { let x1; let x2; x1 = this.width_ - 1; x2 = this.width_ - 1 - Y_AXIS_TICK_LENGTH; context.fillStyle = GRID_COLOR; context.beginPath(); for (let i = 1; i < this.labels_.length - 1; ++i) { // The rounding is needed to avoid ugly 2-pixel wide anti-aliased // lines. let y = Math.round(this.height_ * i / (this.labels_.length - 1)); context.moveTo(x1, y); context.lineTo(x2, y); } context.stroke(); }, /** * Draws a graph line for each of the data series. */ drawLines: function(context) { // Factor by which to scale all values to convert them to a number from // 0 to height - 1. let scale = 0; let bottom = this.height_ - 1; if (this.max_) { scale = bottom / (this.max_ - this.min_); } // Draw in reverse order, so earlier data series are drawn on top of // subsequent ones. for (let i = this.dataSeries_.length - 1; i >= 0; --i) { let values = this.getValues(this.dataSeries_[i]); if (!values) { continue; } context.strokeStyle = this.dataSeries_[i].getColor(); context.beginPath(); for (let x = 0; x < values.length; ++x) { // The rounding is needed to avoid ugly 2-pixel wide anti-aliased // horizontal lines. context.lineTo( x, bottom - Math.round((values[x] - this.min_) * scale)); } context.stroke(); } }, /** * Draw labels in |labels_|. */ drawLabels: function(context) { if (this.labels_.length === 0) { return; } let x = this.width_ - LABEL_HORIZONTAL_SPACING; // Set up the context. context.fillStyle = TEXT_COLOR; context.textAlign = 'right'; // Draw top label, which is the only one that appears below its tick // mark. context.textBaseline = 'top'; context.fillText(this.labels_[0], x, 0); // Draw all the other labels. 
context.textBaseline = 'bottom'; let step = (this.height_ - 1) / (this.labels_.length - 1); for (let i = 1; i < this.labels_.length; ++i) { context.fillText(this.labels_[i], x, step * i); } } }; return Graph; })(); return TimelineGraphView; })();
<html>
<head>
<title> WebRTC PeerConnection </title>
<link href="./css/main.css" rel="stylesheet" />
<script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/2.0.3/socket.io.js"></script>
</head>
<body>
<div>
<button id=connserver>ConnSignal</button>
<button id="leave" disabled>Leave</button>
</div>
<div>
<label>BandWidth:</label>
<select id="bandwidth" disabled> <!--帶寬限制-->
<option value="unlimited" selected>unlimited</option>
<option value="125">125</option>
<option value="250">250</option>
<option value="500">500</option>
<option value="1000">1000</option>
<option value="2000">2000</option>
</select>
kbps
</div>
<div id="preview">
<div>
<h2>Local:</h2>
<video autoplay playsinline id="localvideo"></video>
</div>
<div>
<h2>Remote:</h2>
<video autoplay playsinline id="remotevideo"></video>
</div>
</div>
<div class="graph-container" id="bitrateGraph">
<div>Bitrate</div>
<canvas id="bitrateCanvas"></canvas>
</div>
<div class="graph-container" id="packetGraph">
<div>Packets sent per second</div>
<canvas id="packetCanvas"></canvas>
</div>
</body>
<script type="text/javascript" src="https://webrtc.github.io/adapter/adapter-latest.js"></script>
<script type="text/javascript" src="./js/main3.js"></script>
<script type="text/javascript" src="./js/third_party/graph.js"></script>
</html>
'use strict';

var localVideo = document.querySelector("video#localvideo");
var remoteVideo = document.querySelector("video#remotevideo");
var btnConn = document.querySelector("button#connserver");
var btnLeave = document.querySelector("button#leave");
var SltBW = document.querySelector("select#bandwidth");

// graph drawing helpers, initialized once the local stream is available
var bitrateGraph;
var bitrateSeries;
var packetGraph;
var packetSeries;

var localStream = null;   // keep the local stream in a global variable
var socket = null;
var roomid = "111111";
var state = "init";       // client-side state machine
var pc = null;            // global peerconnection variable
var lastResult = null;    // global variable holding the previous stats report

function sendMessage(roomid, data){
    console.log("send SDP message", roomid, data);
    if(socket){
        socket.emit("message", roomid, data);
    }
}

function getOffer(desc){
    pc.setLocalDescription(desc);
    sendMessage(roomid, desc);      // send the SDP to the remote side
}

// Here our side is the callee: the remote offer has arrived and we must send our answer back.
function getAnswer(desc){           // the remote description was already set when the offer arrived
    pc.setLocalDescription(desc);   // only the local description is needed here
    sendMessage(roomid, desc);      // reply with the answer; negotiation is complete on this side
    SltBW.disabled = false;
}

// Media negotiation, called by the initiating side: create the offer
function call(){
    if(state === "joined_conn"){
        if(pc){
            var options = {
                offerToReceiveAudio: 1,
                offerToReceiveVideo: 1
            };
            pc.createOffer(options)
                .then(getOffer)
                .catch(handleError);
        }
    }
}

// Create the peerconnection and listen for candidate events: when a candidate arrives
// (returned by the TURN service) it is forwarded to the other side via the signaling server.
// Also add the local media stream to the peerconnection.
function createPeerConnection(){
    console.log("Create RTCPeerConnection!");
    if(!pc){
        // configure the ICE servers
        var pcConfig = {
            "iceServers": [{
                'urls': "turn:82.156.184.3:3478",
                'credential': "ssyfj",
                'username': "ssyfj"
            }]
        };
        pc = new RTCPeerConnection(pcConfig);
        pc.onicecandidate = (e)=>{  // handle candidates gathered after negotiation (returned by the TURN service)
            if(e.candidate){        // send the candidate message to the remote side
                console.log("find a new candidate", e.candidate);
                sendMessage(roomid, {
                    type: "candidate",
                    label: e.candidate.sdpMLineIndex,
                    id: e.candidate.sdpMid,
                    candidate: e.candidate.candidate
                });
            }
        };
        pc.ontrack = (e)=>{         // got the remote track, show it on the page
            remoteVideo.srcObject = e.streams[0];
        };
    }
    if(localStream){                // add the local tracks to the peerconnection
        localStream.getTracks().forEach((track)=>{
            pc.addTrack(track, localStream);
        });
    }
}

// release the local stream used by the current peerconnection
function closeLocalMedia(){
    if(localStream && localStream.getTracks()){
        localStream.getTracks().forEach((track)=>{
            track.stop();
        });
    }
    localStream = null;
}

// close the peerconnection
function closePeerConnection(){
    console.log("close RTCPeerConnection");
    if(pc){
        pc.close();
        pc = null;
    }
}

function conn(){
    socket = io.connect();  // connect to the signaling server; `io` is the global created by the socket.io script included in the page

    // register handlers for the signaling messages from the server
    socket.on("joined", (roomid, id)=>{
        console.log("receive joined message:", roomid, id);
        state = "joined";
        createPeerConnection();     // after joining, create the peerconnection and add the stream; negotiation starts once another peer joins
        btnConn.disabled = true;
        btnLeave.disabled = false;
        console.log("receive joined message:state=", state);
    });
    socket.on("otherjoin", (roomid, id)=>{
        console.log("receive otherjoin message:", roomid, id);
        // note: in the special joined_unbind state a new peerconnection must be created
        if(state === "joined_unbind"){
            createPeerConnection();
        }
        state = "joined_conn";      // was joined, now joined_conn
        call();                     // start media negotiation
        console.log("receive otherjoin message:state=", state);
    });
    socket.on("full", (roomid, id)=>{
        console.log("receive full message:", roomid, id);
        state = "leaved";
        console.log("receive full message:state=", state);
        socket.disconnect();        // we never joined the room, but the connection exists and must be closed
        alert("the room is full!");
        btnLeave.disabled = true;
        btnConn.disabled = false;
    });
    socket.on("leaved", (roomid, id)=>{     // resources were already released when the leave message was sent, matching the leave flow
        console.log("receive leaved message:", roomid, id);
        state = "leaved";                   // back to the initial state
        console.log("receive leaved message:state=", state);
        socket.disconnect();
        btnLeave.disabled = true;
        btnConn.disabled = false;
    });
    socket.on("bye", (roomid, id)=>{
        console.log("receive bye message:", roomid, id);
        state = "joined_unbind";
        console.log("receive bye message:state=", state);
        closePeerConnection();
    });
    socket.on("message", (roomid, data)=>{
        console.log("receive client message:", roomid, data);
        // handle the negotiation data relayed by the signaling server; the media itself goes peer-to-peer
        if(data){   // only the three message types below arrive here
            if(data.type === "offer"){      // we are the callee: store the remote offer and answer it
                pc.setRemoteDescription(new RTCSessionDescription(data));   // convert the received text into a description object
                pc.createAnswer()
                    .then(getAnswer)
                    .catch(handleError);
            }else if(data.type === "answer"){
                pc.setRemoteDescription(new RTCSessionDescription(data));   // the remote SDP arrived, negotiation is complete
                SltBW.disabled = false;
            }else if(data.type === "candidate"){
                // after both sides call setLocalDescription they exchange candidates;
                // each gathered candidate fires the peerconnection's onicecandidate event
                var candidate = new RTCIceCandidate({
                    sdpMLineIndex: data.label,  // index of the media line, e.g. m=video ...
                    candidate: data.candidate
                });
                pc.addIceCandidate(candidate);  // add the remote candidate (obtained via TURN/STUN) to the local pc
            }else{
                console.error("the message is invalid!", data);
            }
        }
    });

    // send the join message
    socket.emit("join", roomid);
    return;
}

function getMediaStream(stream){
    localStream = stream;               // keep it globally so it can be sent to the remote side
    localVideo.srcObject = localStream; // show the local video on the page
    // connect to the signaling server and start handling signaling messages
    conn();

    // set up the graphs for rendering
    bitrateSeries = new TimelineDataSeries();
    bitrateGraph = new TimelineGraphView('bitrateGraph', 'bitrateCanvas');
    bitrateGraph.updateEndDate();
    packetSeries = new TimelineDataSeries();
    packetGraph = new TimelineGraphView('packetGraph', 'packetCanvas');
    packetGraph.updateEndDate();
}

function handleError(err){
    console.error(err.name + ":" + err.message);
}

// initialization: capture the local audio/video
function start(){
    if(!navigator.mediaDevices || !navigator.mediaDevices.getUserMedia){
        console.error("the getUserMedia is not support!");
        return;
    }else{
        var constraints = {
            video: true,
            audio: false
        };
        navigator.mediaDevices.getUserMedia(constraints)
            .then(getMediaStream)
            .catch(handleError);
    }
}

function connSignalServer(){
    // start the local video
    start();
    return true;
}

function leave(){
    if(socket){
        socket.emit("leave", roomid);
    }
    // release resources
    closePeerConnection();
    closeLocalMedia();
    btnConn.disabled = false;
    btnLeave.disabled = true;
}

function changeBW(){
    SltBW.disabled = true;
    var bw = SltBW.options[SltBW.selectedIndex].value;
    if(bw === "unlimited"){
        SltBW.disabled = false;     // re-enable the control; the previous cap is left in place
        return;
    }
    // get all senders and pick the video sender to throttle
    var senders = pc.getSenders();
    var vdsender = null;
    senders.forEach((sender)=>{
        if(sender && sender.track && sender.track.kind === "video"){
            vdsender = sender;      // found the sender of the video stream
        }
    });
    // read the current parameters and set the maximum bitrate on the first encoding
    var parameters = vdsender.getParameters();
    if(!parameters.encodings){
        return;
    }
    parameters.encodings[0].maxBitrate = bw * 1000;   // the select value is in kbps
    vdsender.setParameters(parameters)
        .then(()=>{
            SltBW.disabled = false;
            console.log("Success to set parameters");
        })
        .catch(handleError);
}

// timer fired once per second to sample the outgoing statistics
window.setInterval(()=>{
    if(!pc || !pc.getSenders()) return;
    var sender = pc.getSenders()[0];    // we only send video, so just take the first sender
    if(!sender){
        return;
    }
    sender.getStats()
        .then((reports)=>{
            reports.forEach((report)=>{
                if(report.type === "outbound-rtp"){     // outgoing RTP statistics
                    if(report.isRemote){                // skip remote-side reports, we only want our own
                        return;
                    }
                    var curTs = report.timestamp;
                    var bytes = report.bytesSent;
                    var packets = report.packetsSent;
                    // bytesSent/packetsSent are cumulative, so compute the delta to the previous sample;
                    // timestamps are in milliseconds, so 8*bytes/ms gives kbit/s
                    if(lastResult && lastResult.has(report.id)){
                        var bitrate = 8 * (bytes - lastResult.get(report.id).bytesSent) /
                                      (curTs - lastResult.get(report.id).timestamp);
                        var packetCnt = packets - lastResult.get(report.id).packetsSent;

                        bitrateSeries.addPoint(curTs, bitrate);
                        bitrateGraph.setDataSeries([bitrateSeries]);
                        bitrateGraph.updateEndDate();

                        packetSeries.addPoint(curTs, packetCnt);
                        packetGraph.setDataSeries([packetSeries]);
                        packetGraph.updateEndDate();
                    }
                }
            });
            lastResult = reports;
        })
        .catch(handleError);
}, 1000);

// wire up the UI events
btnConn.onclick = connSignalServer; // capture local media, show it, connect to the signaling server, register handlers and send join
btnLeave.onclick = leave;
SltBW.onchange = changeBW;
The key logic:
// timer fired once per second to sample the outgoing statistics
window.setInterval(()=>{
    if(!pc || !pc.getSenders()) return;
    var sender = pc.getSenders()[0];    // we only send video, so just take the first sender
    if(!sender){
        return;
    }
    sender.getStats()
        .then((reports)=>{
            reports.forEach((report)=>{
                if(report.type === "outbound-rtp"){     // outgoing RTP statistics
                    if(report.isRemote){                // skip remote-side reports, we only want our own
                        return;
                    }
                    var curTs = report.timestamp;
                    var bytes = report.bytesSent;
                    var packets = report.packetsSent;
                    // bytesSent/packetsSent are cumulative, so compute the delta to the previous sample;
                    // timestamps are in milliseconds, so 8*bytes/ms gives kbit/s
                    if(lastResult && lastResult.has(report.id)){
                        var bitrate = 8 * (bytes - lastResult.get(report.id).bytesSent) /
                                      (curTs - lastResult.get(report.id).timestamp);
                        var packetCnt = packets - lastResult.get(report.id).packetsSent;

                        bitrateSeries.addPoint(curTs, bitrate);
                        bitrateGraph.setDataSeries([bitrateSeries]);
                        bitrateGraph.updateEndDate();

                        packetSeries.addPoint(curTs, packetCnt);
                        packetGraph.setDataSeries([packetSeries]);
                        packetGraph.updateEndDate();
                    }
                }
            });
            lastResult = reports;
        })
        .catch(handleError);
}, 1000);
(2) Result display


