一、Setting up the WebRTC Android client
1. libjingle_peerconnection_so.so
2. libjingle_peerconnection.jar
3. A copy of the client source code (downloadable from GitHub)

二、Related concepts
1. P2P: peer-to-peer communication;
2. STUN: provides a server-reflexive address so the two peers can communicate directly (P2P);
3. TURN: the fallback when the reflexive-address approach fails, i.e. a relay server, guaranteeing the two peers can always communicate;
4. ICE: combines STUN and TURN and picks the most reasonable, cheapest viable path;
5. SIP/SDP: SIP is a signaling protocol for audio/video communication; SDP is the description of the audio/video carried within SIP;
6. PeerConnectionFactory/PeerConnection: the most central classes in WebRTC; all other audio/video operations are obtained through them;
7. MediaStream: a media stream, containing audio and video;
8. Track: a track within a media stream; it wraps the media data, e.g. the channels of audio data or the YUV frames of video data;
9. Session Management: the abstract session layer;
10. RTP: the media transport protocol;
11. iSAC: a wideband speech codec;
12. iLBC: a narrowband speech codec;
13. VP8: a video codec;
14. Room: the web client generates a random room number; the Android client enters that room number to connect to the peer;
15. Observer: the callback interface for data once the connection is established, i.e. it observes the remote peer's data;
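A toy sketch of concept 4, ICE's cheapest-viable-path selection, in plain Java. The candidate types and cost ordering here are illustrative assumptions, not the org.webrtc API:

```java
// Toy model of ICE path selection: prefer the cheapest candidate type
// whose connectivity check succeeded. Costs are illustrative only.
import java.util.Arrays;
import java.util.List;

public class IcePathPicker {
    public enum CandidateType { HOST, SRFLX, RELAY } // direct, STUN-reflexive, TURN relay

    // Lower cost = preferred: a direct path beats STUN, STUN beats the relay.
    static int cost(CandidateType t) {
        switch (t) {
            case HOST:  return 0;
            case SRFLX: return 1;
            default:    return 2; // RELAY, the last resort
        }
    }

    /** Returns the cheapest type among those that passed connectivity checks, or null. */
    public static CandidateType pick(List<CandidateType> working) {
        CandidateType best = null;
        for (CandidateType t : working) {
            if (best == null || cost(t) < cost(best)) best = t;
        }
        return best;
    }

    public static void main(String[] args) {
        // Behind a symmetric NAT only the TURN relay works.
        System.out.println(pick(Arrays.asList(CandidateType.RELAY)));
        // STUN hole punching succeeded: the reflexive path wins over the relay.
        System.out.println(pick(Arrays.asList(CandidateType.RELAY, CandidateType.SRFLX)));
    }
}
```

This is why TURN makes the call "one hundred percent" possible: it is always in the candidate set, but ICE only falls back to it when nothing cheaper works.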
三、Related classes
1. AppRTCAudioManager: audio management, implemented entirely with existing Android SDK APIs;
2. AppRTCClient: the custom interface for talking to the server and receiving message callbacks;
3. AppRTCProximitySensor: wraps the phone's proximity sensor so that the keys are disabled when the phone is held to the ear;
4. CallActivity: establishes the call connection and displays the audio/video UI;
5. CallFragment: the in-call UI, a child of CallActivity;
6. CaptureQualityController: controls the capture resolution/quality, shown with a SeekBar;
7. ConnectActivity: initializes the camera and call parameters; tapping the call button jumps to CallActivity to place the call;
8. CpuMonitor: shows current CPU usage statistics;
9. HudFragment: on-screen display of call statistics, including uplink bandwidth and packet loss;
10. PeerConnectionClient: the PeerConnection wrapper; all audio/video data communication goes through it;
11. PercentFrameLayout: a FrameLayout wrapper;
12. RoomParametersFetcher: parses the server's JSON and forwards the room parameters and addresses;
13. SettingsActivity: the settings Activity;
14. SettingsFragment: the settings Fragment, holding the audio/video call parameters;
15. UnhandledExceptionHandler: catches exceptions on crash instead of letting them propagate;
16. WebSocketChannelClient: the WebSocket wrapper;
17. WebSocketRTCClient: uses WebSocketChannelClient to send data over WebSocket;
Utility classes:
1. AppRTCUtils:
2. AsyncHttpURLConnection:
3. LooperExecutor:
四、WebSocket
If the downloaded code contains the autobanh.jar package, you can be sure this Android client talks to the WebRTC server over WebSocket. The implementation lives mainly in the WebSocketChannelClient and WebSocketRTCClient classes; the callbacks are the three callback methods of WebSocketConnectionObserver, a class nested inside WebSocketChannelClient. (WebSocket itself is not covered in detail here.)
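The channel's life cycle can be modeled as a tiny state machine. The state names below match the WebSocketConnectionState values visible in the code in section 六; the transition rules are my reading of that code, not the real class:

```java
// Minimal model of the WebSocket channel life cycle used by
// WebSocketChannelClient: connect, register into a room, then close.
public class WsChannelModel {
    public enum State { NEW, CONNECTED, REGISTERED, CLOSED }

    private State state = State.NEW;

    public State state() { return state; }

    // onOpen() fires when the socket connects.
    public void onOpen() { if (state == State.NEW) state = State.CONNECTED; }

    // register(roomId, clientId) is only legal once the socket is open.
    public boolean register() {
        if (state != State.CONNECTED) return false;
        state = State.REGISTERED;
        return true;
    }

    public void onClose() { state = State.CLOSED; }

    public static void main(String[] args) {
        WsChannelModel ch = new WsChannelModel();
        System.out.println(ch.register()); // false: not connected yet
        ch.onOpen();
        System.out.println(ch.register()); // true: CONNECTED -> REGISTERED
        ch.onClose();
        System.out.println(ch.state());    // CLOSED
    }
}
```

This is why the real onOpen() checks for a "pending register request": a register issued before the socket opened must be replayed once the state reaches CONNECTED.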
五、The WebRTC call flow (reprinted)

In the sequence diagram (not reproduced here), the annotated scenario is ClientA initiating a call to ClientB. The calls proceed as follows:
·ClientA first creates a PeerConnection object, then opens the local audio/video devices and wraps the audio/video data in a MediaStream, which it adds to the PeerConnection.
·ClientA calls PeerConnection's CreateOffer method to create an SDP object for the offer; the SDP object holds the current audio/video parameters. ClientA saves this SDP object via PeerConnection's SetLocalDescription method and sends it to ClientB through the Signal server.
·ClientB receives the offer SDP object sent by ClientA, saves it via PeerConnection's SetRemoteDescription method, and calls PeerConnection's CreateAnswer method to create an answer SDP object, which it saves via PeerConnection's SetLocalDescription method and sends back to ClientA through the Signal server.
·ClientA receives the answer SDP object sent by ClientB and saves it via PeerConnection's SetRemoteDescription method.
·During the SDP offer/answer exchange, ClientA and ClientB have already created the corresponding audio and video channels from the SDP and started gathering Candidate data. Candidate data can be loosely understood as the client's IP address information (local IP address, public IP address, and the address allocated by the relay server).
·When ClientA has gathered Candidate information, PeerConnection notifies ClientA through the OnIceCandidate callback; ClientA sends the Candidate information to ClientB through the Signal server, and ClientB saves it via PeerConnection's AddIceCandidate method. ClientB then performs the same exchange toward ClientA.
·At this point ClientA and ClientB have established a P2P channel for audio/video transport. When ClientB receives media from ClientA, PeerConnection's OnAddStream callback returns a MediaStream object identifying ClientA's audio/video stream, which ClientB just needs to render. The same applies to media flowing from ClientB to ClientA.
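The offer/answer steps above map onto PeerConnection's SignalingState (listed later in 7.3.1). The sketch below is a plain-Java model of the spec's state machine, not org.webrtc code; the pranswer states are omitted for brevity:

```java
// Plain-Java model of the signaling-state transitions driven by the
// Set{Local,Remote}Description calls in the offer/answer flow above.
public class SignalingModel {
    public enum State { STABLE, HAVE_LOCAL_OFFER, HAVE_REMOTE_OFFER }
    public enum SdpType { OFFER, ANSWER }

    private State state = State.STABLE;

    public State state() { return state; }

    public void setLocalDescription(SdpType t) {
        if (t == SdpType.OFFER && state == State.STABLE) state = State.HAVE_LOCAL_OFFER;
        else if (t == SdpType.ANSWER && state == State.HAVE_REMOTE_OFFER) state = State.STABLE;
        else throw new IllegalStateException(t + " in " + state);
    }

    public void setRemoteDescription(SdpType t) {
        if (t == SdpType.OFFER && state == State.STABLE) state = State.HAVE_REMOTE_OFFER;
        else if (t == SdpType.ANSWER && state == State.HAVE_LOCAL_OFFER) state = State.STABLE;
        else throw new IllegalStateException(t + " in " + state);
    }

    public static void main(String[] args) {
        SignalingModel a = new SignalingModel(); // ClientA, the caller
        SignalingModel b = new SignalingModel(); // ClientB, the callee
        a.setLocalDescription(SdpType.OFFER);    // A: STABLE -> HAVE_LOCAL_OFFER
        b.setRemoteDescription(SdpType.OFFER);   // B: STABLE -> HAVE_REMOTE_OFFER
        b.setLocalDescription(SdpType.ANSWER);   // B: back to STABLE
        a.setRemoteDescription(SdpType.ANSWER);  // A: back to STABLE
        System.out.println(a.state() + " / " + b.state()); // STABLE / STABLE
    }
}
```

Both sides end in STABLE, which is exactly the point at which candidate exchange can complete and media can start to flow.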
六、Callback functions
1. WebSocket callback interface and main message handling
Four message types: candidate, answer, offer, and bye.
private class WebSocketObserver implements WebSocketConnectionObserver {
  @Override
  public void onOpen() {
    Log.d(TAG, "WebSocket connection opened to: " + wsServerUrl);
    executor.execute(new Runnable() {
      @Override
      public void run() {
        state = WebSocketConnectionState.CONNECTED;
        // Check if we have pending register request.
        if (roomID != null && clientID != null) {
          register(roomID, clientID);
        }
      }
    });
  }

  @Override
  public void onClose(WebSocketCloseNotification code, String reason) {
    Log.d(TAG, "WebSocket connection closed. Code: " + code
        + ". Reason: " + reason + ". State: " + state);
    synchronized (closeEventLock) {
      closeEvent = true;
      closeEventLock.notify();
    }
    executor.execute(new Runnable() {
      @Override
      public void run() {
        if (state != WebSocketConnectionState.CLOSED) {
          state = WebSocketConnectionState.CLOSED;
          events.onWebSocketClose();
        }
      }
    });
  }

  @Override
  public void onTextMessage(String payload) {
    Log.d(TAG, "WSS->C: " + payload);
    final String message = payload;
    executor.execute(new Runnable() {
      @Override
      public void run() {
        if (state == WebSocketConnectionState.CONNECTED
            || state == WebSocketConnectionState.REGISTERED) {
          events.onWebSocketMessage(message);
        }
      }
    });
  }

  @Override
  public void onRawTextMessage(byte[] payload) {}

  @Override
  public void onBinaryMessage(byte[] payload) {}
}
@Override
public void onWebSocketMessage(final String msg) {
  if (wsClient.getState() != WebSocketConnectionState.REGISTERED) {
    Log.e(TAG, "Got WebSocket message in non registered state.");
    return;
  }
  try {
    JSONObject json = new JSONObject(msg);
    String msgText = json.getString("msg");
    String errorText = json.optString("error");
    if (msgText.length() > 0) {
      json = new JSONObject(msgText);
      String type = json.optString("type");
      if (type.equals("candidate")) {
        IceCandidate candidate = new IceCandidate(
            json.getString("id"),
            json.getInt("label"),
            json.getString("candidate"));
        events.onRemoteIceCandidate(candidate);
      } else if (type.equals("answer")) {
        if (initiator) {
          SessionDescription sdp = new SessionDescription(
              SessionDescription.Type.fromCanonicalForm(type),
              json.getString("sdp"));
          events.onRemoteDescription(sdp);
        } else {
          reportError("Received answer for call initiator: " + msg);
        }
      } else if (type.equals("offer")) {
        if (!initiator) {
          SessionDescription sdp = new SessionDescription(
              SessionDescription.Type.fromCanonicalForm(type),
              json.getString("sdp"));
          events.onRemoteDescription(sdp);
        } else {
          reportError("Received offer for call receiver: " + msg);
        }
      } else if (type.equals("bye")) {
        events.onChannelClose();
      } else {
        reportError("Unexpected WebSocket message: " + msg);
      }
    } else {
      if (errorText != null && errorText.length() > 0) {
        reportError("WebSocket error message: " + errorText);
      } else {
        reportError("Unexpected WebSocket message: " + msg);
      }
    }
  } catch (JSONException e) {
    reportError("WebSocket message JSON parsing error: " + e.toString());
  }
}
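The routing in onWebSocketMessage above reduces to a type/initiator decision table. Below is a dependency-free sketch of just that dispatch (no org.json; the Action names are invented for this model):

```java
public class MessageDispatch {
    public enum Action { REMOTE_CANDIDATE, REMOTE_DESCRIPTION, CHANNEL_CLOSE, ERROR }

    /**
     * Mirrors the branches of onWebSocketMessage: an "answer" is only valid
     * for the call initiator, an "offer" only for the call receiver.
     */
    public static Action dispatch(String type, boolean initiator) {
        switch (type) {
            case "candidate": return Action.REMOTE_CANDIDATE;
            case "answer":    return initiator  ? Action.REMOTE_DESCRIPTION : Action.ERROR;
            case "offer":     return !initiator ? Action.REMOTE_DESCRIPTION : Action.ERROR;
            case "bye":       return Action.CHANNEL_CLOSE;
            default:          return Action.ERROR;
        }
    }

    public static void main(String[] args) {
        System.out.println(dispatch("answer", true));  // REMOTE_DESCRIPTION
        System.out.println(dispatch("offer", true));   // ERROR: the initiator sent the offer
        System.out.println(dispatch("bye", false));    // CHANNEL_CLOSE
    }
}
```

Keeping the table this small is possible because both SDP branches end in the same callback, events.onRemoteDescription; only the validity check differs by role.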
2. The Observer interface
Callbacks fired after the connection is established, triggered by ICE changes and stream changes.
public static interface Observer {
  /** Triggered when the SignalingState changes. */
  public void onSignalingChange(SignalingState newState);
  /** Triggered when the IceConnectionState changes. */
  public void onIceConnectionChange(IceConnectionState newState);
  /** Triggered when the ICE connection receiving status changes. */
  public void onIceConnectionReceivingChange(boolean receiving);
  /** Triggered when the IceGatheringState changes. */
  public void onIceGatheringChange(IceGatheringState newState);
  /** Triggered when a new ICE candidate has been found. */
  public void onIceCandidate(IceCandidate candidate);
  /** Triggered when media is received on a new stream from remote peer. */
  public void onAddStream(MediaStream stream);
  /** Triggered when a remote peer closes a stream. */
  public void onRemoveStream(MediaStream stream);
  /** Triggered when a remote peer opens a DataChannel. */
  public void onDataChannel(DataChannel dataChannel);
  /** Triggered when renegotiation is necessary. */
  public void onRenegotiationNeeded();
}
3. The SDP interface
Callbacks fired during connection establishment.
/** Interface for observing SDP-related events. */
public interface SdpObserver {
  /** Called on success of Create{Offer,Answer}(). */
  public void onCreateSuccess(SessionDescription sdp);
  /** Called on success of Set{Local,Remote}Description(). */
  public void onSetSuccess();
  /** Called on error of Create{Offer,Answer}(). */
  public void onCreateFailure(String error);
  /** Called on error of Set{Local,Remote}Description(). */
  public void onSetFailure(String error);
}
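In PeerConnectionClient these callbacks chain: createOffer succeeds, onCreateSuccess saves the SDP with setLocalDescription, and onSetSuccess hands the SDP to the signaling server. The mock below shows that order; FakePeerConnection and the String-typed SDP are stand-ins for illustration, not the real API:

```java
import java.util.ArrayList;
import java.util.List;

public class SdpChainDemo {
    /** Same shape as the success half of org.webrtc's SdpObserver. */
    interface SdpObserver {
        void onCreateSuccess(String sdp);
        void onSetSuccess();
    }

    /** Mock: synchronously "succeeds" at everything, recording the call order. */
    static class FakePeerConnection {
        final List<String> log = new ArrayList<>();
        void createOffer(SdpObserver o) { log.add("createOffer"); o.onCreateSuccess("v=0 ..."); }
        void setLocalDescription(SdpObserver o, String sdp) { log.add("setLocalDescription"); o.onSetSuccess(); }
    }

    public static List<String> run() {
        final FakePeerConnection pc = new FakePeerConnection();
        pc.createOffer(new SdpObserver() {
            @Override public void onCreateSuccess(String sdp) {
                // Step 2: save the offer locally before signaling it.
                pc.setLocalDescription(this, sdp);
            }
            @Override public void onSetSuccess() {
                // Step 3: now hand the SDP to the signaling server.
                pc.log.add("sendOfferToSignalingServer");
            }
        });
        return pc.log;
    }

    public static void main(String[] args) {
        System.out.println(run()); // [createOffer, setLocalDescription, sendOfferToSignalingServer]
    }
}
```

The same chain runs on the callee side, with createAnswer/setLocalDescription after setRemoteDescription(offer).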
4. PeerConnectionClient
Creates the PeerConnection, implements the related callbacks, and drives the whole business logic.
private final PCObserver pcObserver = new PCObserver(); // implements Observer
private final SDPObserver sdpObserver = new SDPObserver(); // implements SdpObserver
private PeerConnectionFactory factory;
private PeerConnection peerConnection;
5. CallActivity
private PeerConnectionClient peerConnectionClient = null;
private AppRTCClient appRtcClient;
七、Native functions: signaling negotiation
7.1 Loading the .so file
static {
  System.loadLibrary("jingle_peerconnection_so");
}
7.2 PeerConnectionFactory native functions
7.2.1 Network interface options
public static class Options {
  // Keep in sync with webrtc/base/network.h!
  static final int ADAPTER_TYPE_UNKNOWN = 0;
  static final int ADAPTER_TYPE_ETHERNET = 1 << 0;
  static final int ADAPTER_TYPE_WIFI = 1 << 1;
  static final int ADAPTER_TYPE_CELLULAR = 1 << 2;
  static final int ADAPTER_TYPE_VPN = 1 << 3;
  static final int ADAPTER_TYPE_LOOPBACK = 1 << 4;
  public int networkIgnoreMask;
  public boolean disableEncryption;
}
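Since the ADAPTER_TYPE_* constants are single bits, networkIgnoreMask is just an OR of the interface types to ignore. A small runnable illustration (flag values copied from Options above; ignoring VPN and loopback is only an example choice):

```java
// networkIgnoreMask is a bit mask built from the ADAPTER_TYPE_* flags.
public class NetworkMaskDemo {
    static final int ADAPTER_TYPE_ETHERNET = 1 << 0;
    static final int ADAPTER_TYPE_WIFI     = 1 << 1;
    static final int ADAPTER_TYPE_CELLULAR = 1 << 2;
    static final int ADAPTER_TYPE_VPN      = 1 << 3;
    static final int ADAPTER_TYPE_LOOPBACK = 1 << 4;

    // Example: don't gather candidates on VPN or loopback interfaces.
    public static int ignoreVpnAndLoopback() {
        return ADAPTER_TYPE_VPN | ADAPTER_TYPE_LOOPBACK; // 8 | 16 = 24
    }

    public static boolean isIgnored(int mask, int adapterType) {
        return (mask & adapterType) != 0;
    }

    public static void main(String[] args) {
        int mask = ignoreVpnAndLoopback();
        System.out.println(mask);                               // 24
        System.out.println(isIgnored(mask, ADAPTER_TYPE_VPN));  // true
        System.out.println(isIgnored(mask, ADAPTER_TYPE_WIFI)); // false
    }
}
```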
7.2.2 Initializing PeerConnectionFactory
// |context| is an android.content.Context object, but we keep it untyped here
// to allow building on non-Android platforms.
// Callers may specify either |initializeAudio| or |initializeVideo| as false
// to skip initializing the respective engine (and avoid the need for the
// respective permissions).
// |renderEGLContext| can be provided to support HW video decoding to
// texture and will be used to create a shared EGL context on video
// decoding thread.
public static native boolean initializeAndroidGlobals(Object context,
    boolean initializeAudio, boolean initializeVideo, boolean videoHwAcceleration);
context: typically the ApplicationContext, or any other suitable Context. (Object)
initializeAudio: whether to initialize the audio engine. (boolean)
initializeVideo: whether to initialize the video engine. (boolean)
videoHwAcceleration: whether to enable hardware video acceleration. (boolean)
7.2.3 Field-trial initialization
private static final String FIELD_TRIAL_VP9 = "WebRTC-SupportVP9/Enabled/";
// Field trial initialization. Must be called before PeerConnectionFactory
// is created.
public static native void initializeFieldTrials(String fieldTrialsInitString);
7.2.4 Other PeerConnectionFactory functions
// Create the factory.
private static native long nativeCreatePeerConnectionFactory();
// Create the native observer for signaling callbacks (the commands exchanged with the ICE servers).
private static native long nativeCreateObserver(PeerConnection.Observer observer);
// Create a PeerConnection.
private static native long nativeCreatePeerConnection(long nativeFactory, PeerConnection.RTCConfiguration rtcConfig, MediaConstraints constraints, long nativeObserver);
// Create a local media stream.
private static native long nativeCreateLocalMediaStream(long nativeFactory, String label);
// Create a local video source.
private static native long nativeCreateVideoSource(long nativeFactory, long nativeVideoCapturer, MediaConstraints constraints);
// Create a video track.
private static native long nativeCreateVideoTrack(long nativeFactory, String id, long nativeVideoSource);
// Create a local audio source.
private static native long nativeCreateAudioSource(long nativeFactory, MediaConstraints constraints);
// Create an audio track.
private static native long nativeCreateAudioTrack(long nativeFactory, String id, long nativeSource);
// Set the network-related options.
public native void nativeSetOptions(long nativeFactory, Options options);
// Set the hardware video-acceleration options.
private static native void nativeSetVideoHwAccelerationOptions(long nativeFactory, Object renderEGLContext);
// Free the PeerConnectionFactory.
private static native void freeFactory(long nativeFactory);
7.3 PeerConnection native functions
7.3.1 Signaling states
// Tracks the local candidate-gathering state: just created, gathering, or finished gathering.
/** Tracks PeerConnectionInterface::IceGatheringState */
public enum IceGatheringState { NEW, GATHERING, COMPLETE };
// Tracks the state of the ICE connection to the remote peer.
/** Tracks PeerConnectionInterface::IceConnectionState */
public enum IceConnectionState {
  NEW, CHECKING, CONNECTED, COMPLETED, FAILED, DISCONNECTED, CLOSED
};
// Tracks the offer/answer signaling state (not the connection to the signaling server).
/** Tracks PeerConnectionInterface::SignalingState */
public enum SignalingState {
  STABLE, HAVE_LOCAL_OFFER, HAVE_LOCAL_PRANSWER, HAVE_REMOTE_OFFER,
  HAVE_REMOTE_PRANSWER, CLOSED
};
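A small helper showing one plausible way a client interprets IceConnectionState; the grouping below is an illustrative assumption, not taken from the source:

```java
public class IceStateDemo {
    // Same values as PeerConnection.IceConnectionState in 7.3.1.
    public enum IceConnectionState {
        NEW, CHECKING, CONNECTED, COMPLETED, FAILED, DISCONNECTED, CLOSED
    }

    /** Media can flow once the connection reaches CONNECTED or COMPLETED. */
    public static boolean canSendMedia(IceConnectionState s) {
        return s == IceConnectionState.CONNECTED || s == IceConnectionState.COMPLETED;
    }

    /** FAILED and CLOSED are terminal; DISCONNECTED may still recover on its own. */
    public static boolean isTerminal(IceConnectionState s) {
        return s == IceConnectionState.FAILED || s == IceConnectionState.CLOSED;
    }

    public static void main(String[] args) {
        System.out.println(canSendMedia(IceConnectionState.COMPLETED));  // true
        System.out.println(isTerminal(IceConnectionState.DISCONNECTED)); // false
    }
}
```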
7.3.2 Native functions
// Get the local SDP description.
public native SessionDescription getLocalDescription();
// Get the remote SDP description.
public native SessionDescription getRemoteDescription();
// Create a data channel.
public native DataChannel createDataChannel(String label, DataChannel.Init init);
// Create an offer.
public native void createOffer(SdpObserver observer, MediaConstraints constraints);
// Create an answer.
public native void createAnswer(SdpObserver observer, MediaConstraints constraints);
// Set the local SDP.
public native void setLocalDescription(SdpObserver observer, SessionDescription sdp);
// Set the remote SDP.
public native void setRemoteDescription(SdpObserver observer, SessionDescription sdp);
// Update the ICE servers.
public native boolean updateIce(List<IceServer> iceServers, MediaConstraints constraints);
// Get the signaling state.
public native SignalingState signalingState();
// Get the ICE connection state.
public native IceConnectionState iceConnectionState();
// Get the local candidate-gathering state.
public native IceGatheringState iceGatheringState();
// Close the PeerConnection.
public native void close();
// Free the PeerConnection.
private static native void freePeerConnection(long nativePeerConnection);
// Free the observer.
private static native void freeObserver(long nativeObserver);
// Add a new ICE candidate.
private native boolean nativeAddIceCandidate(String sdpMid, int sdpMLineIndex, String iceCandidateSdp);
// Add a local stream.
private native boolean nativeAddLocalStream(long nativeStream);
// Remove a local stream.
private native void nativeRemoveLocalStream(long nativeStream);
// Fetch connection statistics via a StatsObserver.
private native boolean nativeGetStats(StatsObserver observer, long nativeTrack);
八、Native functions: audio and video
Once you have a PeerConnectionFactory instance, you can capture audio and video from the device and ultimately render them to the screen. The classes involved are VideoCapturerAndroid, VideoSource, VideoTrack, and VideoRenderer, and everything starts with VideoCapturerAndroid.
8.1 VideoCapturerAndroid
VideoCapturerAndroid is a wrapper around the camera that provides convenient access to the device's camera stream. It lets you query the number of camera devices and get the name of the front or back camera.
// Returns the number of camera devices
VideoCapturerAndroid.getDeviceCount();
// Returns the front face device name
VideoCapturerAndroid.getNameOfFrontFacingDevice();
// Returns the back facing device name
VideoCapturerAndroid.getNameOfBackFacingDevice();
// Creates a VideoCapturerAndroid instance for the device name
VideoCapturerAndroid.create(name);
With a VideoCapturerAndroid instance you can create a MediaStream carrying the camera's video and send it to the other side.
8.2 VideoSource/VideoTrack
VideoSource lets you start and stop capture on your device. Stopping capture when it is not needed helps extend battery life.
VideoTrack is a wrapper that adds a VideoSource to a MediaStream object.
8.3 AudioSource/AudioTrack
AudioSource/AudioTrack work much like VideoSource/VideoTrack, except that no AudioCapturer is needed to obtain microphone data. audioConstraints is an instance of MediaConstraints.
8.4 VideoRenderer
VideoRendererGui is a GLSurfaceView on which video streams can be displayed; we add our renderer to the VideoTrack.
// To create our VideoRenderer, we can use the
// included VideoRendererGui for simplicity.
// First we need to set the GLSurfaceView that it should render to.
GLSurfaceView videoView = (GLSurfaceView) findViewById(R.id.glview_call);
// Then we set that view, and pass a Runnable
// to run once the surface is ready.
VideoRendererGui.setView(videoView, runnable);
// Now that VideoRendererGui is ready, we can get our VideoRenderer.
VideoRenderer renderer = VideoRendererGui.createGui(x, y, width, height);
// And finally, with our VideoRenderer ready, we
// can add our renderer to the VideoTrack.
localVideoTrack.addRenderer(renderer);
8.5 MediaConstraints
MediaConstraints is how WebRTC controls which video and audio go into a MediaStream. Per the supported-constraints spec, most methods that obtain media take a MediaConstraints instance.
8.6 MediaStream
On the web, getUserMedia returns a MediaStream directly, which can be added to the RTCPeerConnection and sent to the remote peer; on Android the stream is created through the PeerConnectionFactory:
// We start out with an empty MediaStream object,
// created with help from our PeerConnectionFactory
// Note that LOCAL_MEDIA_STREAM_ID can be any string
MediaStream mediaStream = peerConnectionFactory.createLocalMediaStream(LOCAL_MEDIA_STREAM_ID);
// Now we can add our tracks.
mediaStream.addTrack(localVideoTrack);
mediaStream.addTrack(localAudioTrack);
