WebRTC Notes (3): Audio/Video Synchronization


1. RTP timestamp and SeqNo

The RTP timestamp records the sampling instant of the media data and describes the inter-frame relationship of the payload: frames sampled at different instants carry different timestamps;

The RTP SeqNo records the ordering of RTP packets and describes the intra-frame relationship of the media data: a frame split across several packets keeps one timestamp while the SeqNo increments per packet;
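
As an illustration, consider a 30 fps video stream on the standard 90 kHz RTP clock. This is a hypothetical sketch (not WebRTC code): packets of the same frame share a timestamp, the SeqNo advances per packet, and the marker bit flags the last packet of a frame.

#include <cstdint>

// Hypothetical header fields for one video frame split into three RTP
// packets, followed by the next frame.
struct RtpHeaderSketch {
  uint16_t seq_no;
  uint32_t timestamp;  // 90 kHz media clock: 30 fps => +3000 per frame
  bool marker;
};

const RtpHeaderSketch kFrameN[] = {
    {1000, 900000, false},
    {1001, 900000, false},
    {1002, 900000, true},  // last packet of frame N
};
const RtpHeaderSketch kFrameN1[] = {
    {1003, 903000, true},  // frame N+1: SeqNo +1, timestamp +3000
};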

2. The RTP timestamp and the NTP timestamp are two representations of the same instant: the RTP timestamp counts ticks of the media clock, while the NTP timestamp is wall-clock time;
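
The link between the two clocks comes from RTCP Sender Reports, each of which carries the sender's NTP time together with the RTP timestamp of the same instant. Below is a minimal conversion sketch assuming a single SR pair and no clock drift; the function name and signature are illustrative (WebRTC's actual RtpToNtpEstimator fits a line through several SR pairs instead).

#include <cstdint>

// Map an RTP timestamp to sender NTP time (in ms), given the (NTP, RTP)
// pair from the last RTCP Sender Report and the payload's clock rate.
int64_t RtpToNtpMs(uint32_t rtp_ts, uint32_t sr_rtp_ts, int64_t sr_ntp_ms,
                   int clock_rate_hz) {
  // Unsigned subtraction is wraparound-safe for timestamps after the SR.
  uint32_t ticks_since_sr = rtp_ts - sr_rtp_ts;
  return sr_ntp_ms +
         static_cast<int64_t>(ticks_since_sr) * 1000 / clock_rate_hz;
}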

3. The basic objects of audio/video synchronization are AudioReceiveStream and VideoReceiveStream, both of which inherit from Syncable;
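
An abridged sketch of the Syncable interface as the synchronizer below uses it; the names follow webrtc::Syncable, but the struct is trimmed to the fields relevant here:

class Syncable {
 public:
  struct Info {
    int64_t latest_receive_time_ms = 0;              // arrival time of newest packet
    uint32_t latest_received_capture_timestamp = 0;  // its RTP timestamp
    uint32_t capture_time_ntp_secs = 0;              // RTP->NTP mapping from
    uint32_t capture_time_ntp_frac = 0;              // the last RTCP SR
    int current_delay_ms = 0;                        // current playout delay
  };

  virtual rtc::Optional<Info> GetInfo() const = 0;
  virtual void SetMinimumPlayoutDelay(int delay_ms) = 0;
  virtual ~Syncable() = default;
};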

4. The thread responsible for audio/video synchronization is the ModuleProcessThread, and the main implementation file is rtp_streams_synchronizer.cc. The RtpStreamsSynchronizer class contains the following members:

  • a StreamSynchronization instance;
  • audio and video Measurements;
  • pointers to the AudioReceiveStream and VideoReceiveStream: syncable_audio_ and syncable_video_;
class RtpStreamsSynchronizer : public Module {
 public:
  explicit RtpStreamsSynchronizer(Syncable* syncable_video);

  // Sets (or clears) the audio stream that the video stream is synced with.
  void ConfigureSync(Syncable* syncable_audio);
  ......

 private:
  Syncable* syncable_video_;
  Syncable* syncable_audio_ GUARDED_BY(crit_);
  // Timing measurements for each stream, fed into StreamSynchronization.
  StreamSynchronization::Measurements audio_measurement_ GUARDED_BY(crit_);
  StreamSynchronization::Measurements video_measurement_ GUARDED_BY(crit_);
  ......
};

5. Synchronization process:

void RtpStreamsSynchronizer::Process() {
  RTC_DCHECK_RUN_ON(&process_thread_checker_);
  last_sync_time_ = rtc::TimeNanos();

  rtc::CritScope lock(&crit_);
  if (!syncable_audio_) {
    return;
  }
  RTC_DCHECK(sync_.get());

  rtc::Optional<Syncable::Info> audio_info = syncable_audio_->GetInfo();
  if (!audio_info || !UpdateMeasurements(&audio_measurement_, *audio_info)) {
    return;
  }

  int64_t last_video_receive_ms = video_measurement_.latest_receive_time_ms;
  rtc::Optional<Syncable::Info> video_info = syncable_video_->GetInfo();
  if (!video_info || !UpdateMeasurements(&video_measurement_, *video_info)) {
    return;
  }

  if (last_video_receive_ms == video_measurement_.latest_receive_time_ms) {
    // No new video packet has been received since last update.
    return;
  }

  int relative_delay_ms;
  // Calculate how much later or earlier the audio stream is compared to video.
  if (!sync_->ComputeRelativeDelay(audio_measurement_, video_measurement_,
                                   &relative_delay_ms)) {
    return;
  }

  TRACE_COUNTER1("webrtc", "SyncCurrentVideoDelay",
      video_info->current_delay_ms);
  TRACE_COUNTER1("webrtc", "SyncCurrentAudioDelay",
      audio_info->current_delay_ms);
  TRACE_COUNTER1("webrtc", "SyncRelativeDelay", relative_delay_ms);
  int target_audio_delay_ms = 0;
  int target_video_delay_ms = video_info->current_delay_ms;
  // Calculate the necessary extra audio delay and desired total video
  // delay to get the streams in sync.
  if (!sync_->ComputeDelays(relative_delay_ms,
                            audio_info->current_delay_ms,
                            &target_audio_delay_ms,
                            &target_video_delay_ms)) {
    return;
  }

  syncable_audio_->SetMinimumPlayoutDelay(target_audio_delay_ms);
  syncable_video_->SetMinimumPlayoutDelay(target_video_delay_ms);
}

SetMinimumPlayoutDelay tells the audio or video stream that, from now until the value is updated again, the playout of every frame must be delayed by at least the given number of milliseconds. By adjusting target_audio_delay_ms and target_video_delay_ms, the synchronizer coordinates the playout timing of the two streams.
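
Conceptually, ComputeRelativeDelay maps each stream's newest packet back to sender capture time (via the RTP-to-NTP mapping above) and compares end-to-end transit times, and ComputeDelays then raises the playout delay of whichever stream is ahead. The following is a simplified sketch of that idea with invented names; the real StreamSynchronization additionally filters the delta and adjusts in bounded steps:

#include <cstdint>

struct StreamSnapshot {
  int64_t capture_ntp_ms;   // sender capture time of the newest packet
  int64_t receive_time_ms;  // local arrival time of that packet
};

// Positive result: video is running behind audio by this many ms.
int RelativeDelayMs(const StreamSnapshot& audio, const StreamSnapshot& video) {
  int64_t audio_transit_ms = audio.receive_time_ms - audio.capture_ntp_ms;
  int64_t video_transit_ms = video.receive_time_ms - video.capture_ntp_ms;
  return static_cast<int>(video_transit_ms - audio_transit_ms);
}

With this sign convention, a positive relative delay means audio must be held back (a larger target_audio_delay_ms), while a negative one means video must wait (a larger target_video_delay_ms).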


