The previous post covered how Live555 implements recording; what I recorded was an H.264-encoded video file. In "A Basic Introduction to Live555" I showed that if you put an mp3 file under the live/mediaServer directory and let Live555 stream it, you can play it on demand with VLC. So can the H.264 file we recorded be streamed by Live555 and played on demand with VLC in the same way? I tried it, and it turns out it cannot.
I then compared how VLC requests Live555 to stream an mp3 file versus an h264 file, and found the cause: the SDP returned for the h264 file always contains "a=range:npt=0-", whereas in the SDP returned for the mp3 file, npt runs from 0 to a concrete number, i.e. the stream's duration is specified. In ServerMediaSession::generateSDPDescription we find the relevant code:
// Unless subsessions have differing durations, we also have a "a=range:" line:
float dur = duration();
if (dur == 0.0) {
  rangeLine = strDup("a=range:npt=0-\r\n");
} else if (dur > 0.0) {
  char buf[100];
  sprintf(buf, "a=range:npt=0-%.3f\r\n", dur);
  rangeLine = strDup(buf);
} else { // subsessions have differing durations, so "a=range:" lines go there
  rangeLine = strDup("");
}
In other words, "a=range:npt=0-" appears only when the duration function returns 0, and it means that the duration of this ServerMediaSession is unknown (not that the duration is zero).
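To make the difference concrete, here is how it shows up in the two DESCRIBE responses (hypothetical excerpts; the 180.000 figure is invented for illustration):

a=range:npt=0-           (h264 file: duration unknown, so the client cannot seek)
a=range:npt=0-180.000    (mp3 file: duration known, so seeking is possible)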
float ServerMediaSession::duration() const {
  float minSubsessionDuration = 0.0;
  float maxSubsessionDuration = 0.0;
  for (ServerMediaSubsession* subsession = fSubsessionsHead; subsession != NULL;
       subsession = subsession->fNext) {
    // Hack: If any subsession supports seeking by 'absolute' time, then return a negative value,
    // to indicate that only subsessions will have a "a=range:" attribute:
    char* absStartTime = NULL; char* absEndTime = NULL;
    subsession->getAbsoluteTimeRange(absStartTime, absEndTime);
    if (absStartTime != NULL) return -1.0f;

    float ssduration = subsession->duration();
    if (subsession == fSubsessionsHead) { // this is the first subsession
      minSubsessionDuration = maxSubsessionDuration = ssduration;
    } else if (ssduration < minSubsessionDuration) {
      minSubsessionDuration = ssduration;
    } else if (ssduration > maxSubsessionDuration) {
      maxSubsessionDuration = ssduration;
    }
  }

  if (maxSubsessionDuration != minSubsessionDuration) {
    return -maxSubsessionDuration; // because subsession durations differ
  } else {
    return maxSubsessionDuration; // all subsession durations are the same
  }
}
Looking at ServerMediaSession::duration, we see that a ServerMediaSession's duration is determined by the durations of its individual ServerMediaSubsessions. So let's look at ServerMediaSubsession's duration function:
float ServerMediaSubsession::duration() const {
  // default implementation: assume an unbounded session:
  return 0.0;
}
The default implementation returns 0. The concrete ServerMediaSubsession class for H.264 video files is H264VideoFileServerMediaSubsession, and it contains no override of duration, so for an H.264 video file the duration function returns 0. For mp3 files, by contrast, here is the duration implementation in MP3AudioFileServerMediaSubsession:
float MP3AudioFileServerMediaSubsession::duration() const {
  return fFileDuration; // fFileDuration is obtained from the MP3FileSource
}
I later noticed that Live555 also supports on-demand playback of mkv files, so I looked at the duration implementation in MatroskaFileServerMediaSubsession as well:
float MatroskaFileServerMediaSubsession::duration() const {
  return fOurDemux.fileDuration();
}
In short, duration has a concrete implementation for both mp3 and mkv files, whereas h264 files fall back on the default implementation that returns 0.
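As an aside, the naive fix would be to subclass H264VideoFileServerMediaSubsession and override duration. A sketch of that idea follows (my own hypothetical code, not part of live555; the class name and the pass-through constructor are assumptions built on the base class's protected constructor). Note that this only makes the SDP advertise a range: the server still has no way to map a time offset into a raw .264 byte stream, so real seeking needs the index-file approach described next:

#include "H264VideoFileServerMediaSubsession.hh"

// Hypothetical subclass: report a known, externally measured duration for a raw
// .264 file, so that generateSDPDescription() emits "a=range:npt=0-<dur>":
class H264FixedDurationSubsession: public H264VideoFileServerMediaSubsession {
public:
  static H264FixedDurationSubsession*
  createNew(UsageEnvironment& env, char const* fileName,
            Boolean reuseFirstSource, float knownDurationSeconds) {
    return new H264FixedDurationSubsession(env, fileName, reuseFirstSource,
                                           knownDurationSeconds);
  }

  virtual float duration() const { return fKnownDuration; }

protected:
  H264FixedDurationSubsession(UsageEnvironment& env, char const* fileName,
                              Boolean reuseFirstSource, float knownDurationSeconds)
    : H264VideoFileServerMediaSubsession(env, fileName, reuseFirstSource),
      fKnownDuration(knownDurationSeconds) {}

private:
  float fKnownDuration; // e.g. measured beforehand with an external tool
};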
Later I found an explanation on the official Live555 site. It lists the file types for which Live555 supports the various trick-play operations (pausing, seeking, fast-forward, reverse play); Seeking is the operation that on-demand playback relies on, and h264 is not among the file types that support it. The page goes on to say, however, that for these operations to work on an MPEG Transport Stream file (a .ts file), an index file must accompany it, and that the MPEG2TransportStreamIndexer utility can be used to generate such an index file.
I then found the handling of .ts files in testOnDemandRTSPServer.cpp:
// A MPEG-2 Transport Stream:
{
  char const* streamName = "mpeg2TransportStreamTest";
  char const* inputFileName = "test.ts";
  char const* indexFileName = "test.tsx";
  ServerMediaSession* sms
    = ServerMediaSession::createNew(*env, streamName, streamName,
                                    descriptionString);
  sms->addSubsession(MPEG2TransportFileServerMediaSubsession
                     ::createNew(*env, inputFileName, indexFileName, reuseFirstSource));
  rtspServer->addServerMediaSession(sms);

  announceStream(rtspServer, sms, streamName, inputFileName);
}
As you can see, to stream a .ts file on demand Live555 also needs a .tsx file, which is the index file corresponding to the .ts file. Next, in the live/testProgs directory, I found MPEG2TransportStreamIndexer.cpp:
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

void afterPlaying(void* clientData); // forward

UsageEnvironment* env;
char const* programName;

#define TRANSPORT_PACKET_SIZE 188 // a Transport Stream packet is always 188 bytes

void usage() {
  *env << "usage: " << programName << " <transport-stream-file-name>\n";
  exit(1);
}

int main(int argc, char const** argv) {
  // Begin by setting up our usage environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Parse the command line:
  programName = argv[0];
  //if (argc != 2) usage(); // the stock program takes the file name from argv[1]; here it is hardcoded

  char const* inputFileName = "test.ts";
  // Check whether the input file name ends with ".ts":
  int len = strlen(inputFileName);
  if (len < 4 || strcmp(&inputFileName[len-3], ".ts") != 0) {
    *env << "ERROR: input file name \"" << inputFileName
         << "\" does not end with \".ts\"\n";
    usage();
  }

  // Open the input file (as a 'byte stream file source'):
  FramedSource* input
    = ByteStreamFileSource::createNew(*env, inputFileName, TRANSPORT_PACKET_SIZE);
  if (input == NULL) {
    *env << "Failed to open input file \"" << inputFileName << "\" (does it exist?)\n";
    exit(1);
  }

  // Create a filter that indexes the input Transport Stream data:
  FramedSource* indexer
    = MPEG2IFrameIndexFromTransportStream::createNew(*env, input);

  // The output file name is the same as the input file name, except with suffix ".tsx":
  char* outputFileName = new char[len+2]; // allow for trailing x\0
  sprintf(outputFileName, "%sx", inputFileName);

  // Open the output file (for writing), as a 'file sink':
  MediaSink* output = FileSink::createNew(*env, outputFileName);
  if (output == NULL) {
    *env << "Failed to open output file \"" << outputFileName << "\"\n";
    exit(1);
  }

  // Start playing, to generate the output index file:
  *env << "Writing index file \"" << outputFileName << "\"...";
  output->startPlaying(*indexer, afterPlaying, NULL);

  env->taskScheduler().doEventLoop(); // does not return

  return 0; // only to prevent compiler warning
}

void afterPlaying(void* /*clientData*/) {
  *env << "...done\n";
  exit(0);
}
This program shows how to generate the .tsx file corresponding to a given .ts file; once both the .ts file and its .tsx file are in place, seeking works. (The stock version of the program takes the .ts file name as its single command-line argument; in the copy above that check is commented out and test.ts is hardcoded.) So how do we turn an h264 file into a .ts file? To my delight, the same live/testProgs directory contains testH264VideoToTransportStream.cpp:
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

char const* inputFileName = "in.264";
char const* outputFileName = "out.ts";

void afterPlaying(void* clientData); // forward

UsageEnvironment* env;

int main(int argc, char** argv) {
  // Begin by setting up our usage environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Open the input file as a 'byte-stream file source':
  FramedSource* inputSource = ByteStreamFileSource::createNew(*env, inputFileName);
  if (inputSource == NULL) {
    *env << "Unable to open file \"" << inputFileName
         << "\" as a byte-stream file source\n";
    exit(1);
  }

  // Create a 'framer' filter for this file source, to generate presentation times for each NAL unit:
  H264VideoStreamFramer* framer
    = H264VideoStreamFramer::createNew(*env, inputSource, True/*includeStartCodeInOutput*/);

  // Then create a filter that packs the H.264 video data into a Transport Stream:
  MPEG2TransportStreamFromESSource* tsFrames
    = MPEG2TransportStreamFromESSource::createNew(*env);
  tsFrames->addNewVideoSource(framer, 5/*mpegVersion: H.264*/);

  // Open the output file as a 'file sink':
  MediaSink* outputSink = FileSink::createNew(*env, outputFileName);
  if (outputSink == NULL) {
    *env << "Unable to open file \"" << outputFileName << "\" as a file sink\n";
    exit(1);
  }

  // Finally, start playing:
  *env << "Beginning to read...\n";
  outputSink->startPlaying(*tsFrames, afterPlaying, NULL);

  env->taskScheduler().doEventLoop(); // does not return

  return 0; // only to prevent compiler warning
}

void afterPlaying(void* /*clientData*/) {
  *env << "Done reading.\n";
  *env << "Wrote output file: \"" << outputFileName << "\"\n";
  exit(0);
}
This program, in turn, shows how to convert an h264 video file into a .ts file. Combining the two example programs therefore gives us on-demand playback of H.264 files: convert the h264 file into a .ts file, index it to produce the .tsx file, and serve the pair as in the testOnDemandRTSPServer excerpt above. A sketch that chains the two steps in one program follows below.
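Here is a minimal sketch of that pipeline (my own combination of the two stock programs, not live555 code; the file names test.264, test.ts, and test.tsx are assumptions). It uses doEventLoop()'s watch-variable parameter so that each stage runs to completion before the next one starts:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

#define TRANSPORT_PACKET_SIZE 188 // a Transport Stream packet is always 188 bytes

UsageEnvironment* env;
volatile char stageDone = 0;

// Called by the sink when a stage has consumed all of its input;
// setting the watch variable makes doEventLoop() return:
static void stageFinished(void* /*clientData*/) {
  stageDone = 1;
}

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Stage 1: H.264 elementary stream -> Transport Stream ("test.264" -> "test.ts"):
  {
    FramedSource* inputSource = ByteStreamFileSource::createNew(*env, "test.264");
    if (inputSource == NULL) exit(1);
    H264VideoStreamFramer* framer
      = H264VideoStreamFramer::createNew(*env, inputSource, True/*includeStartCodeInOutput*/);
    MPEG2TransportStreamFromESSource* tsFrames
      = MPEG2TransportStreamFromESSource::createNew(*env);
    tsFrames->addNewVideoSource(framer, 5/*mpegVersion: H.264*/);
    MediaSink* outputSink = FileSink::createNew(*env, "test.ts");
    if (outputSink == NULL) exit(1);

    stageDone = 0;
    outputSink->startPlaying(*tsFrames, stageFinished, NULL);
    env->taskScheduler().doEventLoop(&stageDone); // returns when stage 1 is done

    Medium::close(outputSink); // flushes "test.ts" to disk before stage 2 reads it
    Medium::close(tsFrames);
  }

  // Stage 2: index the Transport Stream ("test.ts" -> "test.tsx"):
  {
    FramedSource* input
      = ByteStreamFileSource::createNew(*env, "test.ts", TRANSPORT_PACKET_SIZE);
    if (input == NULL) exit(1);
    FramedSource* indexer = MPEG2IFrameIndexFromTransportStream::createNew(*env, input);
    MediaSink* output = FileSink::createNew(*env, "test.tsx");
    if (output == NULL) exit(1);

    stageDone = 0;
    output->startPlaying(*indexer, stageFinished, NULL);
    env->taskScheduler().doEventLoop(&stageDone); // returns when stage 2 is done
  }

  return 0;
}

The resulting test.ts/test.tsx pair can then be registered with an RTSP server exactly as in the testOnDemandRTSPServer excerpt above, after which the client should be able to seek.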
To close, a few words on my understanding of multithreaded programming with Live555. Live555 uses an event-driven, single-threaded model: each TaskScheduler corresponds to one Live555 thread, so our program can create multiple TaskSchedulers to run multiple Live555 threads. Other threads in our program can interact with a Live555 thread either through a global flag ("watch") variable or by calling triggerEvent.
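As an illustration, here is a minimal sketch of the triggerEvent pattern (my own example; the names handleMyEvent and otherThreadFunc are made up). createEventTrigger registers a handler with the scheduler up front, triggerEvent is the scheduler call documented as safe to make from a different thread, and the handler itself then runs on the Live555 thread inside the event loop:

#include "BasicUsageEnvironment.hh"
#include <pthread.h>

UsageEnvironment* env;
EventTriggerId myTriggerId = 0;
volatile char stopEventLoop = 0; // 'watch variable' polled by doEventLoop()

// Runs on the Live555 thread, inside the event loop, when the trigger fires:
void handleMyEvent(void* /*clientData*/) {
  *env << "event handled on the Live555 thread\n";
  stopEventLoop = 1; // ask the event loop to exit
}

// Runs on a different (non-Live555) thread:
void* otherThreadFunc(void*) {
  // triggerEvent() is the one scheduler call that may safely be made
  // from outside the Live555 thread:
  env->taskScheduler().triggerEvent(myTriggerId, NULL);
  return NULL;
}

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Register the handler once, before any other thread uses the trigger:
  myTriggerId = scheduler->createEventTrigger(handleMyEvent);

  pthread_t tid;
  pthread_create(&tid, NULL, otherThreadFunc, NULL);

  env->taskScheduler().doEventLoop(&stopEventLoop); // returns once the event is handled
  pthread_join(tid, NULL);
  return 0;
}

With that in mind, let's look at the official Live555 answer to this question: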