Project description:
A local IP camera streams RTSP video in real time to this machine. The frames are H.264-encoded and must be decoded and displayed on screen, and each frame's timestamp must be converted to a date string (year-month-day hour:minute:second) and drawn onto the frame image.
Libraries used:
1. Qt for the GUI: a QLabel displays the video image. Each decoded frame is converted to an image format and delivered to the QLabel via Qt's signal/slot mechanism.
2. Live555 to establish the RTSP session between this machine and the IP camera, using an SDP file. The SDP file describes the encoding and carries the H.264 SPS and PPS as Base64-encoded strings (search online for the details). Live555's callback hands each frame out as (frameBuffer, frameSize). (FFmpeg also supports RTSP streams and can directly open a URL of the form rtsp://129.234.32.123:2324/wesdadawd.sdp, but its RTSP support was not good in my experience.)
3. FFmpeg's H.264 decoder to decode the frame data contained in frameBuffer.
Key points:
1. A single RTP packet may carry exactly one frame, several frames, or only a fragment of a frame that is split across multiple RTP packets; it depends on how the server packetizes the stream.
2. The RTP timestamp is relative (it starts from a random initial value). To get absolute time you must combine it with the NTP timestamp carried in RTCP packets. RTCP is designed to monitor the transmission quality of the RTP stream and carries statistics such as packet-loss counts. When the RTSP session is established, a port pair is allocated and sent to the client: an even port for RTP and the adjacent odd port for RTCP. You can capture the traffic with Wireshark, but since RTP normally travels over UDP, Wireshark may show plain UDP packets unless you explicitly tell it to decode them as RTP.
3. H.264 frames are transported as NAL units. A coded frame (an I, P, or B frame; look up the frame types if unfamiliar) is carried in NAL units, and the SPS and PPS are themselves NAL unit types. The I-frame is the key frame that all other decoding depends on, and the span between two I-frames is called a video sequence. Before an IDR frame you must insert the SPS and PPS, separating everything with the 0x00 0x00 0x00 0x01 start code: 0x00 0x00 0x00 0x01 + Base64-decoded SPS + 0x00 0x00 0x00 0x01 + Base64-decoded PPS + 0x00 0x00 0x00 0x01 + frame data (IDR frame). Only a buffer assembled this way decodes successfully with FFmpeg's H.264 decoder.
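Point 1 can be made concrete by inspecting the first payload byte of an RTP packet, which encodes the packetization mode (per the H.264 RTP payload format, RFC 6184). Note that Live555 already reassembles fragmented NAL units before its frame callback fires, so this only matters if you parse RTP yourself. A minimal sketch with hypothetical helper names:

```cpp
#include <cassert>
#include <cstdint>

// Classify the first byte of an H.264 RTP payload (RFC 6184).
enum class PacketizationMode { SingleNal, StapA, FuA, Other };

PacketizationMode classify(uint8_t firstByte) {
    uint8_t nalType = firstByte & 0x1F;  // low 5 bits = NAL unit type
    if (nalType >= 1 && nalType <= 23) return PacketizationMode::SingleNal;
    if (nalType == 24) return PacketizationMode::StapA;  // several NALs in one packet
    if (nalType == 28) return PacketizationMode::FuA;    // one NAL split over packets
    return PacketizationMode::Other;
}

// For FU-A, the second payload byte carries start/end flags and the real NAL type.
bool fuaIsStart(uint8_t fuHeader)    { return (fuHeader & 0x80) != 0; }
bool fuaIsEnd(uint8_t fuHeader)      { return (fuHeader & 0x40) != 0; }
uint8_t fuaNalType(uint8_t fuHeader) { return fuHeader & 0x1F; }
```

For example, a payload starting with 0x65 is a single IDR NAL unit (type 5), while one starting with 0x7C is an FU-A fragment whose pieces must be glued back together before decoding.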
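The RTP-to-wall-clock mapping in point 2 is a small piece of arithmetic: an RTCP Sender Report pairs an NTP wall-clock time with an RTP timestamp sampled at the same instant, and any other RTP timestamp is offset from that anchor by (timestamp difference / clock rate). A sketch with an illustrative struct (the field names are mine, not from a specific library):

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// An RTCP Sender Report pairs an NTP wall-clock time with an RTP timestamp.
struct SenderReport {
    uint32_t ntpSeconds;   // seconds since 1900-01-01 (NTP epoch)
    uint32_t ntpFraction;  // fractional second, in units of 2^-32 s
    uint32_t rtpTimestamp; // RTP timestamp sampled at the same instant
};

// Offset between the NTP epoch (1900) and the Unix epoch (1970).
constexpr uint64_t NTP_UNIX_OFFSET = 2208988800ULL;

double rtpToUnixTime(uint32_t rtpTs, const SenderReport& sr, double clockRate) {
    double srUnix = double(sr.ntpSeconds - NTP_UNIX_OFFSET)
                  + double(sr.ntpFraction) / 4294967296.0;  // / 2^32
    // Signed 32-bit difference handles RTP timestamp wrap-around.
    int32_t delta = int32_t(rtpTs - sr.rtpTimestamp);
    return srUnix + delta / clockRate;  // video clock rate is typically 90000 Hz
}
```

For video the clock rate is almost always 90000 Hz, so an RTP timestamp 90000 ticks past the Sender Report's anchor maps to exactly one second after the report's wall-clock time.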
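The buffer layout in point 3 can be assembled without Qt or FFmpeg. Below is a stdlib-only sketch: a minimal Base64 decoder plus a function that turns the SDP's comma-separated sprop-parameter-sets string into the Annex-B prefix (real SPS/PPS strings are considerably longer than any toy sample):

```cpp
#include <cassert>
#include <cstdint>
#include <string>
#include <vector>

// Minimal Base64 decoder (standard alphabet, '=' padding), stdlib only.
std::vector<uint8_t> base64Decode(const std::string& in) {
    static const std::string tbl =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    std::vector<uint8_t> out;
    int val = 0, bits = 0;
    for (char c : in) {
        if (c == '=') break;
        size_t pos = tbl.find(c);
        if (pos == std::string::npos) continue;
        val = (val << 6) | int(pos);
        bits += 6;
        if (bits >= 8) { bits -= 8; out.push_back(uint8_t((val >> bits) & 0xFF)); }
    }
    return out;
}

// Build the Annex-B prefix described above: each Base64 parameter set from the
// SDP's comma-separated sprop-parameter-sets, preceded by the 00 00 00 01 start
// code, followed by one final start code for the frame data that comes next.
std::vector<uint8_t> buildAnnexBPrefix(const std::string& spropParameterSets) {
    static const uint8_t startCode[4] = {0x00, 0x00, 0x00, 0x01};
    std::vector<uint8_t> prefix;
    size_t start = 0;
    while (start <= spropParameterSets.size()) {
        size_t comma = spropParameterSets.find(',', start);
        std::string item = spropParameterSets.substr(
            start, comma == std::string::npos ? std::string::npos : comma - start);
        std::vector<uint8_t> ps = base64Decode(item);
        prefix.insert(prefix.end(), startCode, startCode + 4);
        prefix.insert(prefix.end(), ps.begin(), ps.end());
        if (comma == std::string::npos) break;
        start = comma + 1;
    }
    prefix.insert(prefix.end(), startCode, startCode + 4);  // start code for the frame
    return prefix;
}
```

Appending the raw frame bytes after this prefix yields exactly the [start code][SPS][start code][PPS][start code][frame] layout the decoder needs.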
The key code fragments look roughly like this:
```cpp
if (!m_ffmpeg->Init()) return;

m_scheduler = BasicTaskScheduler::createNew();
m_env = BasicUsageEnvironment::createNew(*m_scheduler);

m_session = MediaSession::createNew(*m_env, readSDPFile(m_url.toStdString().c_str()));
if (!m_session)
{
    qDebug() << "Failed to create session from SDP file";
    return;
}

MediaSubsessionIterator iter(*m_session);
while ((m_subsession = iter.next()) != NULL)
{
    if (!m_subsession->initiate(0))
    {
        *m_env << "Failed to initiate the \"" << *m_subsession
               << "\" subsession: " << m_env->getResultMsg() << "\n";
        continue;
    }

    m_subsession->sink = DummySink::createNew(*m_env, *m_subsession, m_url.toStdString().c_str());
    if (m_subsession->sink == NULL)
    {
        *m_env << "Failed to create a data sink for the \"" << *m_subsession
               << "\" subsession: " << m_env->getResultMsg() << "\n";
        continue;
    }

    // For H.264 the SPS/PPS arrive as the comma-separated, Base64-encoded
    // "sprop-parameter-sets" fmtp attribute of the SDP.
    const char* spropParameterSets = m_subsession->fmtp_spropparametersets();
    m_subsession->sink->startPlaying(*m_subsession->rtpSource(), NULL, NULL);
}

char eventLoopWatchVariable = 0;
m_env->taskScheduler().doEventLoop(&eventLoopWatchVariable);


//////////////////// frame callback function in DummySink ////////////////////

void DummySink::afterGettingFrame(unsigned frameSize, unsigned numTruncatedBytes,
                                  struct timeval presentationTime, unsigned durationInMicroseconds)
{
    if (fStreamId != NULL) envir() << "Stream \"" << fStreamId << "\"; ";
    envir() << fSubsession.mediumName() << "/" << fSubsession.codecName()
            << ": Received " << frameSize << " bytes\n";

    QByteArray frameBuffer((char*)fReceiveBuffer, frameSize);

    // Prepend SPS and PPS (each preceded by an Annex-B start code) so the
    // H.264 decoder can decode the frame.
    QByteArray sps = sprop_parameter_sets;  // "base64(SPS),base64(PPS)"
    QByteArray extraData;
    QList<QByteArray> recordList = sps.split(',');
    for (int i = 0; i < recordList.size(); ++i)
    {
        extraData.append(char(0x00));
        extraData.append(char(0x00));
        extraData.append(char(0x00));
        extraData.append(char(0x01));
        extraData += QByteArray::fromBase64(recordList.at(i));
    }

    QByteArray startCode = QByteArray(4, 0);
    startCode[3] = 0x01;

    // Final layout: [SPS][PPS][start code][frame data]
    frameBuffer.insert(0, extraData);
    frameBuffer.insert(extraData.size(), startCode);

    gffmpeg->decodeFrame((uint8_t*)frameBuffer.data(), frameBuffer.size(),
                         presentationTime.tv_sec, presentationTime.tv_usec);

    // Then continue, to request the next frame of data:
    continuePlaying();
}


//////////////////////// decode frame function ////////////////////////
void QFFmpeg::decodeFrame(uint8_t* frameBuffer, int frameLength, long second, long microSecond)
{
    if (frameLength <= 0) return;

    int frameFinished = 0;

    AVPacket framePacket;
    av_init_packet(&framePacket);
    framePacket.size = frameLength;
    framePacket.data = frameBuffer;

    int ret = avcodec_decode_video2(m_pAVCodecContext, m_pAVFrame, &frameFinished, &framePacket);
    if (ret < 0)
    {
        qDebug() << "Decode error";
        return;
    }

    if (frameFinished)
    {
        m_playMutex.lock();

        m_videoWidth = m_pAVFrame->width;
        m_videoHeight = m_pAVFrame->height;

        // Convert YUV420P to RGB32 so Qt can display the frame.
        QImage frame(m_videoWidth, m_videoHeight, QImage::Format_RGB32);
        m_pSwsContext = sws_getCachedContext(m_pSwsContext,
                                             m_videoWidth, m_videoHeight, AV_PIX_FMT_YUV420P,
                                             m_videoWidth, m_videoHeight, AV_PIX_FMT_RGB32,
                                             SWS_BICUBIC, NULL, NULL, NULL);

        uint8_t* dstSlice[] = { frame.bits() };
        int dstStride = frame.width() * 4;
        sws_scale(m_pSwsContext, m_pAVFrame->data, m_pAVFrame->linesize,
                  0, m_videoHeight, dstSlice, &dstStride);

        // Convert the presentation time to "YYYY-MM-DD HH:MM:SS.mmm".
        char timestamp[100];
        char millisecond[50];
        time_t time = second;
        struct tm* now_time = gmtime(&time);
        strftime(timestamp, sizeof(timestamp), "%Y-%m-%d %H:%M:%S", now_time);
        snprintf(millisecond, sizeof(millisecond), ".%03ld", microSecond / 1000);

        // Draw the timestamp onto the frame, then signal the GUI thread.
        QPainter pen(&frame);
        pen.setPen(Qt::white);
        pen.setFont(QFont("Times", 30, QFont::Bold));
        pen.drawText(frame.rect(), Qt::AlignBottom, QString(timestamp) + QString(millisecond));
        emit GetImage(frame);

        m_playMutex.unlock();
    }
}
```
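The timestamp-formatting step inside decodeFrame can be isolated into a small standalone helper (UTC via gmtime; swap in localtime if local time is wanted):

```cpp
#include <cassert>
#include <cstdio>
#include <ctime>
#include <string>

// Format a Unix timestamp (seconds + microseconds, e.g. from presentationTime)
// as "YYYY-MM-DD HH:MM:SS.mmm" in UTC.
std::string formatTimestamp(long seconds, long microseconds) {
    time_t t = seconds;
    struct tm* utc = gmtime(&t);
    char datePart[32];
    strftime(datePart, sizeof(datePart), "%Y-%m-%d %H:%M:%S", utc);
    char msPart[8];
    snprintf(msPart, sizeof(msPart), ".%03ld", microseconds / 1000);
    return std::string(datePart) + msPart;
}
```

The ".%03ld" format zero-pads the milliseconds, so 5600 microseconds renders as ".005" rather than ".5".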
##############################Update 2016-5-13###################
Almost everything above is now obsolete.
I have since found an open-source streaming-media platform:
http://www.easydarwin.org
It provides many mature, well-packaged libraries for exactly this kind of RTSP / IP-camera streaming.
References:
http://blog.csdn.net/sno_guo/article/details/22388233
http://blog.csdn.net/sunnylgz/article/details/7680262
http://stackoverflow.com/questions/11330764/ffmpeg-cant-decode-h264-stream-frame-data/
http://stackoverflow.com/questions/20786781/ffmpeg-decode-raw-buffer-with-avcodec-decode-video2
http://stackoverflow.com/questions/18857737/decoding-h264-frames-from-rtp-stream
http://stackoverflow.com/questions/32475317/how-to-open-the-local-sdp-file-by-live555
http://blog.csdn.net/leixiaohua1020 (highly recommended blog; I learned a lot of codec fundamentals from it)
http://www.zhihu.com/question/20278635 (the differences between RTSP, RTP, and RTCP)
http://www.zhihu.com/question/20997688 (how container formats such as MP4/RMVB/MKV/AVI relate to compression standards such as MPEG-4, H.264, and H.265)
http://www.zhihu.com/question/27460676
http://www.2cto.com/kf/201408/327163.html
http://stackoverflow.com/questions/4873493/how-can-i-convert-number-of-seconds-since-1970-to-datetime-in-c