  • Android Audio/Video Deep Dive, Part 5: Perfect Video Recording (with source code download)

    This article's project is in the repository below; the sample is the one named 录视频 ("record video"). A star would be appreciated.

    https://github.com/979451341/Audio-and-video-learning-materials

    The video recorded with this article's code plays in every player and shows its duration correctly. I will compare it with the previous article's code and explain why the two behave differently, but first let me fill in some of the official MediaCodec documentation I skipped earlier, plus MediaCodec.BufferInfo.

    1. Supplementary notes on MediaCodec


    BUFFER_FLAG_CODEC_CONFIG: indicates that the buffer contains codec initialization / codec-specific data rather than media data.

    BUFFER_FLAG_END_OF_STREAM: signals the end of the stream.

    BUFFER_FLAG_SYNC_FRAME: indicates that the buffer contains a sync frame (key frame).

    INFO_OUTPUT_BUFFERS_CHANGED: the output buffers have changed; from this point on the client must use the new set returned by getOutputBuffers().

    INFO_OUTPUT_FORMAT_CHANGED: the output format has changed; subsequent data will follow the new format.

    INFO_TRY_AGAIN_LATER: the call timed out; returned when dequeueOutputBuffer() is called with a timeout and no output is ready yet.

    dequeueInputBuffer(long timeoutUs): returns the index of an input buffer to be filled with valid data, or -1 if no such buffer is currently available.

    dequeueOutputBuffer(MediaCodec.BufferInfo info, long timeoutUs): dequeues an output buffer, blocking for at most timeoutUs microseconds.

    flush(): flushes both the input and output ports of the component; all buffer indices previously returned by dequeueInputBuffer(long) and dequeueOutputBuffer(MediaCodec.BufferInfo, long) become invalid.

    MediaCodecInfo getCodecInfo(): gets information about the codec.

    getInputBuffers(): call this after start() returns.

    getOutputBuffers(): call this after start() returns, and again whenever dequeueOutputBuffer() signals an output-buffer change by returning INFO_OUTPUT_BUFFERS_CHANGED.

    MediaFormat getOutputFormat(): call this after dequeueOutputBuffer() signals a format change by returning INFO_OUTPUT_FORMAT_CHANGED.

    queueInputBuffer(int index, int offset, int size, long presentationTimeUs, int flags): after filling (a range of) the input buffer at the given index, submits it to the component.


    MediaCodec.BufferInfo: the per-buffer metadata includes an offset and a size that specify the range of valid data within the associated codec buffer. I think of these as the parameters you need in order to write the buffered data out to a file correctly.
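
    To see how these calls fit together, here is a minimal, hedged sketch (not code from either project) of one byte-buffer encode step; the names format, frameData and ptsUs are assumptions for illustration, and in real code the configure/start part runs once while the two dequeue steps run in loops:

        void encodeSketch(MediaFormat format, byte[] frameData, long ptsUs) throws IOException {
            MediaCodec codec = MediaCodec.createEncoderByType("video/avc");
            codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            codec.start();
            ByteBuffer[] inputBuffers = codec.getInputBuffers();
            ByteBuffer[] outputBuffers = codec.getOutputBuffers();

            // feed one raw frame to the encoder
            int inIndex = codec.dequeueInputBuffer(10000);            // wait up to 10 ms
            if (inIndex >= 0) {
                inputBuffers[inIndex].clear();
                inputBuffers[inIndex].put(frameData);
                codec.queueInputBuffer(inIndex, 0, frameData.length, ptsUs, 0);
            }

            // pull encoded data back out
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int outIndex = codec.dequeueOutputBuffer(info, 10000);
            if (outIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                outputBuffers = codec.getOutputBuffers();             // must re-fetch the array
            } else if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                MediaFormat newFormat = codec.getOutputFormat();      // pass this to MediaMuxer#addTrack()
            } else if (outIndex >= 0) {
                ByteBuffer encoded = outputBuffers[outIndex];         // info.offset / info.size mark the valid bytes
                codec.releaseOutputBuffer(outIndex, false);
            }

            codec.stop();
            codec.release();
        }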



    2. Code comparison

    No more preamble; let's go straight to the video-encoding part, starting with the two MediaFormat setups:

            mediaFormat = MediaFormat.createVideoFormat(MIME_TYPE, this.mWidth, this.mHeight);
            mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
            mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
            mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
    
    
    
            final MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, mWidth, mHeight);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);	// API >= 18
            format.setInteger(MediaFormat.KEY_BIT_RATE, calcBitRate());
            format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);

    Both set the same width, height, and video MIME type, and even the frame rate and the other parameters are essentially the same. Only one key differs: MediaFormat.KEY_COLOR_FORMAT. The official description of this key is:
    set by the user for encoders, readable in the output format of decoders.

    In other words, this key configures the encoder: it determines how the encoder is fed its raw data. So the biggest difference between the two projects is how encoding is driven. Let's keep comparing.
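
    By contrast, with COLOR_FormatSurface (API 18 and up) the encoder does not take raw frames through queueInputBuffer() at all. A minimal sketch of that input path, assuming the same MIME_TYPE and the format built above:

        MediaCodec encoder = MediaCodec.createEncoderByType(MIME_TYPE);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface inputSurface = encoder.createInputSurface();   // must be called between configure() and start()
        encoder.start();
        // camera or OpenGL frames are rendered into inputSurface; the app only drains the output side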

     private void encodeFrame(byte[] input) {
            Log.w(TAG, "VideoEncoderThread.encodeFrame()");
    
            // convert the raw NV21 preview data into the semi-planar layout the encoder expects
            NV21toI420SemiPlanar(input, mFrameData, this.mWidth, this.mHeight);
    
            ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
            ByteBuffer[] outputBuffers = mMediaCodec.getOutputBuffers();
    
            int inputBufferIndex = mMediaCodec.dequeueInputBuffer(TIMEOUT_USEC);
            if (inputBufferIndex >= 0) {
                ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
                inputBuffer.clear();
                inputBuffer.put(mFrameData);
                mMediaCodec.queueInputBuffer(inputBufferIndex, 0, mFrameData.length, System.nanoTime() / 1000, 0);
            } else {
                Log.e(TAG, "input buffer not available");
            }
    
        // ...... (remaining output handling omitted)
        }
    
    
    
    
    
    
        protected void encode(final ByteBuffer buffer, final int length, final long presentationTimeUs) {
        	if (!mIsCapturing) return;
            final ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
            while (mIsCapturing) {
    	        final int inputBufferIndex = mMediaCodec.dequeueInputBuffer(TIMEOUT_USEC);
    	        if (inputBufferIndex >= 0) {
    	            final ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
    	            inputBuffer.clear();
    	            if (buffer != null) {
    	            	inputBuffer.put(buffer);
    	            }
    //	            if (DEBUG) Log.v(TAG, "encode:queueInputBuffer");
    	            if (length <= 0) {
    	            	// send EOS
    	            	mIsEOS = true;
    	            	if (DEBUG) Log.i(TAG, "send BUFFER_FLAG_END_OF_STREAM");
    	            	mMediaCodec.queueInputBuffer(inputBufferIndex, 0, 0,
    	            		presentationTimeUs, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
    		            break;
    	            } else {
    	            	mMediaCodec.queueInputBuffer(inputBufferIndex, 0, length,
    	            		presentationTimeUs, 0);
    	            }
    	            break;
    	        } else if (inputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // wait until the MediaCodec encoder is ready to encode
    	        	// nothing to do here because MediaCodec#dequeueInputBuffer(TIMEOUT_USEC)
    	        	// will wait for maximum TIMEOUT_USEC(10msec) on each call
    	        }
            }
        }
    
    

    They look the same; honestly there is hardly any difference. The only real one is that the first encode function starts with a conversion step: it encodes i420bytes, which are produced from the camera's nv21bytes by the function below.

        private static void NV21toI420SemiPlanar(byte[] nv21bytes, byte[] i420bytes, int width, int height) {
            // the Y plane (width * height bytes) is identical in both layouts
            System.arraycopy(nv21bytes, 0, i420bytes, 0, width * height);
            // the chroma plane is interleaved V,U in NV21; swap each pair to get U,V (semi-planar)
            for (int i = width * height; i < nv21bytes.length; i += 2) {
                i420bytes[i] = nv21bytes[i + 1];
                i420bytes[i + 1] = nv21bytes[i];
            }
        }
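
    For context, a hypothetical wiring of this conversion (the camera variable and the buffer below are illustration only, not the project's exact code): camera preview frames arrive as NV21, whose chroma bytes are ordered V,U, while COLOR_FormatYUV420SemiPlanar expects U,V, which is exactly the pair swap the loop above performs.

        final byte[] semiPlanar = new byte[mWidth * mHeight * 3 / 2];   // Y plane plus interleaved chroma
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] nv21, Camera c) {
                NV21toI420SemiPlanar(nv21, semiPlanar, mWidth, mHeight);
                // hand semiPlanar to the encoder via dequeueInputBuffer()/queueInputBuffer()
            }
        });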

    Next, let's look at how each encoded frame is pushed into the muxer. The snippet below watches the encoder's state around each frame, writes the encoded data into the MP4 file, and then releases everything:

    	           	drain();
    	           	// request stop recording
    	           	signalEndOfInputStream();
                   // process output data again for the EOS signal
    	           	drain();
    	           	// release all related objects
    	           	release();
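
    A hedged note on signalEndOfInputStream() (the project's own implementation is not shown in this article): a Surface-input encoder is told directly that no more frames will arrive, while a byte-buffer encoder queues an empty buffer carrying the EOS flag, as encode() above already does. Roughly:

        // Surface input (API >= 18): notify the codec itself
        mMediaCodec.signalEndOfInputStream();

        // Byte-buffer input: queue an empty buffer flagged as end-of-stream instead
        int index = mMediaCodec.dequeueInputBuffer(TIMEOUT_USEC);
        if (index >= 0) {
            mMediaCodec.queueInputBuffer(index, 0, 0, getPTSUs(), MediaCodec.BUFFER_FLAG_END_OF_STREAM);
        }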

    Now let's see what drain() actually does. It starts by grabbing the output buffers with mMediaCodec.getOutputBuffers(), then loops on dequeueOutputBuffer() and checks the returned status. If the call times out, it gives up after a few tries and leaves the loop. If the output buffers changed, it calls getOutputBuffers() again. If the output format changed, it fetches the new MediaFormat from the encoder, adds it as a track to the muxer, and starts the muxer. Any other negative status is unexpected, and since it is unexpected there is nothing to do but log it. Every remaining (non-negative) status is a real output buffer index, and with the help of BufferInfo the encoded data is written to the muxer.

        protected void drain() {
        	if (mMediaCodec == null) return;
            ByteBuffer[] encoderOutputBuffers = mMediaCodec.getOutputBuffers();
            int encoderStatus, count = 0;
            final MediaMuxerWrapper muxer = mWeakMuxer.get();
            if (muxer == null) {
    //        	throw new NullPointerException("muxer is unexpectedly null");
            	Log.w(TAG, "muxer is unexpectedly null");
            	return;
            }
    LOOP:	while (mIsCapturing) {
    			// get encoded data with maximum timeout duration of TIMEOUT_USEC(=10[msec])
                encoderStatus = mMediaCodec.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
                if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                    // wait 5 counts(=TIMEOUT_USEC x 5 = 50msec) until data/EOS come
                    if (!mIsEOS) {
                    	if (++count > 5)
                    		break LOOP;		// out of while
                    }
                } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                	if (DEBUG) Log.v(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
                // this should not happen when encoding
                    encoderOutputBuffers = mMediaCodec.getOutputBuffers();
                } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                	if (DEBUG) Log.v(TAG, "INFO_OUTPUT_FORMAT_CHANGED");
                	// this status indicate the output format of codec is changed
                    // this should come only once before actual encoded data
                	// but this status never come on Android4.3 or less
                	// and in that case, you should treat when MediaCodec.BUFFER_FLAG_CODEC_CONFIG come.
                    if (mMuxerStarted) {	// second time request is error
                        throw new RuntimeException("format changed twice");
                    }
    				// get output format from codec and pass them to muxer
    				// getOutputFormat should be called after INFO_OUTPUT_FORMAT_CHANGED otherwise crash.
                    final MediaFormat format = mMediaCodec.getOutputFormat(); // API >= 16
                   	mTrackIndex = muxer.addTrack(format);
                   	mMuxerStarted = true;
                   	if (!muxer.start()) {
                   		// we should wait until muxer is ready
                   		synchronized (muxer) {
    	               		while (!muxer.isStarted())
    						try {
    							muxer.wait(100);
    						} catch (final InterruptedException e) {
    							break LOOP;
    						}
                   		}
                   	}
                } else if (encoderStatus < 0) {
                	// unexpected status
                	if (DEBUG) Log.w(TAG, "drain:unexpected result from encoder#dequeueOutputBuffer: " + encoderStatus);
                } else {
                    final ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
                    if (encodedData == null) {
                    	// this never should come...may be a MediaCodec internal error
                        throw new RuntimeException("encoderOutputBuffer " + encoderStatus + " was null");
                    }
                    if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    // You should set the output format on the muxer here when you target Android 4.3 or less,
                    // but MediaCodec#getOutputFormat cannot be called here (INFO_OUTPUT_FORMAT_CHANGED hasn't come yet),
                    // therefore you would have to extract and prepare the output format from the buffer data.
                    	// This sample is for API>=18(>=Android 4.3), just ignore this flag here
    					if (DEBUG) Log.d(TAG, "drain:BUFFER_FLAG_CODEC_CONFIG");
    					mBufferInfo.size = 0;
                    }
    
                    if (mBufferInfo.size != 0) {
                    	// encoded data is ready, clear waiting counter
                		count = 0;
                        if (!mMuxerStarted) {
                    // muxer is not ready...this would be a programming failure.
                            throw new RuntimeException("drain:muxer hasn't started");
                        }
                    // write encoded data to muxer (presentationTimeUs needs to be adjusted)
                       	mBufferInfo.presentationTimeUs = getPTSUs();
                       	muxer.writeSampleData(mTrackIndex, encodedData, mBufferInfo);
    					prevOutputPTSUs = mBufferInfo.presentationTimeUs;
                    }
                    // return buffer to encoder
                    mMediaCodec.releaseOutputBuffer(encoderStatus, false);
                    if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    	// when EOS come.
                   		mIsCapturing = false;
                        break;      // out of while
                    }
                }
            }
        }
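
    One more detail: drain() relies on getPTSUs() and prevOutputPTSUs to keep presentation timestamps monotonically increasing, which is part of why the resulting file reports a correct duration. The exact implementation is not shown above; a minimal sketch of the idea, assuming the timestamps come from System.nanoTime():

        // hypothetical sketch: a presentation timestamp in microseconds that never goes backwards,
        // since MediaMuxer#writeSampleData() expects non-decreasing timestamps per track
        protected long getPTSUs() {
            long result = System.nanoTime() / 1000L;
            if (result < prevOutputPTSUs) {
                result = prevOutputPTSUs;   // clamp so we never step behind the last written sample
            }
            return result;
        }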
    

    That's a wrap. There is a lot of code here; try to grasp it at a level of abstraction rather than line by line. The point is to understand the overall process and the details, and to summarize it for yourself.

  • Original article: https://www.cnblogs.com/jianpanwuzhe/p/8409068.html