  • [iOS Video Stream Development: Capturing and Processing Video Frames]

    Framework used for video capture: <AVFoundation/AVFoundation.h>

    Required objects:

    1. AVCaptureDevice (the capture device: front camera, back camera, etc.)

    2. AVCaptureInput (the capture input: usually an AVCaptureDeviceInput wrapping the device)

    3. AVCaptureOutput (the capture output: can produce video data, movie files, still images, etc.)

    4. AVCaptureSession (coordinates the flow of data between inputs and outputs)
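
    Before any of these objects will deliver frames, the app must have camera permission. This step is not in the original post; a minimal sketch using the standard AVFoundation authorization API (the handler bodies are illustrative placeholders):

```objc
// Request camera access before configuring the session.
// On iOS 10+ the Info.plist must also contain NSCameraUsageDescription.
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                         completionHandler:^(BOOL granted) {
    if (granted) {
        // Safe to configure the capture session here
        // (dispatch back to the main queue if setup happens there).
    } else {
        // Camera access denied; inform the user.
    }
}];
```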

    关键代码:

    - (void)setupCamera
    {
        NSError *error = nil;
        
        // Create the session
        _session = [[AVCaptureSession alloc] init];
        
        // Configure the session to produce lower resolution video frames, if your
        // processing algorithm can cope. We'll specify medium quality for the
        // chosen device.
        _session.sessionPreset = AVCaptureSessionPresetMedium;
        
        // Find a suitable AVCaptureDevice
        AVCaptureDevice *device = [AVCaptureDevice
                                   defaultDeviceWithMediaType:AVMediaTypeVideo];
        
        // Create a device input with the device and add it to the session.
        _input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                       error:&error];
        if (!_input) {
            // Handle the error appropriately; without an input the
            // session will produce no frames.
            NSLog(@"Could not create device input: %@", error);
            return;
        }
        [_session addInput:_input];
        
        // Create a VideoDataOutput and add it to the session
        _output = [[AVCaptureVideoDataOutput alloc] init];
        [_session addOutput:_output];
        
        // Configure your output.
        dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
        [_output setSampleBufferDelegate:self queue:queue];
        
        // Specify the pixel format
        _output.videoSettings =
        [NSDictionary dictionaryWithObject:
         [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        
        
        // If you wish to cap the frame rate to a known value, such as 15 fps, set
        // minFrameDuration. (Note: this property was deprecated in iOS 5; on
        // newer systems configure the frame rate on the device instead.)
        _output.minFrameDuration = CMTimeMake(1, 15);
        
        // Start the session running to start the flow of data
        [_session startRunning];
        
        // Assign session to an ivar.
        [self setSession:_session];
    }
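    
    Since `minFrameDuration` on AVCaptureVideoDataOutput is deprecated, a sketch of the replacement on iOS 7 and later, assuming `device` is the AVCaptureDevice obtained in setupCamera:

```objc
// Cap the frame rate to 15 fps by configuring the capture device itself
// (iOS 7+). The device must be locked before changing its configuration.
NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    device.activeVideoMinFrameDuration = CMTimeMake(1, 15);
    device.activeVideoMaxFrameDuration = CMTimeMake(1, 15);
    [device unlockForConfiguration];
} else {
    NSLog(@"Could not lock device for configuration: %@", error);
}
```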
    
    
    // Delegate routine that is called when a sample buffer was written
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Create a UIImage from the sample buffer data.
        // Note: this delegate is called on the capture queue, not the main queue.
        UIImage *img = [self imageFromSampleBuffer:sampleBuffer];

        // UIKit may only be touched on the main queue; uncomment to display the frame:
        /*
        dispatch_async(dispatch_get_main_queue(), ^{
            self.catchview.image = img;
        });
        */
        
    }
    // Create a UIImage from sample buffer data
    - (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
    {
        
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        
        // Get the number of bytes per row for the pixel buffer
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
        
        
        // Copy the pixel data out of the pixel buffer
        u_int8_t *baseAddress = (u_int8_t *)malloc(bytesPerRow * height);
        memcpy(baseAddress, CVPixelBufferGetBaseAddress(imageBuffer), bytesPerRow * height);
        
        // size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);
        
        // Create a device-dependent RGB color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        
        // Create a bitmap graphics context with the sample buffer data
        
        //The context draws into a bitmap which is `width'
        //  pixels wide and `height' pixels high. The number of components for each
        //      pixel is specified by `space'
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                     bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst);
        
        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer,0);
        
        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        
        // Create an image object from the Quartz image
        UIImage *image = [UIImage imageWithCGImage:quartzImage scale:1.0 orientation:UIImageOrientationRight];
        free(baseAddress);
        // Release the Quartz image
        CGImageRelease(quartzImage);
        return image;
    }
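    
    The manual malloc/CGBitmapContext route above is tied to the kCVPixelFormatType_32BGRA format requested in setupCamera. A shorter alternative (a sketch, not from the original post) uses Core Image, which also copes with other pixel formats; the method name here is hypothetical:

```objc
// Alternative conversion via Core Image (requires <CoreImage/CoreImage.h>).
- (UIImage *)imageFromSampleBufferUsingCoreImage:(CMSampleBufferRef)sampleBuffer
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // In real code, create the CIContext once and reuse it; it is
    // expensive to construct per frame.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:[ciImage extent]];

    UIImage *image = [UIImage imageWithCGImage:cgImage
                                         scale:1.0
                                   orientation:UIImageOrientationRight];
    CGImageRelease(cgImage);
    return image;
}
```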
  • Original article: https://www.cnblogs.com/rayshen/p/4399363.html