  • iOS Development: How to Implement a Camera App [Reposted from Official Documentation]

    This brief code example illustrates how you can capture video and convert the frames you get into UIImage objects. It shows you how to create and configure a capture session, create and configure a capture device and device input, create and configure a video data output, implement the sample buffer delegate method, and start and stop recording.

    Note: To focus on the most relevant code, this example omits several aspects of a complete application, including memory management. To use AV Foundation, you are expected to have enough experience with Cocoa to be able to infer the missing pieces.

     

    Create and Configure a Capture Session

    You use an AVCaptureSession object to coordinate the flow of data from an AV input device to an output. Create a session, and configure it to produce medium resolution video frames.

     

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    Create and Configure the Device and Device Input

    Capture devices are represented by AVCaptureDevice objects; the class provides methods to retrieve an object for the input type you want. A device has one or more ports, configured using an AVCaptureInput object. Typically, you use the capture input in its default configuration.

    Find a video capture device, then create a device input with the device and add it to the session.

     

    AVCaptureDevice *device =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
    }

    [session addInput:input];

    Create and Configure the Data Output

    You use an AVCaptureVideoDataOutput object to process uncompressed frames from the video being captured. You typically configure several aspects of an output. For video, for example, you can specify the pixel format using the videoSettings property, and cap the frame rate by setting the minFrameDuration property.

    Create and configure an output for video data and add it to the session; cap the frame rate to 15 fps by setting the minFrameDuration property to 1/15 second:

     

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];
    output.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    output.minFrameDuration = CMTimeMake(1, 15);

    The data output object uses delegation to vend the video frames. The delegate must adopt the AVCaptureVideoDataOutputSampleBufferDelegate protocol. When you set the data output’s delegate, you must also provide a queue on which callbacks should be invoked.

     

    dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    You use the queue to modify the priority given to delivering and processing the video frames.
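
    For example (a minimal sketch, not part of the original article), you could raise the priority of frame delivery by targeting the callback queue at a high-priority global queue:

    dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
    // Route the serial queue's work onto the high-priority global queue so
    // frame delivery and processing run ahead of default-priority work.
    dispatch_set_target_queue(queue, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0));
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);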

    Implement the Sample Buffer Delegate Method

    In the delegate class, implement the method (captureOutput:didOutputSampleBuffer:fromConnection:) that is called when a sample buffer is written. The video data output object delivers frames as CMSampleBuffers, so you need to convert from the CMSampleBuffer to a UIImage object. The function for this operation is shown in “Converting a CMSampleBuffer to a UIImage.”

     

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        UIImage *image = imageFromSampleBuffer(sampleBuffer);
        // Add your code here that uses the image.
    }

    Remember that the delegate method is invoked on the queue you specified in setSampleBufferDelegate:queue:; if you want to update the user interface, you must invoke any relevant code on the main thread.
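
    For example, inside the delegate method above you might hand the converted image back to the main queue before touching any UIKit object (a minimal sketch; previewImageView is a hypothetical UIImageView property, not part of the original example):

    dispatch_async(dispatch_get_main_queue(), ^{
        // previewImageView is a hypothetical UIImageView outlet, used here for illustration.
        self.previewImageView.image = image;
    });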

    Starting and Stopping Recording

    After configuring the capture session, you send it a startRunning message to start the recording.

     

    [session startRunning];

    To stop recording, you send the session a stopRunning message.
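
    For example, mirroring the startRunning call above:

    [session stopRunning];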

     

     

    Demo Code

    One issue came up while running this demo: in -captureOutput:didOutputSampleBuffer:fromConnection:, after grabbing a frame and converting it to a UIImage, the image would never display when the UIImage itself was passed out of the callback. After some investigation, converting the frame to NSData and passing that out instead made everything work, so this is worth pointing out. (The demo code below passes JPEG data to the main thread and rebuilds the UIImage there.)

     

    // Create and configure a capture session and start it running
    - (void)setupCaptureSession
    {
        NSError *error = nil;

        // Create the session
        AVCaptureSession *session = [[AVCaptureSession alloc] init];

        // Configure the session to produce lower resolution video frames, if your
        // processing algorithm can cope. Here we use the low-quality preset.
        session.sessionPreset = AVCaptureSessionPresetLow;

        // Find a suitable AVCaptureDevice
        AVCaptureDevice *device = [AVCaptureDevice
                                   defaultDeviceWithMediaType:AVMediaTypeVideo];

        // Create a device input with the device and add it to the session.
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                            error:&error];
        if (!input) {
            // Handle the error appropriately.
        }
        [session addInput:input];

        // Create a VideoDataOutput and add it to the session
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        [session addOutput:output];

        // Configure your output.
        dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
        [output setSampleBufferDelegate:self queue:queue];
        // Not needed under ARC:
        // dispatch_release(queue);

        // Specify the pixel format
        output.videoSettings =
            [NSDictionary dictionaryWithObject:
                 [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                        forKey:(id)kCVPixelBufferPixelFormatTypeKey];

        // Add a preview layer so the camera feed is visible on screen
        AVCaptureVideoPreviewLayer *previewLayer = nil;
        previewLayer = [[[AVCaptureVideoPreviewLayer alloc] initWithSession:session] autorelease];
        [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
        CGRect layerRect = [[[self view] layer] bounds];
        [previewLayer setBounds:layerRect];
        [previewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect))];
        [[[self view] layer] addSublayer:previewLayer];

        // If you wish to cap the frame rate to a known value, such as 15 fps, set
        // minFrameDuration.
        // output.minFrameDuration = CMTimeMake(1, 15);

        // Start the session running to start the flow of data
        [session startRunning];

        sessionGlobal = session;
        // Assign session to an ivar.
        // [self setSession:session];
        isCapture = FALSE;

        UIView *v = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 300, 300)];
        v.backgroundColor = [UIColor blueColor];
        v.layer.masksToBounds = YES;
        v1 = [v retain];
        [self.view addSubview:v];
        // [v release];

        start = [[NSDate date] timeIntervalSince1970];
        before = start;
        num = 0;
    }
    
     
    
    - (NSTimeInterval)getTimeFromStart
    {
        NSDate *dat = [NSDate dateWithTimeIntervalSinceNow:0];
        NSTimeInterval now = [dat timeIntervalSince1970];

        NSTimeInterval b = now - start;
        return b;
    }
    
     
    
    - (void)showImage:(NSData *)topImageData
    {
        if (num > 5)
        {
            [sessionGlobal stopRunning];
            return;
        }
        num++;

        NSString *numStr = [NSString stringWithFormat:@"%d.jpg", num];
        NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:numStr];
        NSLog(@"PATH : %@", path);
        [topImageData writeToFile:path atomically:YES];

        UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
        imageView.layer.masksToBounds = YES;
        imageView.backgroundColor = [UIColor redColor];
        UIImage *img = [[UIImage alloc] initWithData:topImageData];
        imageView.image = img;
        [img release];
        [self.view addSubview:imageView];
        [imageView release];
        [self.view setNeedsDisplay];
        // [v1 setNeedsDisplay];
    }
    
     
    
    // Delegate routine that is called when a sample buffer was written
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        NSDate *dat = [NSDate dateWithTimeIntervalSinceNow:0];
        NSTimeInterval now = [dat timeIntervalSince1970];
        NSLog(@" before: %f  num: %f", before, now - before);

        if ((now - before) > 5)
        {
            before = [[NSDate date] timeIntervalSince1970];

            // Create a UIImage from the sample buffer data
            UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
            if (image != nil)
            {
                // NSTimeInterval t = [self getTimeFromStart];
                NSData *topImageData = UIImageJPEGRepresentation(image, 1.0);
                [self performSelectorOnMainThread:@selector(showImage:)
                                       withObject:topImageData
                                    waitUntilDone:NO];
            }
        }
    }
    
     
    
    // Create a UIImage from sample buffer data
    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // Get the number of bytes per row for the pixel buffer
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Create a device-dependent RGB color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        if (!colorSpace)
        {
            NSLog(@"CGColorSpaceCreateDeviceRGB failure");
            // Unlock the pixel buffer before bailing out
            CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
            return nil;
        }

        // Get the base address of the pixel buffer
        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
        // Get the data size for contiguous planes of the pixel buffer.
        size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);

        // Create a Quartz direct-access data provider that uses data we supply
        CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, baseAddress,
                                                                  bufferSize, NULL);
        // Create a bitmap image from data supplied by our data provider
        CGImageRef cgImage =
            CGImageCreate(width,
                          height,
                          8,
                          32,
                          bytesPerRow,
                          colorSpace,
                          kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                          provider,
                          NULL,
                          true,
                          kCGRenderingIntentDefault);
        CGDataProviderRelease(provider);
        CGColorSpaceRelease(colorSpace);

        // Create and return an image object representing the specified Quartz image
        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);

        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        return image;
    }

     

     

    Stopping Capture

    - (void)stopVideoCapture:(id)arg
    {
        // Stop camera capture
        if (self->avCaptureSession) {
            [self->avCaptureSession stopRunning];
            self->avCaptureSession = nil;
            [labelState setText:@"Video capture stopped"];
        }
    }
  • Original article: https://www.cnblogs.com/mantgh/p/4315606.html