Android Camera Data Flow Analysis

    This article focuses on analyzing the Camera data flow. The Camera is generally used for image preview, still-photo capture, and video recording. We first analyze the data flow for image preview and photo capture; the video-telephony part will be analyzed later.


    1. Supplementary notes on how the HAL layer handles camera data


    In Linux, V4L2 is used as the camera driver framework. V4L2 is controlled from user space through various ioctl calls, and mmap can be used to map driver memory into user space.

    Commonly used ioctl commands:
    The ioctl command handlers are listed below (a typical user-space capture sequence using these commands is sketched after the list):
     .vidioc_querycap  = vidioc_querycap,                  // query the driver's capabilities
     .vidioc_enum_fmt_vid_cap = vidioc_enum_fmt_vid_cap,   // enumerate the video formats supported by the driver
     .vidioc_g_fmt_vid_cap  = vidioc_g_fmt_vid_cap,        // get the driver's current video capture format
     .vidioc_s_fmt_vid_cap  = vidioc_s_fmt_vid_cap,        // set the driver's current video capture format
     .vidioc_try_fmt_vid_cap  = vidioc_try_fmt_vid_cap,    // validate a capture format without changing it
     .vidioc_reqbufs   = vidioc_reqbufs,                   // allocate buffers
     .vidioc_querybuf  = vidioc_querybuf,                  // query the buffers allocated by VIDIOC_REQBUFS (offset/length, so they can be mmap'ed)
     .vidioc_qbuf   = vidioc_qbuf,                         // queue an empty buffer back to the driver
     .vidioc_dqbuf   = vidioc_dqbuf,                       // dequeue a filled buffer from the driver
     .vidioc_streamon  = vidioc_streamon,                  // start capture (streaming)
     .vidioc_streamoff  = vidioc_streamoff,                // stop capture (streaming)
     .vidioc_cropcap   = vidioc_cropcap,                   // query the driver's cropping capabilities
     .vidioc_g_crop   = vidioc_g_crop,                     // get the current crop rectangle of the video signal
     .vidioc_s_crop   = vidioc_s_crop,                     // set the crop rectangle of the video signal
     .vidioc_querystd  = vidioc_querystd,                  // query the video standards the device supports, e.g. PAL or NTSC
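
    To make the ioctl list above concrete, here is a minimal user-space V4L2 capture sketch. It is illustrative only: error handling is omitted, and the device node /dev/video0, the 640x480 NV21 format, and the four-buffer count are assumptions for illustration, not values taken from any particular driver.

    // Minimal V4L2 capture sequence (sketch; error handling omitted).
    #include <fcntl.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <unistd.h>
    #include <linux/videodev2.h>

    int main() {
        int fd = open("/dev/video0", O_RDWR);              // assumed device node

        v4l2_capability cap = {};
        ioctl(fd, VIDIOC_QUERYCAP, &cap);                  // query driver capabilities

        v4l2_format fmt = {};
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = 640;                           // assumed size
        fmt.fmt.pix.height = 480;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_NV21;       // YUV 4:2:0, as typically used for preview
        ioctl(fd, VIDIOC_S_FMT, &fmt);                     // set the capture format

        v4l2_requestbuffers req = {};
        req.count = 4;
        req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        req.memory = V4L2_MEMORY_MMAP;
        ioctl(fd, VIDIOC_REQBUFS, &req);                   // allocate buffers in the driver

        void* mapped[4];
        for (unsigned i = 0; i < req.count && i < 4; i++) {
            v4l2_buffer buf = {};
            buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            buf.memory = V4L2_MEMORY_MMAP;
            buf.index = i;
            ioctl(fd, VIDIOC_QUERYBUF, &buf);              // get the buffer's offset/length
            mapped[i] = mmap(NULL, buf.length, PROT_READ | PROT_WRITE,
                             MAP_SHARED, fd, buf.m.offset); // map it into user space
            ioctl(fd, VIDIOC_QBUF, &buf);                  // queue the empty buffer
        }

        int type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        ioctl(fd, VIDIOC_STREAMON, &type);                 // start capture

        v4l2_buffer frame = {};
        frame.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        frame.memory = V4L2_MEMORY_MMAP;
        ioctl(fd, VIDIOC_DQBUF, &frame);                   // dequeue one filled frame
        // mapped[frame.index] now holds frame.bytesused bytes of NV21 data
        ioctl(fd, VIDIOC_QBUF, &frame);                    // hand the buffer back

        ioctl(fd, VIDIOC_STREAMOFF, &type);                // stop capture
        close(fd);
        return 0;
    }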


    At initialization time the basic camera parameters are set, and then the mmap system call is used to map the camera driver's buffer queue into user space.

    There are two main threads:

    pictureThread (picture-taking thread)
    When the user takes a picture, the picture thread is invoked (it runs once rather than looping). It checks the queue for frame data and takes the frame out of the queue. Picture data always has to be delivered to the Java layer, so the data can be converted to JPEG before being passed up, or it can be converted to RGB and passed up to the Java layer.


    previewThread (preview thread)
    The preview thread is started when the preview method is called. It loops, checking whether frame data is present in the queue; if a frame is available it reads it. Because the data read is in YUV format, the YUV data has to be converted to RGB before being handed to the display framework; the converted data can also be fed to a video-encoding module, and once encoded and stored this becomes the video-recording feature.
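
    As an illustration of the YUV-to-RGB step mentioned above, here is a minimal sketch that converts one NV21 frame (YCrCb 4:2:0 semi-planar, the format registered for the preview heap later in this article) to RGB888 using a standard integer approximation. It is not the framework's actual conversion routine; the function name and coefficients are chosen for illustration.

    #include <stdint.h>

    static inline uint8_t clamp255(int v) {
        return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v);
    }

    // yuv: width*height luma bytes followed by an interleaved V/U plane (NV21).
    // rgb: width*height*3 output bytes (R, G, B per pixel).
    void nv21ToRgb888(const uint8_t* yuv, uint8_t* rgb, int width, int height) {
        const uint8_t* vu = yuv + width * height;              // interleaved V,U plane
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int Y = yuv[y * width + x];
                int V = vu[(y / 2) * width + (x & ~1)] - 128;
                int U = vu[(y / 2) * width + (x & ~1) + 1] - 128;
                int i = (y * width + x) * 3;
                rgb[i + 0] = clamp255(Y + ((359 * V) >> 8));           // R = Y + 1.402*V
                rgb[i + 1] = clamp255(Y - ((88 * U + 183 * V) >> 8));  // G = Y - 0.344*U - 0.714*V
                rgb[i + 2] = clamp255(Y + ((454 * U) >> 8));           // B = Y + 1.772*U
            }
        }
    }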

    All data passed up to the upper layers goes through dataCallback, unless overlay is implemented.

    2. Data flow control

    The previous section covered the control hierarchy and logic. To better understand where the data goes, and to be able to optimize it later, it is well worth understanding the data path.

    Take storing the data as JPEG as an example.
    Registering the callbacks:
    public final void takePicture(ShutterCallback shutter, PictureCallback raw,
            PictureCallback postview, PictureCallback jpeg) {
        mShutterCallback = shutter;
        mRawImageCallback = raw;
        mPostviewCallback = postview;
        mJpegCallback = jpeg;
        native_takePicture();
    }

    Handling the callback data:
    @Override
    public void handleMessage(Message msg) {
        switch (msg.what) {
        case CAMERA_MSG_SHUTTER:           // notification that data has arrived
        case CAMERA_MSG_RAW_IMAGE:         // handles the uncompressed picture
        case CAMERA_MSG_COMPRESSED_IMAGE:  // handles the compressed (JPEG) picture
            if (mJpegCallback != null) {
                mJpegCallback.onPictureTaken((byte[]) msg.obj, mCamera);
            }
            return;
        case CAMERA_MSG_PREVIEW_FRAME:     // handles preview frame data
            ...
        }
    }


    The application registers the callbacks:
    android.hardware.Camera mCameraDevice;  // the Java-layer Camera object
    mCameraDevice.takePicture(mShutterCallback, mRawPictureCallback,
            mPostViewPictureCallback, new JpegPictureCallback(loc));


    How the application receives the data:
    private final class JpegPictureCallback implements PictureCallback {
        public void onPictureTaken(
                final byte[] jpegData, final android.hardware.Camera camera) {
            ...
            mImageCapture.storeImage(jpegData, camera, mLocation);
            ...
        }
    }


    private class ImageCapture {
        private int storeImage(byte[] data, Location loc) {
            ImageManager.addImage(
                    mContentResolver,
                    title,
                    dateTaken,
                    loc, // location from gps/network
                    ImageManager.CAMERA_IMAGE_BUCKET_NAME, filename,
                    null, data,
                    degree);
        }
    }


    --> So this is where the data actually gets stored. In Android there are four common mechanisms for storing shared data:
    ContentProvider, SharedPreferences, file, and SQLite; here the file mechanism is used.

    //
    // Stores a bitmap or a jpeg byte array to a file (using the specified
    // directory and filename). Also add an entry to the media store for
    // this picture. The title, dateTaken, location are attributes for the
    // picture. The degree is a one element array which returns the orientation
    // of the picture.
    //
    public static Uri addImage(ContentResolver cr, String title, long dateTaken,
            Location location, String directory, String filename,
            Bitmap source, byte[] jpegData, int[] degree) {
            ...
            File file = new File(directory, filename);
            outputStream = new FileOutputStream(file);
            if (source != null) {
                source.compress(CompressFormat.JPEG, 75, outputStream);
                degree[0] = 0;
            } else {
                outputStream.write(jpegData);
                degree[0] = getExifOrientation(filePath);
            }
            ...
    }


    holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    SURFACE_TYPE_PUSH_BUFFERS indicates that the Surface holds no data of its own; the data the Surface uses is supplied by another object. This type of Surface is used for camera preview: the Camera is responsible for supplying the data to the preview Surface, which keeps preview rendering smooth.


    OK, now that we have covered the Java-layer callback flow, let's look at the data flow across the Java, JNI, and C++ layers.


    It is easiest to analyze this from the bottom layer upward:

    1. The callback types provided by CameraHardwareInterface:

    typedef void (*notify_callback)(int32_t msgType,
                                    int32_t ext1,
                                    int32_t ext2,
                                    void* user);


    typedef void (*data_callback)(int32_t msgType,
                                  const sp<IMemory>& dataPtr,
                                  void* user);


    typedef void (*data_callback_timestamp)(nsecs_t timestamp,
                                            int32_t msgType,
                                            const sp<IMemory>& dataPtr,
                                            void* user);


    The interface is as follows:
    /** Set the notification and data callbacks */
    virtual void setCallbacks(notify_callback notify_cb,
                              data_callback data_cb,
                              data_callback_timestamp data_cb_timestamp,
                              void* user) = 0;
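
    As a hedged illustration of how these callbacks are wired up on the HAL side (the class name CameraHardwareStub and the member names below are assumptions for illustration, not code from any specific HAL): the implementation typically just stores the three function pointers plus the user cookie in setCallbacks, and later invokes them from its preview/picture threads.

    // Hypothetical sketch of a CameraHardwareInterface implementation; the
    // class, members, and deliverJpeg helper are assumptions for illustration.
    void CameraHardwareStub::setCallbacks(notify_callback notify_cb,
                                          data_callback data_cb,
                                          data_callback_timestamp data_cb_timestamp,
                                          void* user)
    {
        mNotifyCb = notify_cb;
        mDataCb = data_cb;
        mDataCbTimestamp = data_cb_timestamp;
        mCallbackCookie = user;               // passed back on every callback
    }

    // Called from the picture thread once a JPEG frame is ready (sketch only).
    void CameraHardwareStub::deliverJpeg(const sp<IMemory>& jpegFrame)
    {
        if (mNotifyCb) mNotifyCb(CAMERA_MSG_SHUTTER, 0, 0, mCallbackCookie);
        if (mDataCb)   mDataCb(CAMERA_MSG_COMPRESSED_IMAGE, jpegFrame, mCallbackCookie);
    }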

    2. The CameraService functions that handle messages from the HAL:
    void CameraService::Client::dataCallback(int32_t msgType, const sp<IMemory>& dataPtr, void* user)
    {
     //...
    switch (msgType) { ------------------------------------ 1: a message is received from the HAL
            case CAMERA_MSG_PREVIEW_FRAME:
                client->handlePreviewData(dataPtr);
                break;
            case CAMERA_MSG_POSTVIEW_FRAME:
                client->handlePostview(dataPtr);
                break;
            case CAMERA_MSG_RAW_IMAGE:
                client->handleRawPicture(dataPtr);
                break;
            case CAMERA_MSG_COMPRESSED_IMAGE:
                client->handleCompressedPicture(dataPtr); ---------  2: handle the compressed-picture (JPEG) message
                --> c->dataCallback(CAMERA_MSG_COMPRESSED_IMAGE, mem); -------- 3: which in turn invokes the callback below
                break;
            default:
                if (c != NULL) {
                    c->dataCallback(msgType, dataPtr);
                }
                break;
        }
    //...
    }
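
    Putting the steps marked 2 and 3 above together, handleCompressedPicture essentially stops further JPEG messages (a capture is a one-shot operation) and forwards the buffer to the binder client. The following is a simplified sketch of that flow, not the verbatim framework source.

    // Simplified sketch of handleCompressedPicture, following steps 2 and 3 above.
    void CameraService::Client::handleCompressedPicture(const sp<IMemory>& mem)
    {
        // taking a picture is one-shot, so stop further compressed-image messages
        disableMsgType(CAMERA_MSG_COMPRESSED_IMAGE);

        sp<ICameraClient> c = mCameraClient;
        if (c != NULL) {
            // step 3: forward the JPEG buffer to the client side (Camera::dataCallback)
            c->dataCallback(CAMERA_MSG_COMPRESSED_IMAGE, mem);
        }
    }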

    void CameraService::Client::notifyCallback(int32_t msgType, int32_t ext1, int32_t ext2, void* user)
    {
        LOGV("notifyCallback(%d)", msgType);


        sp<Client> client = getClientFromCookie(user);
        if (client == 0) {
            return;
        }

        switch (msgType) {
            case CAMERA_MSG_SHUTTER:
                // ext1 is the dimension of the yuv picture.
                client->handleShutter((image_rect_type *)ext1);
                break;
            default:
                sp<ICameraClient> c = client->mCameraClient;
                if (c != NULL) {
                    c->notifyCallback(msgType, ext1, ext2); -------------- 4: callback message (service side)
                }
                break;
        }
    }


    3. Handling on the client side (Camera class):
    // callback from camera service when frame or image is ready ------------- data callback handling
    void Camera::dataCallback(int32_t msgType, const sp<IMemory>& dataPtr)
    {
        sp<CameraListener> listener;
        {
            Mutex::Autolock _l(mLock);
            listener = mListener;
        }
        if (listener != NULL) {
            listener->postData(msgType, dataPtr);
        }
    }


    // callback from camera service  ------------------- notification callback handling
    void Camera::notifyCallback(int32_t msgType, int32_t ext1, int32_t ext2)
    {
        sp<CameraListener> listener;
        {
            Mutex::Autolock _l(mLock);
            listener = mListener;
        }
        if (listener != NULL) {
            listener->notify(msgType, ext1, ext2);
        }
    }


    4. JNI layer: android_hardware_Camera.cpp
    // provides persistent context for calls from native code to Java
    class JNICameraContext: public CameraListener
    {
    ...
        virtual void notify(int32_t msgType, int32_t ext1, int32_t ext2);
        virtual void postData(int32_t msgType, const sp<IMemory>& dataPtr);
        ...
    }


    The data is passed from the JNI layer to the Java layer via the copyAndPost function:
    void JNICameraContext::postData(int32_t msgType, const sp<IMemory>& dataPtr)
    {
        // return data based on callback type
        switch(msgType) {
        case CAMERA_MSG_VIDEO_FRAME:
            // should never happen
            break;
        // don't return raw data to Java
        case CAMERA_MSG_RAW_IMAGE:
            LOGV("rawCallback");
            env->CallStaticVoidMethod(mCameraJClass, fields.post_event,
                    mCameraJObjectWeak, msgType, 0, 0, NULL);
            break;
        default:
            // TODO: Change to LOGV
            LOGV("dataCallback(%d, %p)", msgType, dataPtr.get());
            copyAndPost(env, dataPtr, msgType);
            break;
        }
    }


    The main data operation: here IMemory is used to hand the data across.
    void JNICameraContext::copyAndPost(JNIEnv* env, const sp<IMemory>& dataPtr, int msgType)
    {
        // allocate Java byte array and copy data
        if (dataPtr != NULL) {
            sp<IMemoryHeap> heap = dataPtr->getMemory(&offset, &size);
            uint8_t *heapBase = (uint8_t*)heap->base();
            // case where the buffer is managed by the application
            const jbyte* data = reinterpret_cast<const jbyte*>(heapBase + offset);
            obj = env->NewByteArray(size);
            env->SetByteArrayRegion(obj, 0, size, data);
        }
       
        // post image data to Java
        env->CallStaticVoidMethod(mCameraJClass, fields.post_event,
                mCameraJObjectWeak, msgType, 0, 0, obj);
    }


    Note that here C++ calls back into a Java method:
    fields.post_event = env->GetStaticMethodID(clazz, "postEventFromNative",
                                               "(Ljava/lang/Object;IIILjava/lang/Object;)V");


    This is defined in Camera.java:
    private static void postEventFromNative(Object camera_ref,
                                            int what, int arg1, int arg2, Object obj)
    {
        Camera c = (Camera)((WeakReference)camera_ref).get();
        if (c == null)
            return;


        if (c.mEventHandler != null) {
            Message m = c.mEventHandler.obtainMessage(what, arg1, arg2, obj);
            c.mEventHandler.sendMessage(m); // the data is then handled by the callbacks registered by the application via takePicture
        }
    }
    Because the video data stream is large, it is usually not sent up to the Java layer for processing; it is handled natively simply by setting the output Surface.


    OK, with the analysis above the data path has been traced end to end. Let's illustrate it with an example:


    Preview: it starts with startPreview and ends when stopPreview is called.
    startPreview() --> startCameraMode() --> startPreviewMode()


    status_t CameraService::Client::startPreviewMode()
    {
    ...
        if (mUseOverlay) {
            // If preview display has been set, set overlay now.
            if (mSurface != 0) {
                ret = setOverlay(); // --> createOverlay/setOverlay operations
            }
            ret = mHardware->startPreview();
        } else {
            ret = mHardware->startPreview();
            // If preview display has been set, register preview buffers now.
            if (mSurface != 0) {
                // Unregister here because the surface registered with raw heap.
                mSurface->unregisterBuffers();
                ret = registerPreviewBuffers();
            }
        }
        ...
    }


    Whether an Overlay is used is determined by querying CameraHardwareInterface::useOverlay. When an Overlay is used,
    the data flow is handled inside the camera hardware abstraction layer; all that is needed is to hand the Overlay device to it via setOverlay.
    --> mHardware->setOverlay(new Overlay(mOverlayRef));


    Without an Overlay, the preview data has to be obtained from the camera hardware, and ISurface's registerBuffers is called
    to register the memory with the output ISurface; SurfaceFlinger then composites and displays the frames.
    -->
        // don't use a hardcoded format here
        ISurface::BufferHeap buffers(w, h, w, h,
                                     HAL_PIXEL_FORMAT_YCrCb_420_SP,
                                     mOrientation,
                                     0,
                                     mHardware->getPreviewHeap());


        status_t ret = mSurface->registerBuffers(buffers);

    Data callback handling flow:
    a. Register the dataCallback
    mHardware->setCallbacks(notifyCallback,
                            dataCallback,
                            dataCallbackTimestamp,
                            mCameraService.get());


    b. Handle the message
    void CameraService::Client::dataCallback(int32_t msgType, const sp<IMemory>& dataPtr, void* user)
    {
        ...
        switch (msgType) {
            case CAMERA_MSG_PREVIEW_FRAME:
                client->handlePreviewData(dataPtr);
                break;
        ...
        }
    }

    c. Output the data
    // preview callback - frame buffer update
    void CameraService::Client::handlePreviewData(const sp<IMemory>& mem)
    {
    ...
        // call ISurface::postBuffer to output the video data
        if (mSurface != NULL) {
            mSurface->postBuffer(offset);
        }

        // invoke the ICameraClient callback to pass the video data up to the upper layers
        // Is the received frame copied out or not?
        if (flags & FRAME_CALLBACK_FLAG_COPY_OUT_MASK) {
            LOGV("frame is copied");
            copyFrameAndPostCopiedFrame(c, heap, offset, size);
        } else {
            LOGV("frame is forwarded");
            c->dataCallback(CAMERA_MSG_PREVIEW_FRAME, mem);
        }
    ...
    }
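
    For the copy-out branch, copyFrameAndPostCopiedFrame copies the frame out of the preview heap into a separate heap before invoking the client callback, so the original preview buffer can be reused immediately. The following is a hedged, simplified sketch of that idea (the signature and allocation strategy are simplified; this is not the verbatim framework source).

    // Simplified sketch of the copy-out path: copy the frame out of the
    // preview heap into a fresh heap, then post it to the client.
    void CameraService::Client::copyFrameAndPostCopiedFrame(
            const sp<ICameraClient>& client, const sp<IMemoryHeap>& heap,
            size_t offset, size_t size)
    {
        sp<MemoryHeapBase> copyHeap = new MemoryHeapBase(size);
        memcpy(copyHeap->base(), (uint8_t*)heap->base() + offset, size);

        sp<MemoryBase> frame = new MemoryBase(copyHeap, 0, size);
        client->dataCallback(CAMERA_MSG_PREVIEW_FRAME, frame);
    }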
