From the analysis of the Android CameraHal class diagram we know that FrameNotifier is the channel through which CameraAdapter interacts with the other classes, so FrameNotifier is where our study begins.
FrameNotifier inherits from the interface class MessageNotifier, so CameraAdapter has to implement all of the interfaces of both classes. The basis for every call is enableMsgType, the interface used to enable message types and register the corresponding callbacks.
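For reference, here is a sketch of the two interface classes as they are used in the rest of this article. It is reduced to the members we actually touch (the real headers contain more), and frame_callback/event_callback stand for the frame and event callback function-pointer types:

// Sketch of the notifier interface hierarchy, trimmed to the methods discussed below.
class MessageNotifier
{
public:
    static const uint32_t EVENT_BIT_FIELD_POSITION;
    static const uint32_t FRAME_BIT_FIELD_POSITION;

    // Enable the given message types and register frame/event callbacks for them.
    virtual void enableMsgType(int32_t msgs, frame_callback callback = NULL,
                               event_callback eventCb = NULL, void *cookie = NULL) = 0;
    virtual void disableMsgType(int32_t msgs, void *cookie) = 0;

    virtual ~MessageNotifier() {}
};

class FrameNotifier : public MessageNotifier
{
public:
    // Frame buffers must be handed back once the subscriber is done with them.
    virtual void returnFrame(void *frameBuf, CameraFrame::FrameType frameType) = 0;

    virtual ~FrameNotifier() {}
};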
1.enableMsgType
As the diagram above shows, the interface function enableMsgType declared in MessageNotifier is ultimately implemented in BaseCameraAdapter, while the MessageNotifier interface class is referenced by EventProvider. EventProvider in turn is used by two classes, AppCallbackNotifier and CameraHal. EventProvider is essentially the same as FrameProvider.
- First, look at the EventProvider call initiated from CameraHal:
1.
When setParameters() is called on the HAL, setEventProvider() is invoked if the following condition is satisfied:
valstr (the value read for ExCameraParameters::KEY_TEMP_BRACKETING) != NULL && strcmp(valstr, ExCameraParameters::BRACKET_ENABLE) == 0,
In other words, this path is only used when the BRACKETING feature is enabled.
BRACKETING: shooting the same scene with different exposure settings, used for image composition. For example, with a wide grassland and mountains in the distance, a single set of exposure parameters is bound to give a poor result; but if the nearby grassland and the distant mountains are shot with different exposure parameters and the shots are then merged, both can be rendered well.
setEventProvider() is called with ALL_EVENTS and mCameraAdapter as its arguments;
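A condensed sketch of that trigger inside CameraHal::setParameters() — hedged, the surrounding code is simplified, but the condition and the call match what is quoted above:

const char *valstr = NULL;

// Hedged sketch: when temporal bracketing is enabled via the parameters,
// remember the state and hook the event provider up to the adapter.
if ( (valstr = params.get(ExCameraParameters::KEY_TEMP_BRACKETING)) != NULL )
{
    if ( strcmp(valstr, ExCameraParameters::BRACKET_ENABLE) == 0 )
    {
        mBracketingEnabled = true;   // later checked in CameraHal::eventCallback()
        setEventProvider(CameraHalEvent::ALL_EVENTS, mCameraAdapter);
    }
}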
ALL_EVENTS is defined as:

enum CameraHalEventType {
    NO_EVENTS = 0x0,
    EVENT_FOCUS_LOCKED = 0x1,
    EVENT_FOCUS_ERROR = 0x2,
    EVENT_ZOOM_INDEX_REACHED = 0x4,
    EVENT_SHUTTER = 0x8,
    EVENT_FACE = 0x10,
    ///@remarks Future enum related to display, like frame displayed event, could be added here
    ALL_EVENTS = 0xFFFF ///Maximum of 16 event types supported
};
2~4.
In setEventProvider(), an EventProvider object is constructed and registered with ALL_EVENTS.
void CameraHal::setEventProvider(int32_t eventMask, MessageNotifier * eventNotifier)
{
    if ( NULL != mEventProvider )
    {
        mEventProvider->disableEventNotification(CameraHalEvent::ALL_EVENTS);
        delete mEventProvider;
        mEventProvider = NULL;
    }

    mEventProvider = new EventProvider(eventNotifier, this, eventCallbackRelay);
    if ( NULL == mEventProvider )
    {
        CAMHAL_LOGEA("Error in creating EventProvider");
    }
    else
    {
        mEventProvider->enableEventNotification(eventMask);
    }
}
mEventProvider = new EventProvider(eventNotifier, this, eventCallbackRelay);
The first argument is mCameraAdapter after a type cast, and the third is the HAL's eventCallbackRelay. Nothing special happens here; the constructor just stores these values.
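The constructor is essentially just member initialization; a minimal sketch, assuming the member names used by enableEventNotification() below (mEventNotifier, mEventCallback, mCookie):

class EventProvider
{
public:
    // Minimal sketch: store the notifier, the owner cookie and the relay callback.
    EventProvider(MessageNotifier *eventNotifier, void *cookie, event_callback eventCallback)
        : mEventNotifier(eventNotifier), mCookie(cookie), mEventCallback(eventCallback) {}

    int enableEventNotification(int32_t eventTypes);
    int disableEventNotification(int32_t eventTypes);

private:
    MessageNotifier *mEventNotifier;   // actually the CameraAdapter
    void            *mCookie;          // the owner: CameraHal or AppCallbackNotifier
    event_callback   mEventCallback;   // the owner's eventCallbackRelay
};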
mEventProvider->enableEventNotification(eventMask);
int EventProvider::enableEventNotification(int32_t frameTypes)
{
    status_t ret = NO_ERROR;

    ///Enable the frame notification to CameraAdapter (which implements FrameNotifier interface)
    mEventNotifier->enableMsgType(frameTypes<<MessageNotifier::EVENT_BIT_FIELD_POSITION
                                  , NULL
                                  , mEventCallback
                                  , mCookie
                                  );

    return ret;
}
mEventNotifier is mCameraAdapter after the type cast, so this enableMsgType() call ultimately lands in the implementation in BaseCameraAdapter.
5. enableMsgType in BaseCameraAdapter
void BaseCameraAdapter::enableMsgType(int32_t msgs, frame_callback callback, event_callback eventCb, void* cookie)
{
    Mutex::Autolock lock(mSubscriberLock);

    LOG_FUNCTION_NAME;

    if ( CameraFrame::PREVIEW_FRAME_SYNC == msgs )
    {
        mFrameSubscribers.add((int) cookie, callback);
    }
    else if ( CameraFrame::FRAME_DATA_SYNC == msgs )
    {
        mFrameDataSubscribers.add((int) cookie, callback);
    }
    else if ( CameraFrame::IMAGE_FRAME == msgs)
    {
        mImageSubscribers.add((int) cookie, callback);
    }
    else if ( CameraFrame::RAW_FRAME == msgs)
    {
        mRawSubscribers.add((int) cookie, callback);
    }
    else if ( CameraFrame::VIDEO_FRAME_SYNC == msgs)
    {
        mVideoSubscribers.add((int) cookie, callback);
    }
    else if ( CameraHalEvent::ALL_EVENTS == msgs)
    {
        mFocusSubscribers.add((int) cookie, eventCb);
        mShutterSubscribers.add((int) cookie, eventCb);
        mZoomSubscribers.add((int) cookie, eventCb);
        mFaceSubscribers.add((int) cookie, eventCb);
    }
    else
    {
        CAMHAL_LOGEA("Message type subscription no supported yet!");
    }

    LOG_FUNCTION_NAME_EXIT;
}
Note that for ALL_EVENTS the individual CameraHalEvent types are not distinguished the way the CameraFrame types are: the same eventCb is added to all four event subscriber lists.
Now let's go back and look at the callback implementation:
void CameraHal::eventCallback(CameraHalEvent* event)
{
    if ( NULL != event )
    {
        switch( event->mEventType )
        {
            case CameraHalEvent::EVENT_FOCUS_LOCKED:
            case CameraHalEvent::EVENT_FOCUS_ERROR:
            {
                if ( mBracketingEnabled )
                {
                    startImageBracketing();
                }
                break;
            }
            default:
            {
                break;
            }
        };
    }
}
The callback handles the two events EVENT_FOCUS_LOCKED and EVENT_FOCUS_ERROR by calling startImageBracketing(). Since we are unlikely to use the bracketing feature, we will not dig into it here.
That completes the event handling initiated from the HAL. To recap: in setParameters(), if the bracketing feature is enabled, a new EventProvider is created and a callback is registered with BaseCameraAdapter using ALL_EVENTS. When an event occurs, the CameraAdapter invokes that callback, and if the event type is EVENT_FOCUS_LOCKED or EVENT_FOCUS_ERROR, the handler startImageBracketing() is called.
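For completeness, here is a simplified sketch of the adapter side of this contract, i.e. how the subscribers registered above get invoked when, say, focus locks. The real BaseCameraAdapter does this in notifyFocusSubscribers() on its internal callback thread; the container type and cookie handling below just follow the enableMsgType() listing:

// Simplified sketch: walk the focus subscriber list and fire every registered callback.
void notifyFocusLockedSketch(KeyedVector<int, event_callback> &mFocusSubscribers)
{
    CameraHalEvent focusEvent;
    focusEvent.mEventType = CameraHalEvent::EVENT_FOCUS_LOCKED;

    for (unsigned int i = 0; i < mFocusSubscribers.size(); i++)
    {
        // The cookie stored at registration time identifies the subscriber instance.
        focusEvent.mCookie = (void *) mFocusSubscribers.keyAt(i);
        event_callback eventCb = mFocusSubscribers.valueAt(i);
        eventCb(&focusEvent);   // e.g. CameraHal::eventCallbackRelay
    }
}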
- Event handling initiated from AppCallbackNotifier.
Compared with the CameraHal-initiated events there is no difference in the flow, and the CameraAdapter involved is the same one. The only difference is the callback: AppCallbackNotifier's handler chain is eventCallbackRelay -> eventCallback:
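eventCallbackRelay is just a static trampoline that recovers the AppCallbackNotifier instance from the cookie carried in the event; a sketch, assuming the mCookie field used in the dispatch sketch earlier:

// Sketch: static relay that forwards the event to the instance stored in mCookie.
void AppCallbackNotifier::eventCallbackRelay(CameraHalEvent* chEvt)
{
    AppCallbackNotifier *appcbn = (AppCallbackNotifier *) (chEvt->mCookie);
    appcbn->eventCallback(chEvt);
}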
void AppCallbackNotifier::eventCallback(CameraHalEvent* chEvt)
{
    ///Post the event to the event queue of AppCallbackNotifier
    MSGUTILS::Message msg;
    CameraHalEvent *event;

    LOG_FUNCTION_NAME;

    if ( NULL != chEvt )
    {
        event = new CameraHalEvent(*chEvt);
        if ( NULL != event )
        {
            msg.command = AppCallbackNotifier::NOTIFIER_CMD_PROCESS_EVENT;
            msg.arg1 = event;
            {
                Mutex::Autolock lock(mLock);
                mEventQ.put(&msg);
            }
        }
        else
        {
            CAMHAL_LOGEA("Not enough resources to allocate CameraHalEvent");
        }
    }

    LOG_FUNCTION_NAME_EXIT;
}
The event is posted on the message queue mEventQ and picked up by the message-processing thread AppCallbackNotifier::notificationThread():
bool AppCallbackNotifier::notificationThread()
{
    bool shouldLive = true;
    status_t ret;

    LOG_FUNCTION_NAME;

    //CAMHAL_LOGDA("Notification Thread waiting for message");
    ret = MSGUTILS::MessageQueue::waitForMsg(&mNotificationThread->msgQ(),
                                             &mEventQ,
                                             &mFrameQ,
                                             AppCallbackNotifier::NOTIFIER_TIMEOUT);
    //CAMHAL_LOGDA("Notification Thread received message");

    if (mNotificationThread->msgQ().hasMsg())
    {
        ///Received a message from CameraHal, process it
        CAMHAL_LOGDA("Notification Thread received message from Camera HAL");
        shouldLive = processMessage();
        if(!shouldLive)
        {
            CAMHAL_LOGDA("Notification Thread exiting.");
        }
    }

    if(mEventQ.hasMsg())
    {
        ///Received an event from one of the event providers
        CAMHAL_LOGDA("Notification Thread received an event from event provider (CameraAdapter)");
        notifyEvent();
    }

    if(mFrameQ.hasMsg())
    {
        ///Received a frame from one of the frame providers
        //CAMHAL_LOGDA("Notification Thread received a frame from frame provider (CameraAdapter)");
        notifyFrame();
    }

    LOG_FUNCTION_NAME_EXIT;
    return shouldLive;
}
Messages on mEventQ are handled by notifyEvent():
void AppCallbackNotifier::notifyEvent()
{
    ///Receive and send the event notifications to app
    MSGUTILS::Message msg;
    LOG_FUNCTION_NAME;
    {
        Mutex::Autolock lock(mLock);
        mEventQ.get(&msg);
    }
    bool ret = true;
    CameraHalEvent *evt = NULL;
    CameraHalEvent::FocusEventData *focusEvtData;
    CameraHalEvent::ZoomEventData *zoomEvtData;
    CameraHalEvent::FaceEventData faceEvtData;

    if(mNotifierState != AppCallbackNotifier::NOTIFIER_STARTED)
    {
        return;
    }

    switch(msg.command)
    {
        case AppCallbackNotifier::NOTIFIER_CMD_PROCESS_EVENT:

            evt = ( CameraHalEvent * ) msg.arg1;

            if ( NULL == evt )
            {
                CAMHAL_LOGEA("Invalid CameraHalEvent");
                return;
            }

            switch(evt->mEventType)
            {
                case CameraHalEvent::EVENT_SHUTTER:

                    if ( ( NULL != mCameraHal ) &&
                         ( NULL != mNotifyCb ) &&
                         ( mCameraHal->msgTypeEnabled(CAMERA_MSG_SHUTTER) ) )
                    {
                        mNotifyCb(CAMERA_MSG_SHUTTER, 0, 0, mCallbackCookie);
                    }
                    mRawAvailable = false;
                    break;

                case CameraHalEvent::EVENT_FOCUS_LOCKED:
                case CameraHalEvent::EVENT_FOCUS_ERROR:

                    focusEvtData = &evt->mEventData->focusEvent;
                    if ( ( focusEvtData->focusLocked ) &&
                         ( NULL != mCameraHal ) &&
                         ( NULL != mNotifyCb ) &&
                         ( mCameraHal->msgTypeEnabled(CAMERA_MSG_FOCUS) ) )
                    {
                        mNotifyCb(CAMERA_MSG_FOCUS, true, 0, mCallbackCookie);
                        mCameraHal->disableMsgType(CAMERA_MSG_FOCUS);
                    }
                    else if ( focusEvtData->focusError &&
                              ( NULL != mCameraHal ) &&
                              ( NULL != mNotifyCb ) &&
                              ( mCameraHal->msgTypeEnabled(CAMERA_MSG_FOCUS) ) )
                    {
                        mNotifyCb(CAMERA_MSG_FOCUS, false, 0, mCallbackCookie);
                        mCameraHal->disableMsgType(CAMERA_MSG_FOCUS);
                    }
                    break;

                case CameraHalEvent::EVENT_ZOOM_INDEX_REACHED:

                    zoomEvtData = &evt->mEventData->zoomEvent;
                    if ( ( NULL != mCameraHal ) &&
                         ( NULL != mNotifyCb) &&
                         ( mCameraHal->msgTypeEnabled(CAMERA_MSG_ZOOM) ) )
                    {
                        mNotifyCb(CAMERA_MSG_ZOOM, zoomEvtData->currentZoomIndex,
                                  zoomEvtData->targetZoomIndexReached, mCallbackCookie);
                    }
                    break;

                case CameraHalEvent::EVENT_FACE:

                    faceEvtData = evt->mEventData->faceEvent;
                    if ( ( NULL != mCameraHal ) &&
                         ( NULL != mNotifyCb) &&
                         ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_METADATA) ) )
                    {
                        // WA for an issue inside CameraService
                        camera_memory_t *tmpBuffer = mRequestMemory(-1, 1, 1, NULL);

                        mDataCb(CAMERA_MSG_PREVIEW_METADATA, tmpBuffer, 0,
                                faceEvtData->getFaceResult(), mCallbackCookie);

                        faceEvtData.clear();

                        if ( NULL != tmpBuffer )
                        {
                            tmpBuffer->release(tmpBuffer);
                        }
                    }
                    break;

                case CameraHalEvent::ALL_EVENTS:
                    break;
                default:
                    break;
            }
            break;
    }

    if ( NULL != evt )
    {
        delete evt;
    }
}
case CameraHalEvent::EVENT_SHUTTER:
case CameraHalEvent::EVENT_FOCUS_LOCKED:
case CameraHalEvent::EVENT_FOCUS_ERROR:
case CameraHalEvent::EVENT_ZOOM_INDEX_REACHED:
all deliver their notification through mNotifyCb, whereas
case CameraHalEvent::EVENT_FACE:
uses mDataCb, because the face data has to be passed up to the application layer.
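This split maps directly onto the two callback types from the camera HAL header (hardware/camera.h, abridged here): the notify callback only carries two integer extras, while the data callback can carry a memory buffer plus frame metadata, which is exactly what EVENT_FACE needs:

typedef void (*camera_notify_callback)(int32_t msg_type,
                                       int32_t ext1,
                                       int32_t ext2,
                                       void *user);

typedef void (*camera_data_callback)(int32_t msg_type,
                                     const camera_memory_t *data,
                                     unsigned int index,
                                     camera_frame_metadata_t *metadata,
                                     void *user);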
Those are all the events handled in notifyEvent(). So where are the frame events defined? The full set of frame types is:
enum FrameType {
    PREVIEW_FRAME_SYNC = 0x1, ///SYNC implies that the frame needs to be explicitly returned after consuming in order to be filled by camera again
    PREVIEW_FRAME = 0x2 ,     ///Preview frame includes viewfinder and snapshot frames
    IMAGE_FRAME_SYNC = 0x4,   ///Image Frame is the image capture output frame
    IMAGE_FRAME = 0x8,
    VIDEO_FRAME_SYNC = 0x10,  ///Timestamp will be updated for these frames
    VIDEO_FRAME = 0x20,
    FRAME_DATA_SYNC = 0x40,   ///Any extra data assosicated with the frame. Always synced with the frame
    FRAME_DATA= 0x80,
    RAW_FRAME = 0x100,
    SNAPSHOT_FRAME = 0x200,
    ALL_FRAMES = 0xFFFF       ///Maximum of 16 frame types supported
};
Of these, AppCallbackNotifier enables frame notifications for FRAME_DATA_SYNC, IMAGE_FRAME, RAW_FRAME, PREVIEW_FRAME_SYNC and VIDEO_FRAME_SYNC, in the following four places.
Location 1: FRAME_DATA_SYNC = 0x40, ///Any extra data associated with the frame. Always synced with the frame
When Measurements is enabled, the complete data is copied into mPreviewBufs[] without the 2D-to-1D conversion:
void AppCallbackNotifier::setMeasurements(bool enable)
{
    Mutex::Autolock lock(mLock);

    LOG_FUNCTION_NAME;

    mMeasurementEnabled = enable;

    if ( enable )
    {
        mFrameProvider->enableFrameNotification(CameraFrame::FRAME_DATA_SYNC);
    }

    LOG_FUNCTION_NAME_EXIT;
}
else if ( ( CameraFrame::FRAME_DATA_SYNC == frame->mFrameType ) &&
          ( NULL != mCameraHal ) &&
          ( NULL != mDataCb) &&
          ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME)) )
{
    copyAndSendPreviewFrame(frame, CAMERA_MSG_PREVIEW_FRAME);
}
void AppCallbackNotifier::copyAndSendPreviewFrame(CameraFrame* frame, int32_t msgType)
{
    dest = (void*) mPreviewBufs[mPreviewBufCount];

    CAMHAL_LOGVB("%d:copy2Dto1D(%p, %p, %d, %d, %d, %d, %d, %d,%s)",
                 __LINE__,
                 NULL, //buf,
                 frame->mBuffer,
                 frame->mWidth,
                 frame->mHeight,
                 frame->mPixelFmt,
                 frame->mAlignment,
                 2,
                 frame->mLength,
                 mPreviewPixelFormat);

    if ( NULL != dest )
    {
        // data sync frames don't need conversion
        if (CameraFrame::FRAME_DATA_SYNC == frame->mFrameType)
        {
            if ( (mPreviewMemory->size / MAX_BUFFERS) >= frame->mLength )
            {
                memcpy(dest, (void*) src, frame->mLength);
            }
            else
            {
                memset(dest, 0, (mPreviewMemory->size / MAX_BUFFERS));
            }
        }
        else
        {
            if ((NULL == (void*)frame->mYuv[0]) || (NULL == (void*)frame->mYuv[1]))
            {
                CAMHAL_LOGEA("Error! One of the YUV Pointer is NULL");
                goto exit;
            }
            else
            {
                copy2Dto1D(dest, frame->mYuv, frame->mWidth, frame->mHeight,
                           frame->mPixelFmt, frame->mAlignment, frame->mOffset, 2,
                           frame->mLength, mPreviewPixelFormat);
            }
        }
    }
Location 2: only the IMAGE_FRAME and RAW_FRAME types are registered here.
void AppCallbackNotifier::setFrameProvider(FrameNotifier *frameNotifier)
{
    LOG_FUNCTION_NAME;
    ///@remarks There is no NULL check here. We will check
    ///for NULL when we get the start command from CameraAdapter
    mFrameProvider = new FrameProvider(frameNotifier, this, frameCallbackRelay);
    if ( NULL == mFrameProvider )
    {
        CAMHAL_LOGEA("Error in creating FrameProvider");
    }
    else
    {
        //Register only for captured images and RAW for now
        //TODO: Register for and handle all types of frames
        mFrameProvider->enableFrameNotification(CameraFrame::IMAGE_FRAME);
        mFrameProvider->enableFrameNotification(CameraFrame::RAW_FRAME);
    }

    LOG_FUNCTION_NAME_EXIT;
}
IMAGE_FRAME means JPEG-encoded data is wanted, and there are two paths. If the following condition is met:
else if ( (CameraFrame::IMAGE_FRAME == frame->mFrameType) &&
          (NULL != mCameraHal) &&
          (NULL != mDataCb) &&
          ((CameraFrame::ENCODE_RAW_YUV422I_TO_JPEG & frame->mQuirks) ||
           (CameraFrame::ENCODE_RAW_RGB24_TO_JPEG & frame->mQuirks) ||
           (CameraFrame::ENCODE_RAW_YUV420SP_TO_JPEG & frame->mQuirks)) )
then a new encoder thread is created, the YUV or RGB data is handed to it, and AppCallbackNotifierEncoderCallback is passed in as the completion callback. When encoding finishes, that callback invokes AppCallbackNotifier::EncoderDoneCb(), which in turn calls mDataCb(CAMERA_MSG_COMPRESSED_IMAGE, picture, 0, NULL, mCallbackCookie); to hand the data back.
If the condition above is not met and only the following holds:
else if ( ( CameraFrame::IMAGE_FRAME == frame->mFrameType ) &&
          ( NULL != mCameraHal ) &&
          ( NULL != mDataCb) )
then the frame does not need encoding, and copyAndSendPictureFrame(frame, CAMERA_MSG_COMPRESSED_IMAGE); is called directly to deliver the data through the callback:
void AppCallbackNotifier::copyAndSendPictureFrame(CameraFrame* frame, int32_t msgType)
{
    camera_memory_t* picture = NULL;
    void *dest = NULL, *src = NULL;

    // scope for lock
    {
        picture = mRequestMemory(-1, frame->mLength, 1, NULL);

        if (NULL != picture)
        {
            dest = picture->data;
            if (NULL != dest)
            {
                src = (void *) ((unsigned int) frame->mBuffer + frame->mOffset);
                memcpy(dest, src, frame->mLength);
            }
        }
    }

exit:
    mFrameProvider->returnFrame(frame->mBuffer, (CameraFrame::FrameType) frame->mFrameType);

    if(picture)
    {
        if((mNotifierState == AppCallbackNotifier::NOTIFIER_STARTED) &&
           mCameraHal->msgTypeEnabled(msgType))
        {
            mDataCb(msgType, picture, 0, NULL, mCallbackCookie);
        }
        picture->release(picture);
    }
}
Note that both paths call mFrameProvider->returnFrame() to hand the buffer back for reuse.
For RAW_FRAME, the handling in notifyFrame() is simpler: the data (or just a notification) is passed back, and the buffer is then returned via returnFrame():
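On the provider side, returnFrame() is just a thin forwarder to the adapter, which implements the FrameNotifier interface; a minimal sketch, assuming the provider keeps the notifier in a member named mFrameNotifier:

// Sketch: FrameProvider simply hands the buffer back to the CameraAdapter.
void FrameProvider::returnFrame(void *frameBuf, CameraFrame::FrameType frameType)
{
    mFrameNotifier->returnFrame(frameBuf, frameType);
}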
if ( (CameraFrame::RAW_FRAME == frame->mFrameType ) &&
     ( NULL != mCameraHal ) &&
     ( NULL != mDataCb) &&
     ( NULL != mNotifyCb ) )
{
    if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE) )
    {
#ifdef COPY_IMAGE_BUFFER
        copyAndSendPictureFrame(frame, CAMERA_MSG_RAW_IMAGE);
#else
        //TODO: Find a way to map a Tiler buffer to a MemoryHeapBase
#endif
    }
    else
    {
        if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE_NOTIFY) )
        {
            mNotifyCb(CAMERA_MSG_RAW_IMAGE_NOTIFY, 0, 0, mCallbackCookie);
        }
        mFrameProvider->returnFrame(frame->mBuffer, (CameraFrame::FrameType) frame->mFrameType);
    }

    mRawAvailable = true;
}
Location 3: CameraFrame::PREVIEW_FRAME_SYNC, used for the preview callbacks.
status_t AppCallbackNotifier::startPreviewCallbacks(CameraParameters &params, void *buffers,
                                                    uint32_t *offsets, int fd, size_t length,
                                                    size_t count)
{
    if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME ) )
    {
        mFrameProvider->enableFrameNotification(CameraFrame::PREVIEW_FRAME_SYNC);
    }
}
In notifyFrame(), if Measurement is not enabled the preview data is copied and sent up via copyAndSendPreviewFrame(); if it is enabled, the buffer is simply returned, since the measurement data is delivered through the FRAME_DATA_SYNC path instead:
else if ( ( CameraFrame::PREVIEW_FRAME_SYNC == frame->mFrameType ) &&
          ( NULL != mCameraHal ) &&
          ( NULL != mDataCb) &&
          ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME)) )
{
    //When enabled, measurement data is sent instead of video data
    if ( !mMeasurementEnabled )
    {
        copyAndSendPreviewFrame(frame, CAMERA_MSG_PREVIEW_FRAME);
    }
    else
    {
        mFrameProvider->returnFrame(frame->mBuffer, (CameraFrame::FrameType) frame->mFrameType);
    }
}
Location 4: CameraFrame::VIDEO_FRAME_SYNC in startRecording(), used for video recording, where the data has to be copied (a delivery sketch follows the listing below).
status_t AppCallbackNotifier::startRecording()
{
    status_t ret = NO_ERROR;

    LOG_FUNCTION_NAME;

    Mutex::Autolock lock(mRecordingLock);

    if ( NULL == mFrameProvider )
    {
        CAMHAL_LOGEA("Trying to start video recording without FrameProvider");
        ret = -1;
    }

    if(mRecording)
    {
        return NO_INIT;
    }

    if ( NO_ERROR == ret )
    {
        mFrameProvider->enableFrameNotification(CameraFrame::VIDEO_FRAME_SYNC);
    }

    mRecording = true;

    LOG_FUNCTION_NAME_EXIT;

    return ret;
}
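As for how those VIDEO_FRAME_SYNC frames reach the application: a hedged sketch of the typical delivery path, assuming the timestamp data callback (camera_data_timestamp_callback from hardware/camera.h) is stored in a member named mDataCbTimestamp. The recording buffer is not returned here; it goes back only after the app calls releaseRecordingFrame():

// Hedged sketch: deliver a recording frame to the app together with its timestamp.
void notifyVideoFrameSketch(camera_data_timestamp_callback mDataCbTimestamp,
                            camera_memory_t *videoBuffer,
                            nsecs_t timestamp,
                            void *mCallbackCookie)
{
    if ( NULL != mDataCbTimestamp )
    {
        mDataCbTimestamp(timestamp, CAMERA_MSG_VIDEO_FRAME, videoBuffer, 0, mCallbackCookie);
    }
}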
To recap the four places where AppCallbackNotifier enables frame notifications:
Location 1: FRAME_DATA_SYNC — when Measurements is enabled, the full data is copied into mPreviewBufs[] without the 2D-to-1D conversion.
Location 2: setFrameProvider() registers RAW_FRAME and IMAGE_FRAME, used to deliver captured images upward and to return the buffers.
Location 3: CameraFrame::PREVIEW_FRAME_SYNC, used for the preview callback and buffer return.
Location 4: CameraFrame::VIDEO_FRAME_SYNC in startRecording(), used for video recording, where the data has to be copied.