Android: Qualcomm Platform Camera HFR Usecase Analysis

    Part 1: Introduction to High-Frame-Rate Recording

      High-frame-rate recording is what produces slow-motion footage. Roughly 24 frames per second is all the human eye normally needs for smooth playback, so if an action is captured at 120 fps and then played back at 24 fps, the footage is slowed down by a factor of 5.

      The slow-motion features on the Qualcomm platform:

    • High Speed Recording (HSR): capture at a high fps (the operating rate) and encode/save at that same high fps (the target rate); the operating rate equals the target rate.
    • High Frame Rate recording (HFR): capture at a high fps (the operating rate) but encode/save at 30 fps (the target rate); the operating rate is greater than the target rate. (The slowdown arithmetic is sketched right after this list.)
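
      To make the numbers concrete, the perceived slow-motion factor is simply the capture rate divided by the playback rate. The sketch below is a plain-Java illustration added here (not from the original source):

        // Minimal sketch: slow-motion factor = captureFps / playbackFps.
        public final class SlowMotionMath {
            static double slowdownFactor(int captureFps, int playbackFps) {
                return (double) captureFps / playbackFps;
            }

            public static void main(String[] args) {
                // HFR: capture at 120 fps, encode and save at 30 fps -> 4x slow motion when played back.
                System.out.println("HFR 120 -> 30 fps: " + slowdownFactor(120, 30) + "x");
                // HSR: capture and save at 120 fps; the player applies the slowdown, e.g. played at 24 fps -> 5x.
                System.out.println("HSR 120 played at 24 fps: " + slowdownFactor(120, 24) + "x");
            }
        }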

    Part 2: Code Flow Analysis (Qualcomm camera app source: packages/apps/SnapdragonCamera)

    1. The app starts video recording: packages/apps/SnapdragonCamera/src/com/android/camera/CaptureModule.java

        private boolean startRecordingVideo(final int cameraId) {
                ...
    
                if (ApiHelper.isAndroidPOrHigher()) {
                    if (mHighSpeedCapture && ((int) mHighSpeedFPSRange.getUpper() > NORMAL_SESSION_MAX_FPS)) {
                        CaptureRequest initialRequest = mVideoRequestBuilder.build();
                        buildConstrainedCameraSession(mCameraDevice[cameraId], surfaces,
                                mSessionListener, mCameraHandler, initialRequest);
    
                    } else {
                        configureCameraSessionWithParameters(cameraId, surfaces,
                                mSessionListener, mCameraHandler, mVideoRequestBuilder.build());
                    }
                } else {
    
                    // If HFR is enabled and the maximum frame rate is greater than
                    // NORMAL_SESSION_MAX_FPS (60 fps), createConstrainedHighSpeedCaptureSession is used;
                    // otherwise a regular createCaptureSession is created.
                    if (mHighSpeedCapture && ((int) mHighSpeedFPSRange.getUpper() > NORMAL_SESSION_MAX_FPS)) {
                        // Create the high speed streams
                        mCameraDevice[cameraId].createConstrainedHighSpeedCaptureSession(surfaces,
                                new CameraConstrainedHighSpeedCaptureSession.StateCallback() {
                                    @Override
                                    public void onConfigured(CameraCaptureSession cameraCaptureSession) {
                                        mCurrentSession = cameraCaptureSession;
                                        Log.v(TAG, "createConstrainedHighSpeedCaptureSession onConfigured");
                                        mCaptureSession[cameraId] = cameraCaptureSession;
                                        CameraConstrainedHighSpeedCaptureSession session =
                                                (CameraConstrainedHighSpeedCaptureSession) mCurrentSession;

                                        try {
                                            setUpVideoCaptureRequestBuilder(mVideoRequestBuilder, cameraId);
                                            List list = CameraUtil
                                                    .createHighSpeedRequestList(mVideoRequestBuilder.build());
                                            // setRepeatingBurst submits a batch of requests at a time;
                                            // the corresponding native method is submitRequestList
                                            session.setRepeatingBurst(list, mCaptureCallback, mCameraHandler);
                                        } catch (CameraAccessException e) {
                                            Log.e(TAG, "Failed to start high speed video recording "
                                                    + e.getMessage());
                                            e.printStackTrace();
                                        } catch (IllegalArgumentException e) {
                                            Log.e(TAG, "Failed to start high speed video recording "
                                                    + e.getMessage());
                                            e.printStackTrace();
                                        } catch (IllegalStateException e) {
                                            Log.e(TAG, "Failed to start high speed video recording "
                                                    + e.getMessage());
                                            e.printStackTrace();
                                        }
                                        if (!mFrameProcessor.isFrameListnerEnabled() && !startMediaRecorder()) {
                                            startRecordingFailed();
                                            return;
                                        }
                                        }
                                    }, null);
                    } else {
                        surfaces.add(mVideoSnapshotImageReader.getSurface());
                        String zzHDR = mSettingsManager.getValue(SettingsManager.KEY_VIDEO_HDR_VALUE);
                        boolean zzHdrStatue = zzHDR.equals("1");
                        // if enable ZZHDR mode, don`t call the setOpModeForVideoStream method.
                        if (!zzHdrStatue) {
                            setOpModeForVideoStream(cameraId);
                        }
                        String value = mSettingsManager.getValue(SettingsManager.KEY_FOVC_VALUE);
                        if (value != null && Boolean.parseBoolean(value)) {
                            mStreamConfigOptMode = mStreamConfigOptMode | STREAM_CONFIG_MODE_FOVC;
                        }
                        if (zzHdrStatue) {
                            mStreamConfigOptMode = STREAM_CONFIG_MODE_ZZHDR;
                        }
                        if (DEBUG) {
                            Log.v(TAG, "createCustomCaptureSession mStreamConfigOptMode :" + mStreamConfigOptMode);
                        }
                        if (mStreamConfigOptMode == 0) {
                            // A normal stream, but setOpModeForVideoStream was called above,
                            // which changes config->operation_mode.
                            mCameraDevice[cameraId].createCaptureSession(surfaces, mCCSSateCallback, null);
                        } else {
                            List<OutputConfiguration> outConfigurations = new ArrayList<>(surfaces.size());
                            for (Surface sface : surfaces) {
                                outConfigurations.add(new OutputConfiguration(sface));
                            }
                            mCameraDevice[cameraId].createCustomCaptureSession(null, outConfigurations,
                                    mStreamConfigOptMode, mCCSSateCallback, null);
                        }
                    }
                }
                ...
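
      Stripped of the vendor-specific branches above, the public camera2 sequence that CaptureModule is driving can be reduced to the hedged sketch below. The names characteristics, camera, previewSurface, recorderSurface and handler are placeholders for objects the app already holds, the usual android.hardware.camera2, android.util, android.view and java.util imports are assumed, and error handling is kept to a minimum:

        // Hedged sketch of the standard constrained high speed flow (illustration, not CaptureModule code).
        void startHighSpeedRecording(final CameraCharacteristics characteristics, final CameraDevice camera,
                final Surface previewSurface, final Surface recorderSurface, final Handler handler)
                throws CameraAccessException {
            StreamConfigurationMap map =
                    characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            Size videoSize = map.getHighSpeedVideoSizes()[0];                                // e.g. 1920x1080
            final Range<Integer> fpsRange = map.getHighSpeedVideoFpsRangesFor(videoSize)[0]; // e.g. [120, 120]

            List<Surface> outputs = Arrays.asList(previewSurface, recorderSurface);
            camera.createConstrainedHighSpeedCaptureSession(outputs,
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(CameraCaptureSession session) {
                            try {
                                CaptureRequest.Builder builder =
                                        camera.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
                                builder.addTarget(previewSurface);
                                builder.addTarget(recorderSurface);
                                builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, fpsRange);

                                CameraConstrainedHighSpeedCaptureSession hsSession =
                                        (CameraConstrainedHighSpeedCaptureSession) session;
                                // One request is expanded into a burst whose size matches the HAL batch size.
                                List<CaptureRequest> burst =
                                        hsSession.createHighSpeedRequestList(builder.build());
                                hsSession.setRepeatingBurst(burst, /*listener*/ null, handler);
                            } catch (CameraAccessException e) {
                                e.printStackTrace();
                            }
                        }

                        @Override
                        public void onConfigureFailed(CameraCaptureSession session) {
                            Log.e("HfrSketch", "High speed session configuration failed");
                        }
                    }, handler);
        }

      For HSR the camera-side flow is the same; the difference sits on the MediaRecorder side, where the clip is saved at the capture rate instead of 30 fps.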

    2. HFR stream configuration: frameworks/base/core/java/android/hardware/camera2/CameraDevice.java

     // Interface for creating a constrained high speed capture session
    public abstract void createConstrainedHighSpeedCaptureSession(@NonNull List<Surface> outputs,
            @NonNull CameraCaptureSession.StateCallback callback, @Nullable Handler handler)
            throws CameraAccessException;

      The concrete implementation is in frameworks/base/core/java/android/hardware/camera2/impl/CameraDeviceImpl.java:

        @Override
        public void createConstrainedHighSpeedCaptureSession(List<Surface> outputs,
                android.hardware.camera2.CameraCaptureSession.StateCallback callback, Handler handler)
                throws CameraAccessException {
            if (outputs == null || outputs.size() == 0 || outputs.size() > 2) {
                throw new IllegalArgumentException(
                        "Output surface list must not be null and the size must be no more than 2");
            }
            List<OutputConfiguration> outConfigurations = new ArrayList<>(outputs.size());
            for (Surface surface : outputs) {
                outConfigurations.add(new OutputConfiguration(surface));
            }
            createCaptureSessionInternal(null, outConfigurations, callback,
                    checkAndWrapHandler(handler),
                    /*operatingMode*/ICameraDeviceUser.CONSTRAINED_HIGH_SPEED_MODE,
                    /*sessionParams*/ null);
        }

      The createCaptureSessionInternal function is implemented as follows:

        private void createCaptureSessionInternal(InputConfiguration inputConfig,
                List<OutputConfiguration> outputConfigurations,
                CameraCaptureSession.StateCallback callback, Executor executor,
                int operatingMode, CaptureRequest sessionParams) throws CameraAccessException {
            synchronized(mInterfaceLock) {
                if (DEBUG) {
                    Log.d(TAG, "createCaptureSessionInternal");
                }
    
                checkIfCameraClosedOrInError();
    
                boolean isConstrainedHighSpeed =
                        (operatingMode == ICameraDeviceUser.CONSTRAINED_HIGH_SPEED_MODE);
                if (isConstrainedHighSpeed && inputConfig != null) {
                    throw new IllegalArgumentException("Constrained high speed session doesn't support"
                            + " input configuration yet.");
                }
    
                // Notify current session that it's going away, before starting camera operations
                // After this call completes, the session is not allowed to call into CameraDeviceImpl
                if (mCurrentSession != null) {
                    mCurrentSession.replaceSessionClose();
                }
    
                // TODO: dont block for this
                boolean configureSuccess = true;
                CameraAccessException pendingException = null;
                Surface input = null;
                try {
                    // configure streams and then block until IDLE
                    // the device characteristics are also queried inside this call
                    configureSuccess = configureStreamsChecked(inputConfig, outputConfigurations,
                            operatingMode, sessionParams);
                    if (configureSuccess == true && inputConfig != null) {
                        input = mRemoteDevice.getInputSurface();
                    }
                } catch (CameraAccessException e) {
                    configureSuccess = false;
                    pendingException = e;
                    input = null;
                    if (DEBUG) {
                        Log.v(TAG, "createCaptureSession - failed with exception ", e);
                    }
                }
    
                // Fire onConfigured if configureOutputs succeeded, fire onConfigureFailed otherwise.
                CameraCaptureSessionCore newSession = null;
                if (isConstrainedHighSpeed) {
                    ArrayList<Surface> surfaces = new ArrayList<>(outputConfigurations.size());
                    for (OutputConfiguration outConfig : outputConfigurations) {
                        surfaces.add(outConfig.getSurface());
                    }
                    StreamConfigurationMap config =
                        getCharacteristics().get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    
                    // Check that the surface formats are correct, the fps range is valid, and the
                    // surfaces are preview / video-encoder streams, etc.
                    SurfaceUtils.checkConstrainedHighSpeedSurfaces(surfaces, /*fpsRange*/null, config);
    
                    newSession = new CameraConstrainedHighSpeedCaptureSessionImpl(mNextSessionId++,
                            callback, executor, this, mDeviceExecutor, configureSuccess,
                            mCharacteristics);
                } else {
                    newSession = new CameraCaptureSessionImpl(mNextSessionId++, input,
                            callback, executor, this, mDeviceExecutor, configureSuccess);
                }
    
                // TODO: wait until current session closes, then create the new session
                mCurrentSession = newSession;
    
                if (pendingException != null) {
                    throw pendingException;
                }
    
                mSessionStateCallback = mCurrentSession.getDeviceStateCallback();
            }
        }

      Next, let's look at configureStreamsChecked():

        public boolean configureStreamsChecked(InputConfiguration inputConfig,
                List<OutputConfiguration> outputs, int operatingMode, CaptureRequest sessionParams)
                        throws CameraAccessException {
            // Treat a null input the same an empty list
            if (outputs == null) {
                outputs = new ArrayList<OutputConfiguration>();
            }
            if (outputs.size() == 0 && inputConfig != null) {
                throw new IllegalArgumentException("cannot configure an input stream without " +
                        "any output streams");
            }
    
            checkInputConfiguration(inputConfig);
    
            boolean success = false;
    
            synchronized(mInterfaceLock) {
                checkIfCameraClosedOrInError();
                // Streams to create
                HashSet<OutputConfiguration> addSet = new HashSet<OutputConfiguration>(outputs);
                // Streams to delete
                List<Integer> deleteList = new ArrayList<Integer>();
    
                // Determine which streams need to be created, which to be deleted
                for (int i = 0; i < mConfiguredOutputs.size(); ++i) {
                    int streamId = mConfiguredOutputs.keyAt(i);
                    OutputConfiguration outConfig = mConfiguredOutputs.valueAt(i);
    
                    if (!outputs.contains(outConfig) || outConfig.isDeferredConfiguration()) {
                        // Always delete the deferred output configuration when the session
                        // is created, as the deferred output configuration doesn't have unique surface
                        // related identifies.
                        deleteList.add(streamId);
                    } else {
                        addSet.remove(outConfig);  // Don't create a stream previously created
                    }
                }
    
                mDeviceExecutor.execute(mCallOnBusy);
                stopRepeating();
    
                try {
                    waitUntilIdle();
    
                    // Begin the configuration
                    mRemoteDevice.beginConfigure();
    
                    // reconfigure the input stream if the input configuration is different.
                    InputConfiguration currentInputConfig = mConfiguredInput.getValue();
                    if (inputConfig != currentInputConfig &&
                            (inputConfig == null || !inputConfig.equals(currentInputConfig))) {
                        if (currentInputConfig != null) {
                            mRemoteDevice.deleteStream(mConfiguredInput.getKey());
                            mConfiguredInput = new SimpleEntry<Integer, InputConfiguration>(
                                    REQUEST_ID_NONE, null);
                        }
                        if (inputConfig != null) {
                            int streamId = mRemoteDevice.createInputStream(inputConfig.getWidth(),
                                    inputConfig.getHeight(), inputConfig.getFormat());
                            mConfiguredInput = new SimpleEntry<Integer, InputConfiguration>(
                                    streamId, inputConfig);
                        }
                    }
    
                    // Delete all streams first (to free up HW resources)
                    for (Integer streamId : deleteList) {
                        mRemoteDevice.deleteStream(streamId);
                        mConfiguredOutputs.delete(streamId);
                    }
    
                    // Add all new streams
                    for (OutputConfiguration outConfig : outputs) {
                        if (addSet.contains(outConfig)) {
                            int streamId = mRemoteDevice.createStream(outConfig);
                            mConfiguredOutputs.put(streamId, outConfig);
                        }
                    }
    
                    // customOpMode can be changed via setOpModeForVideoStream;
                    // CameraConstrainedHighSpeedCaptureSessionImpl does not change this value.
                    operatingMode = (operatingMode | (customOpMode << 16));

                    // Finish the stream configuration.
                    // mRemoteDevice is an ICameraDeviceUserWrapper, obtained when the camera was opened.
                    if (sessionParams != null) {
                        mRemoteDevice.endConfigure(operatingMode, sessionParams.getNativeCopy());
                    } else {
                        mRemoteDevice.endConfigure(operatingMode, null);
                    }
    
                    success = true;
                } catch (IllegalArgumentException e) {
                    // OK. camera service can reject stream config if it's not supported by HAL
                    // This is only the result of a programmer misusing the camera2 api.
                    Log.w(TAG, "Stream configuration failed due to: " + e.getMessage());
                    return false;
                } catch (CameraAccessException e) {
                    if (e.getReason() == CameraAccessException.CAMERA_IN_USE) {
                        throw new IllegalStateException("The camera is currently busy." +
                                " You must wait until the previous operation completes.", e);
                    }
                    throw e;
                } finally {
                    if (success && outputs.size() > 0) {
                        mDeviceExecutor.execute(mCallOnIdle);
                    } else {
                        // Always return to the 'unconfigured' state if we didn't hit a fatal error
                        mDeviceExecutor.execute(mCallOnUnconfigured);
                    }
                }
            }
    
            return success;
        }

      The mRemoteDevice object above is of type ICameraDeviceUserWrapper and is obtained when the camera is opened; the relevant code is:

      frameworks/base/core/java/android/hardware/camera2/CameraManager.java

        private CameraDevice openCameraDeviceUserAsync(String cameraId,
                CameraDevice.StateCallback callback, Executor executor, final int uid)
                throws CameraAccessException {
            CameraCharacteristics characteristics = getCameraCharacteristics(cameraId);
            CameraDevice device = null;
    
            synchronized (mLock) {
    
                ICameraDeviceUser cameraUser = null;
    
            // Create the CameraDeviceImpl object
                android.hardware.camera2.impl.CameraDeviceImpl deviceImpl =
                        new android.hardware.camera2.impl.CameraDeviceImpl(
                            cameraId,
                            callback,
                            executor,
                            characteristics,
                            mContext.getApplicationInfo().targetSdkVersion);
    
                ICameraDeviceCallbacks callbacks = deviceImpl.getCallbacks();
    
                try {
                    if (supportsCamera2ApiLocked(cameraId)) {
                        // Use cameraservice's cameradeviceclient implementation for HAL3.2+ devices
                        // Get the CameraService binder proxy
                        ICameraService cameraService = CameraManagerGlobal.get().getCameraService();
                        if (cameraService == null) {
                            throw new ServiceSpecificException(
                                ICameraService.ERROR_DISCONNECTED,
                                "Camera service is currently unavailable");
                        }
                        // Open the camera through the CameraService proxy and obtain the ICameraDeviceUser (cameraUser) object
                        cameraUser = cameraService.connectDevice(callbacks, cameraId,
                                mContext.getOpPackageName(), uid);
                    } else {
                        // Use legacy camera implementation for HAL1 devices
                        int id;
                        try {
                            id = Integer.parseInt(cameraId);
                        } catch (NumberFormatException e) {
                            throw new IllegalArgumentException("Expected cameraId to be numeric, but it was: "
                                    + cameraId);
                        }
    
                        Log.i(TAG, "Using legacy camera HAL.");
                        cameraUser = CameraDeviceUserShim.connectBinderShim(callbacks, id);
                    }
                } catch (ServiceSpecificException e) {
                    if (e.errorCode == ICameraService.ERROR_DEPRECATED_HAL) {
                        throw new AssertionError("Should've gone down the shim path");
                    } else if (e.errorCode == ICameraService.ERROR_CAMERA_IN_USE ||
                            e.errorCode == ICameraService.ERROR_MAX_CAMERAS_IN_USE ||
                            e.errorCode == ICameraService.ERROR_DISABLED ||
                            e.errorCode == ICameraService.ERROR_DISCONNECTED ||
                            e.errorCode == ICameraService.ERROR_INVALID_OPERATION) {
                        // Received one of the known connection errors
                        // The remote camera device cannot be connected to, so
                        // set the local camera to the startup error state
                        deviceImpl.setRemoteFailure(e);
    
                        if (e.errorCode == ICameraService.ERROR_DISABLED ||
                                e.errorCode == ICameraService.ERROR_DISCONNECTED ||
                                e.errorCode == ICameraService.ERROR_CAMERA_IN_USE) {
                            // Per API docs, these failures call onError and throw
                            throwAsPublicException(e);
                        }
                    } else {
                        // Unexpected failure - rethrow
                        throwAsPublicException(e);
                    }
                } catch (RemoteException e) {
                    // Camera service died - act as if it's a CAMERA_DISCONNECTED case
                    ServiceSpecificException sse = new ServiceSpecificException(
                        ICameraService.ERROR_DISCONNECTED,
                        "Camera service is currently unavailable");
                    deviceImpl.setRemoteFailure(sse);
                    throwAsPublicException(sse);
                }
    
                // TODO: factor out callback to be non-nested, then move setter to constructor
                // For now, calling setRemoteDevice will fire initial
                // onOpened/onUnconfigured callbacks.
                // This function call may post onDisconnected and throw CAMERA_DISCONNECTED if
                // cameraUser dies during setup.
                // Hand the cameraUser object obtained from opening the camera to the CameraDeviceImpl (deviceImpl)
                deviceImpl.setRemoteDevice(cameraUser);
                device = deviceImpl;
            }
            
            // Return the CameraDeviceImpl object (deviceImpl)
            return device;
        }

     From the analysis of configureStreamsChecked above, stream configuration boils down to three steps:

    •     mRemoteDevice.beginConfigure(); // begin the configuration
    •     mRemoteDevice.deleteStream(streamId) and mRemoteDevice.createStream(outConfig) // delete old streams and create new ones
    •     mRemoteDevice.endConfigure(operatingMode); // finish the configuration; the first two steps are preparation, this one actually configures the streams

        Let's focus on endConfigure:
      frameworks/base/core/java/android/hardware/camera2/impl/ICameraDeviceUserWrapper.java

        public void endConfigure(int operatingMode, CameraMetadataNative sessionParams)
               throws CameraAccessException {
            try {
            // Via binder IPC; the actual implementation of this interface is in CameraDeviceClient.cpp
                mRemoteDevice.endConfigure(operatingMode, (sessionParams == null) ?
                        new CameraMetadataNative() : sessionParams);
            } catch (Throwable t) {
                CameraManager.throwAsPublicException(t);
                throw new UnsupportedOperationException("Unexpected exception", t);
            }
        }

      Moving on to frameworks/av/services/camera/libcameraservice/api2/CameraDeviceClient.cpp:

    binder::Status CameraDeviceClient::endConfigure(int operatingMode,
            const hardware::camera2::impl::CameraMetadataNative& sessionParams) {
        ATRACE_CALL();
        ALOGV("%s: ending configure (%d input stream, %zu output surfaces)",
                __FUNCTION__, mInputStream.configured ? 1 : 0,
                mStreamMap.size());
    ......
      
        // Sanitize the high speed session against necessary capability bit.
        bool isConstrainedHighSpeed =
                (operatingMode == ICameraDeviceUser::CONSTRAINED_HIGH_SPEED_MODE);

        // Check whether CONSTRAINED_HIGH_SPEED_MODE is supported
        if (isConstrainedHighSpeed) {
            CameraMetadata staticInfo = mDevice->info();
            camera_metadata_entry_t entry = staticInfo.find(ANDROID_REQUEST_AVAILABLE_CAPABILITIES);
            bool isConstrainedHighSpeedSupported = false;
            for (size_t i = 0; i < entry.count; ++i) {
                uint8_t capability = entry.data.u8[i];
                if (capability == ANDROID_REQUEST_AVAILABLE_CAPABILITIES_CONSTRAINED_HIGH_SPEED_VIDEO) {
                    isConstrainedHighSpeedSupported = true;
                    break;
                }
            }
            if (!isConstrainedHighSpeedSupported) {
                String8 msg = String8::format(
                        "Camera %s: Try to create a constrained high speed configuration on a device"
                        " that doesn't support it.", mCameraIdStr.string());
                ALOGE("%s: %s", __FUNCTION__, msg.string());
                return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.string());
            }
        }

        // Once the checks pass, start configuring the streams
        status_t err = mDevice->configureStreams(sessionParams, operatingMode);
        if (err == BAD_VALUE) {
            String8 msg = String8::format("Camera %s: Unsupported set of inputs/outputs provided",
                    mCameraIdStr.string());
            ALOGE("%s: %s", __FUNCTION__, msg.string());
            res = STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.string());
        } else if (err != OK) {
            String8 msg = String8::format("Camera %s: Error configuring streams: %s (%d)",
                    mCameraIdStr.string(), strerror(-err), err);
            ALOGE("%s: %s", __FUNCTION__, msg.string());
            res = STATUS_ERROR(CameraService::ERROR_INVALID_OPERATION, msg.string());
        }

        return res;
    }

     Next we reach frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp:

    status_t Camera3Device::configureStreams(const CameraMetadata& sessionParams, int operatingMode) {
        ATRACE_CALL();
        ALOGV("%s: E", __FUNCTION__);
    
        Mutex::Autolock il(mInterfaceLock);
        Mutex::Autolock l(mLock);
    
        // In case the client doesn't include any session parameter, try a
        // speculative configuration using the values from the last cached
        // default request.
        if (sessionParams.isEmpty() &&
                ((mLastTemplateId > 0) && (mLastTemplateId < CAMERA3_TEMPLATE_COUNT)) &&
                (!mRequestTemplateCache[mLastTemplateId].isEmpty())) {
            ALOGV("%s: Speculative session param configuration with template id: %d", __func__,
                    mLastTemplateId);
            return filterParamsAndConfigureLocked(mRequestTemplateCache[mLastTemplateId],
                    operatingMode);
        }
    
        return filterParamsAndConfigureLocked(sessionParams, operatingMode);
    }

     This then calls into the following function:

    status_t Camera3Device::configureStreamsLocked(int operatingMode,
            const CameraMetadata& sessionParams, bool notifyRequestThread) {
        ATRACE_CALL();
        status_t res;
    
        if (mStatus != STATUS_UNCONFIGURED && mStatus != STATUS_CONFIGURED) {
            CLOGE("Not idle");
            return INVALID_OPERATION;
        }
    
        if (operatingMode < 0) {
            CLOGE("Invalid operating mode: %d", operatingMode);
            return BAD_VALUE;
        }
    
        // Check whether this is the constrained-high-speed operating mode
        bool isConstrainedHighSpeed =
                static_cast<int>(StreamConfigurationMode::CONSTRAINED_HIGH_SPEED_MODE) ==
                operatingMode;
    
        if (mOperatingMode != operatingMode) {
            mNeedConfig = true;
            mIsConstrainedHighSpeedConfiguration = isConstrainedHighSpeed;
            mOperatingMode = operatingMode;
        }
    
        if (!mNeedConfig) {
            ALOGV("%s: Skipping config, no stream changes", __FUNCTION__);
            return OK;
        }
    
        // Workaround for device HALv3.2 or older spec bug - zero streams requires
        // adding a dummy stream instead.
        // TODO: Bug: 17321404 for fixing the HAL spec and removing this workaround.
        if (mOutputStreams.size() == 0) {
            addDummyStreamLocked();
        } else {
            tryRemoveDummyStreamLocked();
        }
    
        // Start configuring the streams
        ALOGV("%s: Camera %s: Starting stream configuration", __FUNCTION__, mId.string());
    
        mPreparerThread->pause();
    
        camera3_stream_configuration config;
        config.operation_mode = mOperatingMode; // propagate mOperatingMode into config.operation_mode
        config.num_streams = (mInputStream != NULL) + mOutputStreams.size();
    
        Vector<camera3_stream_t*> streams;
        streams.setCapacity(config.num_streams);
        std::vector<uint32_t> bufferSizes(config.num_streams, 0);
    
    
        if (mInputStream != NULL) {
            camera3_stream_t *inputStream;
            inputStream = mInputStream->startConfiguration();
            if (inputStream == NULL) {
                CLOGE("Can't start input stream configuration");
                cancelStreamsConfigurationLocked();
                return INVALID_OPERATION;
            }
            streams.add(inputStream);
        }
    
        // Configure the output streams
        for (size_t i = 0; i < mOutputStreams.size(); i++) {
    
            // Don't configure bidi streams twice, nor add them twice to the list
            if (mOutputStreams[i].get() ==
                static_cast<Camera3StreamInterface*>(mInputStream.get())) {
    
                config.num_streams--;
                continue;
            }
    
            camera3_stream_t *outputStream;
            outputStream = mOutputStreams.editValueAt(i)->startConfiguration();
            if (outputStream == NULL) {
                CLOGE("Can't start output stream configuration");
                cancelStreamsConfigurationLocked();
                return INVALID_OPERATION;
            }
            streams.add(outputStream);
    
            if (outputStream->format == HAL_PIXEL_FORMAT_BLOB &&
                    outputStream->data_space == HAL_DATASPACE_V0_JFIF) {
                size_t k = i + ((mInputStream != nullptr) ? 1 : 0); // Input stream if present should
                                                                    // always occupy the initial entry.
                bufferSizes[k] = static_cast<uint32_t>(
                        getJpegBufferSize(outputStream->width, outputStream->height));
            }
        }
    
        config.streams = streams.editArray();
    
        // Do the HAL configuration; will potentially touch stream
        // max_buffers, usage, priv fields.
    
        const camera_metadata_t *sessionBuffer = sessionParams.getAndLock();
    
        // Notify the HAL layer to configure the streams
        res = mInterface->configureStreams(sessionBuffer, &config, bufferSizes);
        sessionParams.unlock(sessionBuffer);
    ......
      
        return OK;
    }

     Finally, the HIDL interface is called, still in frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp:

    status_t Camera3Device::HalInterface::configureStreams(const camera_metadata_t *sessionParams,
            camera3_stream_configuration *config, const std::vector<uint32_t>& bufferSizes) {
        ATRACE_NAME("CameraHal::configureStreams");
        if (!valid()) return INVALID_OPERATION;
        status_t res = OK;

    ......
      
        // See if we have v3.4 or v3.3 HAL
        if (mHidlSession_3_4 != nullptr) {
            // We do; use v3.4 for the call
            ALOGV("%s: v3.4 device found", __FUNCTION__);
            device::V3_4::HalStreamConfiguration finalConfiguration3_4;
            auto err = mHidlSession_3_4->configureStreams_3_4(requestedConfiguration3_4,
                    [&status, &finalConfiguration3_4]
                    (common::V1_0::Status s, const device::V3_4::HalStreamConfiguration& halConfiguration) {
                        finalConfiguration3_4 = halConfiguration;
                        status = s;
                    });
            if (!err.isOk()) {
                ALOGE("%s: Transaction error: %s", __FUNCTION__, err.description().c_str());
                return DEAD_OBJECT;
            }
            finalConfiguration.streams.resize(finalConfiguration3_4.streams.size());
            for (size_t i = 0; i < finalConfiguration3_4.streams.size(); i++) {
                finalConfiguration.streams[i] = finalConfiguration3_4.streams[i].v3_3;
            }
        } else if (mHidlSession_3_3 != nullptr) {
            // We do; use v3.3 for the call
            ALOGV("%s: v3.3 device found", __FUNCTION__);
            auto err = mHidlSession_3_3->configureStreams_3_3(requestedConfiguration3_2,
                    [&status, &finalConfiguration]
                    (common::V1_0::Status s, const device::V3_3::HalStreamConfiguration& halConfiguration) {
                        finalConfiguration = halConfiguration;
                        status = s;
                    });
            if (!err.isOk()) {
                ALOGE("%s: Transaction error: %s", __FUNCTION__, err.description().c_str());
                return DEAD_OBJECT;
            }
        } else {
            // We don't; use v3.2 call and construct a v3.3 HalStreamConfiguration
            ALOGV("%s: v3.2 device found", __FUNCTION__);
            HalStreamConfiguration finalConfiguration_3_2;
            auto err = mHidlSession->configureStreams(requestedConfiguration3_2,
                    [&status, &finalConfiguration_3_2]
                    (common::V1_0::Status s, const HalStreamConfiguration& halConfiguration) {
                        finalConfiguration_3_2 = halConfiguration;
                        status = s;
                    });
            if (!err.isOk()) {
                ALOGE("%s: Transaction error: %s", __FUNCTION__, err.description().c_str());
                return DEAD_OBJECT;
            }
            finalConfiguration.streams.resize(finalConfiguration_3_2.streams.size());
            for (size_t i = 0; i < finalConfiguration_3_2.streams.size(); i++) {
                finalConfiguration.streams[i].v3_2 = finalConfiguration_3_2.streams[i];
                finalConfiguration.streams[i].overrideDataSpace =
                        requestedConfiguration3_2.streams[i].dataSpace;
            }
        }
    ......

        return res;
    }

     The various versions of the HIDL interface live under hardware/interfaces/camera/device.

     The HAL-layer code is as follows: vendor/qcom/proprietary/camx/src/core/hal/camxhal3.cpp

    static int configure_streams(
        const struct camera3_device*    pCamera3DeviceAPI,
        camera3_stream_configuration_t* pStreamConfigsAPI)
    {
        CAMX_ENTRYEXIT_SCOPE(CamxLogGroupHAL, SCOPEEventHAL3ConfigureStreams);
    ......
    
            Camera3StreamConfig* pStreamConfigs = reinterpret_cast<Camera3StreamConfig*>(pStreamConfigsAPI);
    
            result = pHALDevice->ConfigureStreams(pStreamConfigs);
    
            if ((CamxResultSuccess != result) && (CamxResultEInvalidArg != result))
            {
                // HAL interface requires -ENODEV (EFailed) if a fatal error occurs
                result = CamxResultEFailed;
            }
            if (CamxResultSuccess == result)
            {
                for (UINT32 stream = 0; stream < pStreamConfigsAPI->num_streams; stream++)
                {
                    CAMX_ASSERT(NULL != pStreamConfigsAPI->streams[stream]);
    
                    if (NULL == pStreamConfigsAPI->streams[stream])
                    {
                        CAMX_LOG_ERROR(CamxLogGroupHAL, "Invalid argument 2 for configure_streams()");
                        // HAL interface requires -EINVAL (EInvalidArg) for invalid arguments
                        result = CamxResultEInvalidArg;
                        break;
                    }
                    else
                    {
                        CAMX_LOG_CONFIG(CamxLogGroupHAL, " FINAL stream[%d] = %p - info:", stream,
                            pStreamConfigsAPI->streams[stream]);
                        CAMX_LOG_CONFIG(CamxLogGroupHAL, "            format       : %d, %s",
                            pStreamConfigsAPI->streams[stream]->format,
                            FormatToString(pStreamConfigsAPI->streams[stream]->format));
                        CAMX_LOG_CONFIG(CamxLogGroupHAL, "            width        : %d",
                            pStreamConfigsAPI->streams[stream]->width);
                        CAMX_LOG_CONFIG(CamxLogGroupHAL, "            height       : %d",
                            pStreamConfigsAPI->streams[stream]->height);
                        CAMX_LOG_CONFIG(CamxLogGroupHAL, "            stream_type  : %08x, %s",
                            pStreamConfigsAPI->streams[stream]->stream_type,
                            StreamTypeToString(pStreamConfigsAPI->streams[stream]->stream_type));
                        CAMX_LOG_CONFIG(CamxLogGroupHAL, "            usage        : %08x",
                            pStreamConfigsAPI->streams[stream]->usage);
                        CAMX_LOG_CONFIG(CamxLogGroupHAL, "            max_buffers  : %d",
                            pStreamConfigsAPI->streams[stream]->max_buffers);
                        CAMX_LOG_CONFIG(CamxLogGroupHAL, "            rotation     : %08x, %s",
                            pStreamConfigsAPI->streams[stream]->rotation,
                            RotationToString(pStreamConfigsAPI->streams[stream]->rotation));
                        CAMX_LOG_CONFIG(CamxLogGroupHAL, "            data_space   : %08x, %s",
                            pStreamConfigsAPI->streams[stream]->data_space,
                            DataSpaceToString(pStreamConfigsAPI->streams[stream]->data_space));
                        CAMX_LOG_CONFIG(CamxLogGroupHAL, "            priv         : %p",
                            pStreamConfigsAPI->streams[stream]->priv);
                        CAMX_LOG_CONFIG(CamxLogGroupHAL, "            reserved[0]         : %p",
                            pStreamConfigsAPI->streams[stream]->reserved[0]);
                        CAMX_LOG_CONFIG(CamxLogGroupHAL, "            reserved[1]         : %p",
                            pStreamConfigsAPI->streams[stream]->reserved[1]);
    
                        Camera3HalStream* pHalStream =
                            reinterpret_cast<Camera3HalStream*>(pStreamConfigsAPI->streams[stream]->reserved[0]);
                        if (pHalStream != NULL)
                        {
                            // GetInstance() also initializes the capabilities of the current device and sensor
                            if (TRUE == HwEnvironment::GetInstance()->GetStaticSettings()->enableHALFormatOverride)
                            {
                                pStreamConfigsAPI->streams[stream]->format =
                                    static_cast<HALPixelFormat>(pHalStream->overrideFormat);
                            }
                            CAMX_LOG_CONFIG(CamxLogGroupHAL,
                                "   pHalStream: %p format : 0x%x, overrideFormat : 0x%x consumer usage: %llx, producer usage: %llx",
                                pHalStream, pStreamConfigsAPI->streams[stream]->format,
                                pHalStream->overrideFormat, pHalStream->consumerUsage, pHalStream->producerUsage);
                        }
                    }
                }
            }
     ......
      return Utils::CamxResultToErrno(result);
    }

     The implementation of ConfigureStreams is in vendor/qcom/proprietary/camx/src/core/hal/camxhaldevice.cpp:

    CamxResult HALDevice::ConfigureStreams(
        Camera3StreamConfig* pStreamConfigs)
    {
        CamxResult result = CamxResultSuccess;
    
        // Validate the incoming stream configurations
        result = CheckValidStreamConfig(pStreamConfigs);
    
    ......
      
        if (CamxResultSuccess == result)
        {
            ClearFrameworkRequestBuffer();

            m_numPipelines = 0;

            if (TRUE == m_bCHIModuleInitialized)
            {
                GetCHIAppCallbacks()->chi_teardown_override_session(
                    reinterpret_cast<camera3_device*>(&m_camera3Device), 0, NULL);
            }

            m_bCHIModuleInitialized = CHIModuleInitialize(pStreamConfigs); // Initialize the CHI module

            ......
        }

        return result;
    }

      CHIModuleInitialize invokes the callback registered by the CHI layer:

    ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
    // HALDevice::CHIModuleInitialize
    ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
    BOOL HALDevice::CHIModuleInitialize(
        Camera3StreamConfig* pStreamConfigs)
    {
        BOOL isOverrideEnabled = FALSE;
    
        if (TRUE == HAL3Module::GetInstance()->IsCHIOverrideModulePresent())
        {
            /// @todo (CAMX-1518) Handle private data from Override module
            VOID*                   pPrivateData;
            chi_hal_callback_ops_t* pCHIAppCallbacks  = GetCHIAppCallbacks();
         
            // Invoke the callback registered by the CHI layer
            pCHIAppCallbacks->chi_initialize_override_session(GetCameraId(),
                reinterpret_cast<const camera3_device_t*>(&m_camera3Device),
                &m_HALCallbacks,
                reinterpret_cast<camera3_stream_configuration_t*>(pStreamConfigs),
                &isOverrideEnabled,
                &pPrivateData);
        }

        return isOverrideEnabled;
    }

     The CHI-layer callback is implemented in vendor/qcom/proprietary/chi-cdk/vendor/chioverride/default/chxextensioninterface.cpp:

    CDKResult ExtensionModule::InitializeOverrideSession(
        uint32_t                        logicalCameraId,
        const camera3_device_t*         pCamera3Device,
        const chi_hal_ops_t*            chiHalOps,
        camera3_stream_configuration_t* pStreamConfig,
        int*                            pIsOverrideEnabled,
        VOID**                          pPrivate)
    {
        CDKResult          result             = CDKResultSuccess;
        UINT32             modeCount          = 0;
        ChiSensorModeInfo* pAllModes          = NULL;
        UINT32             fps                = *m_pDefaultMaxFPS;
        BOOL               isVideoMode        = FALSE;
        uint32_t           operation_mode;
        static BOOL        fovcModeCheck      = EnableFOVCUseCase();
        UsecaseId          selectedUsecaseId  = UsecaseId::NoMatch;
        UINT               minSessionFps      = 0;
        UINT               maxSessionFps      = 0;
    ......
        if ((isVideoMode == TRUE) && (operation_mode != 0))
        {
            UINT32 numSensorModes = m_logicalCameraInfo[logicalCameraId].m_cameraCaps.numSensorModes;
            // Sensor mode information; the HFR-related fields include frameRate, batchedFrames, etc.
            CHISENSORMODEINFO* pAllSensorModes = m_logicalCameraInfo[logicalCameraId].pSensorModeInfo;

            if ((operation_mode - 1) >= numSensorModes)
            {
                result = CDKResultEOverflow;
                CHX_LOG_ERROR("operation_mode: %d, numSensorModes: %d", operation_mode, numSensorModes);
            }
            else
            {
                fps = pAllSensorModes[operation_mode - 1].frameRate;
            }
        }

        if (CDKResultSuccess == result)
        {
    #if defined(CAMX_ANDROID_API) && (CAMX_ANDROID_API >= 28) // Android-P or better
            camera_metadata_t* metadata = const_cast<camera_metadata_t*>(pStreamConfig->session_parameters);
            camera_metadata_entry_t entry = { 0 };
            entry.tag = ANDROID_CONTROL_AE_TARGET_FPS_RANGE;

            // The client may choose to send NULL sesssion parameter, which is fine. For example, torch mode
            // will have NULL session param.
            if (metadata != NULL)
            {
                // Look up the entry for this tag and copy its data into the entry passed in
                int ret = find_camera_metadata_entry(metadata, entry.tag, &entry);
                if (ret == 0)
                {
                    minSessionFps   = entry.data.i32[0];
                    maxSessionFps   = entry.data.i32[1];
                    m_usecaseMaxFPS = maxSessionFps;
                }
            }
    #endif

            if ((StreamConfigModeConstrainedHighSpeed == pStreamConfig->operation_mode) ||
                (StreamConfigModeSuperSlowMotionFRC == pStreamConfig->operation_mode))
            {
                // For HFR mode:
                // 1) Find the HFRVideoSizes entry that matches the video/preview streams.
                //    Note: the preview and recording stream sizes must be identical, otherwise
                //    creating the high speed session will fail.
                // 2) If a single entry is found in SupportedHFRVideoSizes, pick the batch size
                //    from that entry.
                SearchNumBatchedFrames(logicalCameraId, pStreamConfig,
                                       &m_usecaseNumBatchedFrames, &m_usecaseMaxFPS, maxSessionFps);
                if (480 > m_usecaseMaxFPS)
                {
                    m_CurrentpowerHint = PERF_LOCK_POWER_HINT_VIDEO_ENCODE_HFR;
                }
                else
                {
                    // For 480FPS or higher, require more aggresive power hint
                    m_CurrentpowerHint = PERF_LOCK_POWER_HINT_VIDEO_ENCODE_HFR_480FPS;
                }
            }
            else
            {
                // Not a HFR usecase, batch frames value need to be set to 1.
                m_usecaseNumBatchedFrames = 1;
                if (maxSessionFps == 0)
                {
                    m_usecaseMaxFPS = fps;
                }
                if (TRUE == isVideoMode)
                {
                    if (30 >= m_usecaseMaxFPS)
                    {
                        m_CurrentpowerHint = PERF_LOCK_POWER_HINT_VIDEO_ENCODE;
                    }
                    else
                    {
                        m_CurrentpowerHint = PERF_LOCK_POWER_HINT_VIDEO_ENCODE_60FPS;
                    }
                }
                else
                {
                    m_CurrentpowerHint = PERF_LOCK_POWER_HINT_PREVIEW;
                }
            }

            if ((NULL != m_pPerfLockManager[logicalCameraId]) && (m_CurrentpowerHint != m_previousPowerHint))
            {
                m_pPerfLockManager[logicalCameraId]->ReleasePerfLock(m_previousPowerHint);
            }

            // Example [B == batch]: (240 FPS / 4 FPB = 60 BPS) / 30 FPS (Stats frequency goal) = 2 BPF
            // i.e. skip every other stats
            *m_pStatsSkipPattern = m_usecaseMaxFPS / m_usecaseNumBatchedFrames / 30;
            if (*m_pStatsSkipPattern < 1)
            {
                *m_pStatsSkipPattern = 1;
            }

            m_VideoHDRMode       = (StreamConfigModeVideoHdr == pStreamConfig->operation_mode);
            m_torchWidgetUsecase = (StreamConfigModeQTITorchWidget == pStreamConfig->operation_mode);

            // this check is introduced to avoid set *m_pEnableFOVC == 1 if fovcEnable is disabled in
            // overridesettings & fovc bit is set in operation mode.
            // as well as to avoid set, when we switch Usecases.
            if (TRUE == fovcModeCheck)
            {
                *m_pEnableFOVC = ((pStreamConfig->operation_mode & StreamConfigModeQTIFOVC) ==
                                  StreamConfigModeQTIFOVC) ? 1 : 0;
            }

            SetHALOps(chiHalOps, logicalCameraId);

            m_logicalCameraInfo[logicalCameraId].m_pCamera3Device = pCamera3Device;

            // Find the usecase that matches this camera info and stream configuration
            selectedUsecaseId = m_pUsecaseSelector->GetMatchingUsecase(&m_logicalCameraInfo[logicalCameraId],
                                                                       pStreamConfig);

            CHX_LOG_CONFIG("Session_parameters FPS range %d:%d, BatchSize: %u FPS: %u SkipPattern: %u, "
                           "cameraId = %d selected use case = %d",
                           minSessionFps, maxSessionFps, m_usecaseNumBatchedFrames, m_usecaseMaxFPS,
                           *m_pStatsSkipPattern, logicalCameraId, selectedUsecaseId);

            // FastShutter mode supported only in ZSL usecase.
            if ((pStreamConfig->operation_mode == StreamConfigModeFastShutter) &&
                (UsecaseId::PreviewZSL != selectedUsecaseId))
            {
                pStreamConfig->operation_mode = StreamConfigModeNormal;
            }
            m_operationMode[logicalCameraId] = pStreamConfig->operation_mode;
        }

        if (UsecaseId::NoMatch != selectedUsecaseId)
        {
            // Create the usecase object for the selected UsecaseId; for HFR the selected usecase is the default one
            m_pSelectedUsecase[logicalCameraId] =
                m_pUsecaseFactory->CreateUsecaseObject(&m_logicalCameraInfo[logicalCameraId],
                                                       selectedUsecaseId, pStreamConfig);
        }
    .....

        return result;
    }
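
     The batching arithmetic buried in the code above is worth spelling out. The plain-Java sketch below is an illustration added here; it mirrors the m_pStatsSkipPattern comment in InitializeOverrideSession rather than the CHI code itself:

        // (240 fps / 4 frames per batch = 60 batches per second) / 30 fps stats goal = skip pattern 2,
        // i.e. run 3A stats only on every other batch.
        static int statsSkipPattern(int usecaseMaxFps, int numBatchedFrames) {
            int skip = usecaseMaxFps / numBatchedFrames / 30;
            return Math.max(skip, 1); // never drop below 1, matching the clamp in the C++ code
        }

        // statsSkipPattern(240, 4) == 2
        // statsSkipPattern(120, 4) == 1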

      This in turn calls AdvancedCameraUsecase::Create, implemented in vendor/qcom/proprietary/chi-cdk/vendor/chioverride/default/chxadvancedcamerausecase.cpp:

    AdvancedCameraUsecase* AdvancedCameraUsecase::Create(
        LogicalCameraInfo*              pCameraInfo,   ///< Camera info
        camera3_stream_configuration_t* pStreamConfig, ///< Stream configuration
        UsecaseId                       usecaseId)     ///< Identifier for usecase function
    {
        CDKResult              result                 = CDKResultSuccess;
        AdvancedCameraUsecase* pAdvancedCameraUsecase = CHX_NEW AdvancedCameraUsecase;
    
        if ((NULL != pAdvancedCameraUsecase) && (NULL != pStreamConfig))
        {
            // Initialize() in turn calls CameraUsecaseBase::Initialize(m_pCallbacks),
            // which then calls CameraUsecaseBase::CreatePipeline
            result = pAdvancedCameraUsecase->Initialize(pCameraInfo, pStreamConfig, usecaseId);
    
            if (CDKResultSuccess != result)
            {
                pAdvancedCameraUsecase->Destroy(FALSE);
                pAdvancedCameraUsecase = NULL;
            }
        }
        else
        {
            result = CDKResultEFailed;
        }
    
        return pAdvancedCameraUsecase;
    }

     At this point the CHI layer has selected the matching UsecaseId and created the required pipelines, based on the app's parameters, the platform and sensor information, and the topology XML structure.

     Constraints when configuring HFR streams (an app-side validation sketch follows this list):

    •     The high speed streams must be configured via createConstrainedHighSpeedCaptureSession
    •     Only one or two streams can be configured: one preview stream and one video recording stream
    •     The preview stream's usage must be GRALLOC_USAGE_HW_TEXTURE | GRALLOC_USAGE_HW_COMPOSER | GRALLOC_USAGE_HW_RENDER
    •     The recording stream's usage must be GRALLOC_USAGE_HW_VIDEO_ENCODER
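
     As a companion to the constraints above, an application can pre-validate its chosen size and frame rate against the advertised high speed configurations before attempting to create the constrained session. The sketch below is a hedged illustration using only public camera2 APIs; the usage-flag checks themselves are enforced inside the framework, so the app-side check is limited to size and fps:

        // Hedged sketch: check that (videoSize, fps) is advertised for constrained high speed use.
        // The preview surface should come from a SurfaceTexture/SurfaceView and the recording surface
        // from MediaRecorder/MediaCodec, which is what gives them the usage flags listed above.
        static boolean isHighSpeedSupported(CameraCharacteristics characteristics, Size videoSize, int fps) {
            StreamConfigurationMap map =
                    characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map == null) {
                return false;
            }
            for (Size size : map.getHighSpeedVideoSizes()) {
                if (!size.equals(videoSize)) {
                    continue;
                }
                for (Range<Integer> range : map.getHighSpeedVideoFpsRangesFor(size)) {
                    // HFR/HSR recording uses a fixed-rate range such as [120, 120].
                    if (range.getLower() == fps && range.getUpper() == fps) {
                        return true;
                    }
                }
            }
            return false;
        }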

     

    How does the Qualcomm platform obtain the platform and camera sensor capabilities?

      In the Snapdragon camera's settings screen you can choose a video quality and a corresponding Video High FrameRate. Which entries appear in these lists is decided when the camera service starts, based on the combined capabilities of the platform and of the camera sensor output.
      After a video quality is selected, the HFR option list is refreshed; this boils down to querying which FPS values the chosen resolution supports. The flow is as follows:

     (1) packages/apps/SnapdragonCamera/src/com/android/camera/SettingsManager.java

        // Query the supported fps values and update the option list
        private void filterHFROptions() {
            ListPreference hfrPref = mPreferenceGroup.findPreference(KEY_VIDEO_HIGH_FRAME_RATE);
            if (hfrPref != null) {
                hfrPref.reloadInitialEntriesAndEntryValues();
                if (filterUnsupportedOptions(hfrPref, getSupportedHighFrameRate())) {
                    mFilteredKeys.add(hfrPref.getKey());
                }
            }
        }

        private List<String> getSupportedHighFrameRate() {
            ArrayList<String> supported = new ArrayList<String>();
            supported.add("off");
            ListPreference videoQuality = mPreferenceGroup.findPreference(KEY_VIDEO_QUALITY);
            ListPreference videoEncoder = mPreferenceGroup.findPreference(KEY_VIDEO_ENCODER);
            if (videoQuality == null || videoEncoder == null) return supported;
            String videoSizeStr = videoQuality.getValue();
            int videoEncoderNum = SettingTranslation.getVideoEncoder(videoEncoder.getValue());
            VideoCapabilities videoCapabilities = null;
            boolean findVideoEncoder = false;
            if (videoSizeStr != null) {
                Size videoSize = parseSize(videoSizeStr);
                MediaCodecList allCodecs = new MediaCodecList(MediaCodecList.ALL_CODECS);
                for (MediaCodecInfo info : allCodecs.getCodecInfos()) {
                    if (!info.isEncoder() || info.getName().contains("google")) continue;
                    for (String type : info.getSupportedTypes()) {
                        if ((videoEncoderNum == MediaRecorder.VideoEncoder.MPEG_4_SP && type.equalsIgnoreCase(MediaFormat.MIMETYPE_VIDEO_MPEG4))
                                || (videoEncoderNum == MediaRecorder.VideoEncoder.H263 && type.equalsIgnoreCase(MediaFormat.MIMETYPE_VIDEO_H263))
                                || (videoEncoderNum == MediaRecorder.VideoEncoder.H264 && type.equalsIgnoreCase(MediaFormat.MIMETYPE_VIDEO_AVC))
                                || (videoEncoderNum == MediaRecorder.VideoEncoder.HEVC && type.equalsIgnoreCase(MediaFormat.MIMETYPE_VIDEO_HEVC))) {
                            CodecCapabilities codecCapabilities = info.getCapabilitiesForType(type);
                            videoCapabilities = codecCapabilities.getVideoCapabilities();
                            findVideoEncoder = true;
                            break;
                        }
                    }
                    if (findVideoEncoder) break;
                }
    
                try {
                    // Get the fps ranges supported by the currently selected video size
                    Range[] range = getSupportedHighSpeedVideoFPSRange(mCameraId, videoSize);
                    for (Range r : range) {
                        // To support HFR for both preview and recording,
                        // minmal FPS needs to be equal to maximum FPS
                        if ((int) r.getUpper() == (int) r.getLower()) {
                            if (videoCapabilities != null) {
                                if (videoCapabilities.areSizeAndRateSupported(
                                        videoSize.getWidth(), videoSize.getHeight(), (int) r.getUpper())) {
                                    supported.add("hfr" + String.valueOf(r.getUpper()));
                                    supported.add("hsr" + String.valueOf(r.getUpper()));
                                }
                            }
                        }
                    }
                } catch (IllegalArgumentException ex) {
                    Log.w(TAG, "HFR is not supported for this resolution " + ex);
                }
          .......
            }
            return supported;
        }

     (2) frameworks/base/core/java/android/hardware/camera2/params/StreamConfigurationMap.java

        public Range<Integer>[] getHighSpeedVideoFpsRangesFor(Size size) {
            // Check whether the currently selected video size supports HFR
            Integer fpsRangeCount = mHighSpeedVideoSizeMap.get(size);
            if (fpsRangeCount == null || fpsRangeCount == 0) {
                throw new IllegalArgumentException(String.format(
                        "Size %s does not support high speed video recording", size));
            }
    
            @SuppressWarnings("unchecked")
            Range<Integer>[] fpsRanges = new Range[fpsRangeCount];
            int i = 0;
            // Collect the fps ranges supported for this video size
            for (HighSpeedVideoConfiguration config : mHighSpeedVideoConfigurations) {
                if (size.equals(config.getSize())) {
                    fpsRanges[i++] = config.getFpsRange();
                }
            }
            return fpsRanges;
        }

      mHighSpeedVideoConfigurations is initialized in the following method: frameworks/base/core/java/android/hardware/camera2/impl/CameraMetadataNative.java

        private StreamConfigurationMap getStreamConfigurationMap() {
            StreamConfiguration[] configurations = getBase(
                    CameraCharacteristics.SCALER_AVAILABLE_STREAM_CONFIGURATIONS);
            StreamConfigurationDuration[] minFrameDurations = getBase(
                    CameraCharacteristics.SCALER_AVAILABLE_MIN_FRAME_DURATIONS);
            StreamConfigurationDuration[] stallDurations = getBase(
                    CameraCharacteristics.SCALER_AVAILABLE_STALL_DURATIONS);
            StreamConfiguration[] depthConfigurations = getBase(
                    CameraCharacteristics.DEPTH_AVAILABLE_DEPTH_STREAM_CONFIGURATIONS);
            StreamConfigurationDuration[] depthMinFrameDurations = getBase(
                    CameraCharacteristics.DEPTH_AVAILABLE_DEPTH_MIN_FRAME_DURATIONS);
            StreamConfigurationDuration[] depthStallDurations = getBase(
                    CameraCharacteristics.DEPTH_AVAILABLE_DEPTH_STALL_DURATIONS);

            // Get the high speed video configurations reported by the HAL (camx) layer
            HighSpeedVideoConfiguration[] highSpeedVideoConfigurations = getBase(
                    CameraCharacteristics.CONTROL_AVAILABLE_HIGH_SPEED_VIDEO_CONFIGURATIONS);
            ReprocessFormatsMap inputOutputFormatsMap = getBase(
                    CameraCharacteristics.SCALER_AVAILABLE_INPUT_OUTPUT_FORMATS_MAP);
            int[] capabilities = getBase(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);

            boolean listHighResolution = false;
            for (int capability : capabilities) {
                if (capability == CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_BURST_CAPTURE) {
                    listHighResolution = true;
                    break;
                }
            }

            // Build the StreamConfigurationMap, which validates the configuration data
            return new StreamConfigurationMap(
                    configurations, minFrameDurations, stallDurations,
                    depthConfigurations, depthMinFrameDurations, depthStallDurations,
                    highSpeedVideoConfigurations, inputOutputFormatsMap, listHighResolution);
        }

    (3) The cameraInfo at the HAL layer is the one obtained when ConfigureStreams was analyzed in Part 2 of this article: vendor/qcom/proprietary/camx/src/core/hal/camxhal3.cpp calls pHALDevice->ConfigureStreams(pStreamConfigs),
       which leads into vendor/qcom/proprietary/camx/src/core/hal/camxhaldevice.cpp:

    ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
    // HALDevice::ConfigureStreams
    ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
    CamxResult HALDevice::ConfigureStreams(
        Camera3StreamConfig* pStreamConfigs)
    {
        CamxResult result = CamxResultSuccess;
    
        // Validate the incoming stream configurations 
        result = CheckValidStreamConfig(pStreamConfigs); // internally calls pHWEnvironment->GetCameraInfo(logicalCameraId, &cameraInfo)
    
        if ((StreamConfigModeConstrainedHighSpeed == pStreamConfigs->operationMode) ||
            (StreamConfigModeSuperSlowMotionFRC == pStreamConfigs->operationMode))
        {
            SearchNumBatchedFrames (pStreamConfigs, &m_usecaseNumBatchedFrames, &m_FPSValue);
            CAMX_ASSERT(m_usecaseNumBatchedFrames > 1);
        }
        else
        {
            // Not a HFR usecase batch frames value need to set to 1.
            m_usecaseNumBatchedFrames = 1;
        }
    
       ......
     
        return result;
    }

    (4) The platform and sensor capabilities are initialized in vendor/qcom/proprietary/camx/src/core/camxhwenvironment.cpp:

    VOID HwEnvironment::InitCaps()
    {
    ......
        if (CamxResultSuccess == result)
        {
            // The platform and sensor capabilities are mainly initialized by the calls below
            ProbeImageSensorModules();    // Creates the ImageSensorModuleDataManager; its Initialize() calls
                                          // CreateAllSensorModuleSetManagers, which loads the sensor .bin files
            EnumerateDevices();
            InitializeSensorSubModules();
            InitializeSensorStaticCaps(); // Eventually calls ImageSensorModuleData::GetStaticCaps to obtain
                                          // the sensor capabilities

            result = m_staticEntryMethods.GetStaticCaps(&m_platformCaps[0]);

            // copy the static capacity to remaining sensor's
            for (UINT index = 1; index < m_numberSensors; index++)
            {
                Utils::Memcpy(&m_platformCaps[index], &m_platformCaps[0], sizeof(m_platformCaps[0]));
            }

            if (NULL != m_pOEMInterface->pInitializeExtendedPlatformStaticCaps)
            {
                m_pOEMInterface->pInitializeExtendedPlatformStaticCaps(&m_platformCaps[0], m_numberSensors);
            }
        }
    ......
    }

    (5) When ProbeImageSensorModules creates the ImageSensorModuleDataManager, it loads the camera module's .bin file to obtain the sensor information: vendor/qcom/proprietary/camx/src/core/camximagesensormoduledatamanager.cpp

    CamxResult ImageSensorModuleDataManager::CreateAllSensorModuleSetManagers()
    {
        CamxResult                   result                  = CamxResultSuccess;
        ImageSensorModuleSetManager* pSensorModuleSetManager = NULL;
        UINT16                       fileCount               = 0;
        CHAR                         binaryFiles[MaxSensorModules][FILENAME_MAX];
    
        // The 8150 MTP currently uses the ov12a10 (wide) camera module, so com.qti.sensormodule.ofilm_ov12a10.bin is loaded
        fileCount = OsUtils::GetFilesFromPath(SensorModulesPath, FILENAME_MAX, &binaryFiles[0][0], "*", "sensormodule", "*", "bin");
        CAMX_ASSERT((fileCount != 0) && (fileCount < MaxSensorModules));
    
        m_numSensorModuleManagers = 0;
    
        if ((fileCount == 0) || (fileCount >= MaxSensorModules))
        {
            CAMX_LOG_ERROR(CamxLogGroupSensor, "Invalid fileCount", fileCount);
            result = CamxResultEFailed;
        }
        else
        {
            for (UINT i = 0; i < fileCount; i++)
            {
                result = GetSensorModuleManagerObj(&binaryFiles[i][0], &pSensorModuleSetManager);
                if (CamxResultSuccess == result)
                {
                    m_pSensorModuleManagers[m_numSensorModuleManagers++] = pSensorModuleSetManager;
                }
                else
                {
                    CAMX_LOG_ERROR(CamxLogGroupSensor,
                    "GetSensorModuleManagerObj failed i: %d binFile: %s",
                    i, &binaryFiles[i][0]);
                }
            }
    
            CAMX_ASSERT(m_numSensorModuleManagers > 0);
            if (0 == m_numSensorModuleManagers)
            {
                CAMX_LOG_ERROR(CamxLogGroupSensor, "Invalid number of sensor module managers");
                result = CamxResultEFailed;
            }
        }
    
        return result;
    }

     Call sequence diagram for obtaining the sensor static capability:

      

      The com.qti.sensormodule.ofilm_ov12a10.bin above can be regenerated by editing the corresponding XML and rebuilding; the path is vendor/qcom/proprietary/chi-cdk/vendor/sensor/default/ov12a10.

      

     From ov12a10_sensor.xml we can see that 1080p supports at most 60 fps:

      

      Note that after changing the frameRate parameter in the XML to 120 and updating the .bin, a 120 fps option does indeed appear in the app's settings. But if the sensor can only output 1080p@60fps, the recording will stutter: the sensor's output frame rate is lower than the encoding rate, so many duplicate frames are inserted.

      PS: an HFR usecase requires a frame rate of at least 120 fps.
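
      As a rough illustration of the stutter described above (plain Java added here, not from the source), the shortfall between the advertised rate and what the sensor mode can actually deliver shows up as repeated frames in the encoded clip:

        // Fraction of encoded frames that end up being duplicates when the sensor underruns.
        static double duplicateFrameRatio(int advertisedFps, int actualSensorFps) {
            if (advertisedFps <= actualSensorFps) {
                return 0.0; // the sensor keeps up, no padding needed
            }
            return 1.0 - (double) actualSensorFps / advertisedFps;
        }

        // duplicateFrameRatio(120, 60) == 0.5 -> half of the saved frames are repeats, hence the judder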
