Cartographer Source Code Reading (7): How Trajectory and Pose Extrapolation Work

    This really covers two things: pose prediction from a motion-model-like extrapolator, and scan matching. Because velocities have to be computed, timestamps are essential.

    1. PoseExtrapolator fuses IMU data, odometry, and pose observations.

    The class defines three queues:

    std::deque<TimedPose> timed_pose_queue_;
    std::deque<sensor::ImuData> imu_data_;
    std::deque<sensor::OdometryData> odometry_data_;
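
    For reference, the element types of these queues look roughly as follows. This is a sketch inferred from how the fields are used in the code below; see pose_extrapolator.h and the sensor headers for the authoritative definitions:

    struct TimedPose {                // defined inside PoseExtrapolator
      common::Time time;
      transform::Rigid3d pose;
    };

    struct ImuData {                  // sensor::ImuData
      common::Time time;
      Eigen::Vector3d linear_acceleration;
      Eigen::Vector3d angular_velocity;
    };

    struct OdometryData {             // sensor::OdometryData
      common::Time time;
      transform::Rigid3d pose;
    };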

    It also defines (a) members holding the linear and angular velocity computed from poses

      Eigen::Vector3d linear_velocity_from_poses_ = Eigen::Vector3d::Zero();
      Eigen::Vector3d angular_velocity_from_poses_ = Eigen::Vector3d::Zero();

    and (b) members holding the linear and angular velocity computed from odometry (a sketch of the underlying math follows these declarations):

    Eigen::Vector3d linear_velocity_from_odometry_ = Eigen::Vector3d::Zero();
    Eigen::Vector3d angular_velocity_from_odometry_ = Eigen::Vector3d::Zero();
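
    Both (a) and (b) are computed the same way: take two timestamped poses and divide the pose difference by the time difference. A minimal, self-contained sketch of that math (plain Eigen; the function and parameter names are mine, not Cartographer's):

    #include <Eigen/Geometry>

    // Average linear and angular velocity between pose (R0, t0) and pose (R1, t1)
    // observed delta_seconds later. Mirrors UpdateVelocitiesFromPoses() shown
    // further below.
    void VelocitiesFromTwoPoses(const Eigen::Quaterniond& R0, const Eigen::Vector3d& t0,
                                const Eigen::Quaterniond& R1, const Eigen::Vector3d& t1,
                                double delta_seconds,
                                Eigen::Vector3d* linear_velocity,
                                Eigen::Vector3d* angular_velocity) {
      // Linear velocity: translation difference over elapsed time.
      *linear_velocity = (t1 - t0) / delta_seconds;
      // Angular velocity: relative rotation expressed as an angle-axis vector,
      // divided by elapsed time.
      const Eigen::AngleAxisd delta_rotation(R0.inverse() * R1);
      *angular_velocity = delta_rotation.angle() * delta_rotation.axis() / delta_seconds;
    }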

    The extrapolator is driven by three kinds of messages, polled in turn: IMU messages, odometry messages, and laser range messages. There are the following cases:

    1) Without IMU or odometry

      Only AddPose is executed. Note lines 12-15: the front of the queue is popped while $time - timed\_pose\_queue\_[1].time \ge pose\_queue\_duration\_$, i.e. while the poses at the front are older than the current time by more than pose_queue_duration_, so stale data gets discarded. Then the motion velocities are computed from the poses, and the IMU and odometry queues are trimmed to stay aligned in time.

     1 void PoseExtrapolator::AddPose(const common::Time time,
     2                                const transform::Rigid3d& pose) {
     3   if (imu_tracker_ == nullptr) {
     4     common::Time tracker_start = time;
     5     if (!imu_data_.empty()) {
     6       tracker_start = std::min(tracker_start, imu_data_.front().time);
     7     }
     8     imu_tracker_ =
     9         common::make_unique<ImuTracker>(gravity_time_constant_, tracker_start);
    10   }
    11   timed_pose_queue_.push_back(TimedPose{time, pose});
    12   while (timed_pose_queue_.size() > 2 &&
    13          timed_pose_queue_[1].time <= time - pose_queue_duration_) {
    14     timed_pose_queue_.pop_front();
    15   }
    16   UpdateVelocitiesFromPoses();
    17   AdvanceImuTracker(time, imu_tracker_.get());
    18   TrimImuData();
    19   TrimOdometryData();
    20   odometry_imu_tracker_ = common::make_unique<ImuTracker>(*imu_tracker_);
    21   extrapolation_imu_tracker_ = common::make_unique<ImuTracker>(*imu_tracker_);
    22 }

     Line 16 updates the linear and angular velocities computed from the poses.

    void PoseExtrapolator::UpdateVelocitiesFromPoses()
    {
      if (timed_pose_queue_.size() < 2)
      {
        // We need two poses to estimate velocities.
        return;
      }
      CHECK(!timed_pose_queue_.empty());
      const TimedPose& newest_timed_pose = timed_pose_queue_.back();
      const auto newest_time = newest_timed_pose.time;
      const TimedPose& oldest_timed_pose = timed_pose_queue_.front();
      const auto oldest_time = oldest_timed_pose.time;
      const double queue_delta = common::ToSeconds(newest_time - oldest_time);
      if (queue_delta < 0.001) {  // 1 ms
        LOG(WARNING) << "Queue too short for velocity estimation. Queue duration: "
                     << queue_delta << " ms";
        return;
      }
      const transform::Rigid3d& newest_pose = newest_timed_pose.pose;
      const transform::Rigid3d& oldest_pose = oldest_timed_pose.pose;
      linear_velocity_from_poses_ =
          (newest_pose.translation() - oldest_pose.translation()) / queue_delta;
      angular_velocity_from_poses_ =
          transform::RotationQuaternionToAngleAxisVector(
              oldest_pose.rotation().inverse() * newest_pose.rotation()) /
          queue_delta;
    }

    Line 17 calls PoseExtrapolator::AdvanceImuTracker. When no IMU data is available, it feeds angular_velocity_from_poses_ or angular_velocity_from_odometry_ into the imu_tracker instead.

    void PoseExtrapolator::AdvanceImuTracker(const common::Time time,
                                             ImuTracker* const imu_tracker) const
    {
      CHECK_GE(time, imu_tracker->time());
      if (imu_data_.empty() || time < imu_data_.front().time)
      {
        // There is no IMU data until 'time', so we advance the ImuTracker and use
        // the angular velocities from poses and fake gravity to help 2D stability.
        imu_tracker->Advance(time);
        imu_tracker->AddImuLinearAccelerationObservation(Eigen::Vector3d::UnitZ());
        imu_tracker->AddImuAngularVelocityObservation(
            odometry_data_.size() < 2 ? angular_velocity_from_poses_
                                      : angular_velocity_from_odometry_);
        return;
      }
      if (imu_tracker->time() < imu_data_.front().time) {
        // Advance to the beginning of 'imu_data_'.
        imu_tracker->Advance(imu_data_.front().time);
      }
      auto it = std::lower_bound(
          imu_data_.begin(), imu_data_.end(), imu_tracker->time(),
          [](const sensor::ImuData& imu_data, const common::Time& time) {
            return imu_data.time < time;
          });
      while (it != imu_data_.end() && it->time < time) {
        imu_tracker->Advance(it->time);
        imu_tracker->AddImuLinearAccelerationObservation(it->linear_acceleration);
        imu_tracker->AddImuAngularVelocityObservation(it->angular_velocity);
        ++it;
      }
      imu_tracker->Advance(time);
    }
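
    A brief note on ImuTracker itself (my reading of imu_tracker.cc, simplified): Advance(time) integrates the most recent angular velocity to rotate the orientation forward to the given time; AddImuLinearAccelerationObservation() low-pass filters the measured acceleration with gravity_time_constant_ to track the gravity direction and correct roll/pitch; AddImuAngularVelocityObservation() just stores the latest angular velocity for the next Advance(). That is why the no-IMU branch above feeds in a fake gravity of Eigen::Vector3d::UnitZ(): it keeps the orientation level, which helps 2D stability.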

     ExtrapolatePose(), which predicts the pose at a given time, calls ExtrapolateTranslation and ExtrapolateRotation.

    transform::Rigid3d PoseExtrapolator::ExtrapolatePose(const common::Time time) {
      const TimedPose& newest_timed_pose = timed_pose_queue_.back();
      CHECK_GE(time, newest_timed_pose.time);
      if (cached_extrapolated_pose_.time != time) {
        const Eigen::Vector3d translation =
            ExtrapolateTranslation(time) + newest_timed_pose.pose.translation();
        const Eigen::Quaterniond rotation =
            newest_timed_pose.pose.rotation() *
            ExtrapolateRotation(time, extrapolation_imu_tracker_.get());
        cached_extrapolated_pose_ =
            TimedPose{time, transform::Rigid3d{translation, rotation}};
      }
      return cached_extrapolated_pose_.pose;
    }

    As can be seen, it uses: (1) for rotation, the change in the imu_tracker's orientation; (2) for translation, the displacement obtained from the linear velocity computed from odometry or from poses.

    Eigen::Quaterniond PoseExtrapolator::ExtrapolateRotation(
        const common::Time time, ImuTracker* const imu_tracker) const {
      CHECK_GE(time, imu_tracker->time());
      AdvanceImuTracker(time, imu_tracker);
      const Eigen::Quaterniond last_orientation = imu_tracker_->orientation();
      return last_orientation.inverse() * imu_tracker->orientation();
    }

    Eigen::Vector3d PoseExtrapolator::ExtrapolateTranslation(common::Time time) {
      const TimedPose& newest_timed_pose = timed_pose_queue_.back();
      const double extrapolation_delta =
          common::ToSeconds(time - newest_timed_pose.time);
      if (odometry_data_.size() < 2) {
        return extrapolation_delta * linear_velocity_from_poses_;
      }
      return extrapolation_delta * linear_velocity_from_odometry_;
    }
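
    Putting the two together, the extrapolated pose at time $t$ is, roughly speaking (my own summary of the code above, not a formula from the source):

    $\hat{p}(t) = p_{last} + v \cdot (t - t_{last}), \qquad \hat{R}(t) = R_{last} \cdot R_{imu}(t_{last})^{-1} R_{imu}(t)$

    where $v$ is linear_velocity_from_odometry_ when at least two odometry readings are available and linear_velocity_from_poses_ otherwise, and $R_{imu}$ is the orientation maintained by extrapolation_imu_tracker_ (the copy of imu_tracker_ made in the last AddPose call).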

     2) With IMU and odometry

      The IMU has the highest rate; assume the messages arrive in the order IMU first, then odometry, and finally the laser message. The resulting interaction with the extrapolator is sketched below.
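
    A sketch of the caller side in that case (simplified from how LocalTrajectoryBuilder drives the extrapolator; surrounding code is omitted and only the PoseExtrapolator calls are shown):

    // IMU callback (highest rate): queue the sample and advance the tracker.
    extrapolator_->AddImuData(imu_data);

    // Odometry callback: once two or more samples are queued, the extrapolator
    // recomputes linear/angular_velocity_from_odometry_.
    extrapolator_->AddOdometryData(odometry_data);

    // Laser callback: predict first, then scan match, then feed the result back.
    const transform::Rigid3d pose_prediction = extrapolator_->ExtrapolatePose(time);
    // ... gravity-align the scan and run ScanMatch() (see section 2 below) ...
    extrapolator_->AddPose(time, pose_estimate);  // anchor the pose queue at the match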

    2. RealTimeCorrelativeScanMatcher handles matching a scan against a submap.

    This is implemented through real_time_correlative_scan_matcher_ and ceres_scan_matcher_:

    std::unique_ptr<transform::Rigid2d> LocalTrajectoryBuilder::ScanMatch(
        const common::Time time, const transform::Rigid2d& pose_prediction,
        const sensor::RangeData& gravity_aligned_range_data)
    {
      std::shared_ptr<const Submap> matching_submap =
          active_submaps_.submaps().front();
      // The online correlative scan matcher will refine the initial estimate for
      // the Ceres scan matcher.
      transform::Rigid2d initial_ceres_pose = pose_prediction;
      sensor::AdaptiveVoxelFilter adaptive_voxel_filter(
          options_.adaptive_voxel_filter_options());
      const sensor::PointCloud filtered_gravity_aligned_point_cloud =
          adaptive_voxel_filter.Filter(gravity_aligned_range_data.returns);
      if (filtered_gravity_aligned_point_cloud.empty())
      {
        return nullptr;
      }
      if (options_.use_online_correlative_scan_matching())
      {
        real_time_correlative_scan_matcher_.Match(
            pose_prediction, filtered_gravity_aligned_point_cloud,
            matching_submap->probability_grid(), &initial_ceres_pose);
      }

      auto pose_observation = common::make_unique<transform::Rigid2d>();
      ceres::Solver::Summary summary;
      ceres_scan_matcher_.Match(pose_prediction.translation(), initial_ceres_pose,
                                filtered_gravity_aligned_point_cloud,
                                matching_submap->probability_grid(),
                                pose_observation.get(), &summary);
      return pose_observation;
    }
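
    Note the two-stage design: when use_online_correlative_scan_matching is enabled, the real-time correlative matcher does a coarse search around pose_prediction and writes its result into initial_ceres_pose; the Ceres scan matcher then refines that estimate, with pose_prediction.translation() serving as the prior for its translation term. The returned pose_observation is what LocalTrajectoryBuilder ultimately feeds back into the pose extrapolator via AddPose, closing the loop with section 1.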
Original post: https://www.cnblogs.com/yhlx125/p/8512764.html