Repository: YukunXia/VLOAM-CMU-16833
Branch: master
Commit: 59f37f888356
Files: 72
Total size: 564.9 KB
Directory structure:
gitextract_grozf6ne/
├── .gitignore
├── LICENSE
├── README.md
└── src/
├── lidar_odometry_mapping/
│ ├── CMakeLists.txt
│ ├── LICENSE
│ ├── include/
│ │ └── lidar_odometry_mapping/
│ │ ├── common.h
│ │ ├── laser_mapping.h
│ │ ├── laser_odometry.h
│ │ ├── lidarFactor.hpp
│ │ ├── lidar_odometry_mapping.h
│ │ ├── scan_registration.h
│ │ └── tic_toc.h
│ ├── launch/
│ │ ├── loam_velodyne_HDL_32.launch
│ │ ├── loam_velodyne_HDL_64.launch
│ │ ├── loam_velodyne_HDL_64_kitti.launch
│ │ └── loam_velodyne_VLP_16.launch
│ ├── package.xml
│ ├── rviz_cfg/
│ │ └── lidar_odometry_mapping.rviz
│ └── src/
│ ├── laser_mapping.cpp
│ ├── laser_odometry.cpp
│ ├── lidar_odometry_mapping.cpp
│ └── scan_registration.cpp
├── visual_odometry/
│ ├── CMakeLists.txt
│ ├── README.md
│ ├── include/
│ │ └── visual_odometry/
│ │ ├── ceres_cost_function.h
│ │ ├── image_util.h
│ │ ├── point_cloud_util.h
│ │ └── visual_odometry.h
│ ├── package.xml
│ └── src/
│ ├── image_util.cpp
│ ├── point_cloud_util.cpp
│ └── visual_odometry.cpp
├── vloam_main/
│ ├── CMakeLists.txt
│ ├── README.md
│ ├── action/
│ │ └── vloam_main.action
│ ├── launch/
│ │ └── vloam_main.launch
│ ├── package.xml
│ ├── results/
│ │ ├── 2011_09_26_drive_0001/
│ │ │ ├── LO0.txt
│ │ │ ├── MO0.txt
│ │ │ └── VO0.txt
│ │ ├── 2011_09_26_drive_0002/
│ │ │ ├── LO0.txt
│ │ │ ├── MO0.txt
│ │ │ └── VO0.txt
│ │ ├── 2011_09_26_drive_0005/
│ │ │ ├── LO0.txt
│ │ │ ├── LO1.txt
│ │ │ ├── MO0.txt
│ │ │ ├── MO1.txt
│ │ │ ├── VO0.txt
│ │ │ └── VO1.txt
│ │ ├── 2011_09_30_drive_0016/
│ │ │ ├── LO0.txt
│ │ │ ├── MO0.txt
│ │ │ └── VO0.txt
│ │ ├── 2011_09_30_drive_0027/
│ │ │ ├── LO1.txt
│ │ │ ├── MO1.txt
│ │ │ └── VO1.txt
│ │ ├── 2011_10_03_drive_0027/
│ │ │ ├── LO0.txt
│ │ │ ├── MO0.txt
│ │ │ └── VO0.txt
│ │ └── 2011_10_03_drive_0042/
│ │ ├── LO0.txt
│ │ ├── LO1.txt
│ │ ├── MO0.txt
│ │ ├── MO1.txt
│ │ ├── VO0.txt
│ │ └── VO1.txt
│ ├── rviz/
│ │ ├── vloam.rviz
│ │ └── vloam_combined.rviz
│ ├── scripts/
│ │ └── plotTrajectory.py
│ └── src/
│ └── vloam_main_node.cpp
└── vloam_tf/
├── CMakeLists.txt
├── include/
│ └── vloam_tf/
│ └── vloam_tf.h
├── package.xml
└── src/
└── vloam_tf.cpp
================================================
FILE CONTENTS
================================================
================================================
FILE: .gitignore
================================================
devel/
logs/
build/
bin/
lib/
msg_gen/
srv_gen/
msg/*Action.msg
msg/*ActionFeedback.msg
msg/*ActionGoal.msg
msg/*ActionResult.msg
msg/*Feedback.msg
msg/*Goal.msg
msg/*Result.msg
msg/_*.py
build_isolated/
devel_isolated/
.catkin_workspace
# Generated by dynamic reconfigure
*.cfgc
/cfg/cpp/
/cfg/*.py
# Ignore generated docs
*.dox
*.wikidoc
# eclipse stuff
.project
.cproject
# QtCreator stuff
CMakeLists.txt.user
srv/_*.py
*.pcd
*.pyc
qtcreator-*
*.user
/planning/cfg
/planning/docs
/planning/src
*~
# Emacs
.#*
# Catkin custom files
CATKIN_IGNORE
# VS Code
.vscode/*
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json
*.code-workspace
# Local History for Visual Studio Code
.history/
# Commitizen
node_modules/
package-lock.json
package.json
# Data
data/
src/vloam_main/bags
KITTI_odometry_evaluation_tool
================================================
FILE: LICENSE
================================================
MIT License
Copyright (c) 2021 Yukun Xia
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
================================================
FILE: README.md
================================================
# Introduction
This repository is a reimplementation of the VLOAM algorithm [1]. The LOAM/lidar-odometry part is adapted and refactored from A-LOAM [2], and the visual-odometry part is written according to the DEMO paper [3].
The following figure [1] illustrates the pipeline of the VLOAM algorithm.

# Results

Video: https://youtu.be/NnoxB3r_cDM

# Detailed Usage
Check README.md under `src/vloam_main`
## Prerequisites
- OpenCV 4.5.1
- Eigen3 3.3
- Ceres 2.0
- PCL 1.2
## Evaluation tool

https://github.com/LeoQLi/KITTI_odometry_evaluation_tool
## Data format
Place bag files under `src/vloam_main/bags/`.
Note: the current dataloader only supports the "synced" type of KITTI dataset.
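For example, the expected layout can be prepared like this (the bag filename below is purely illustrative, not shipped with the repository):

```shell
# create the directory the dataloader reads from
mkdir -p src/vloam_main/bags
# then place a KITTI raw "synced" bag there, e.g.:
# cp ~/Downloads/kitti_2011_09_26_drive_0001_synced.bag src/vloam_main/bags/
```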
# Reference:
[1] J. Zhang and S. Singh. Laser-visual-inertial Odometry and
Mapping with High Robustness and Low Drift. Journal of
Field Robotics. vol. 35, no. 8, pp. 1242–1264, 2018.
[2] T. Qin and S. Cao. A-LOAM. https://github.com/HKUST-Aerial-Robotics/A-LOAM
[3] Zhang, Ji, Michael Kaess, and Sanjiv Singh. "Real-time depth enhanced monocular odometry." 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2014.
================================================
FILE: src/lidar_odometry_mapping/CMakeLists.txt
================================================
cmake_minimum_required(VERSION 2.8.3)
project(lidar_odometry_mapping)
set(CMAKE_BUILD_TYPE "Release")
set(CMAKE_CXX_FLAGS "-std=c++14")
set(CMAKE_CXX_FLAGS_RELEASE "-O3 -Wall -g")
find_package(catkin REQUIRED COMPONENTS
geometry_msgs
nav_msgs
sensor_msgs
roscpp
rospy
rosbag
std_msgs
image_transport
cv_bridge
tf
vloam_tf
)
#find_package(Eigen3 REQUIRED)
find_package(PCL REQUIRED)
find_package(OpenCV 4.5.1 REQUIRED)
find_package(Ceres REQUIRED)
include_directories(
include
${catkin_INCLUDE_DIRS}
${PCL_INCLUDE_DIRS}
${CERES_INCLUDE_DIRS}
${OpenCV_INCLUDE_DIRS})
catkin_package(
CATKIN_DEPENDS geometry_msgs nav_msgs roscpp rospy std_msgs
DEPENDS EIGEN3 PCL
INCLUDE_DIRS include
)
add_library(scan_registration SHARED
src/scan_registration.cpp
)
target_link_libraries(scan_registration ${catkin_LIBRARIES} ${PCL_LIBRARIES} vloam_tf)
set_target_properties(scan_registration PROPERTIES PUBLIC_HEADER include/${PROJECT_NAME}/scan_registration.h)
add_library(laser_odometry SHARED
src/laser_odometry.cpp
)
target_link_libraries(laser_odometry ${catkin_LIBRARIES} ${PCL_LIBRARIES} ${CERES_LIBRARIES} vloam_tf)
set_target_properties(laser_odometry PROPERTIES PUBLIC_HEADER include/${PROJECT_NAME}/laser_odometry.h)
add_library(laser_mapping SHARED
src/laser_mapping.cpp
)
target_link_libraries(laser_mapping ${catkin_LIBRARIES} ${PCL_LIBRARIES} ${CERES_LIBRARIES} vloam_tf)
set_target_properties(laser_mapping PROPERTIES PUBLIC_HEADER include/${PROJECT_NAME}/laser_mapping.h)
add_library(lidar_odometry_mapping SHARED
src/lidar_odometry_mapping.cpp
)
target_link_libraries(lidar_odometry_mapping ${catkin_LIBRARIES} ${PCL_LIBRARIES} ${CERES_LIBRARIES} scan_registration laser_odometry laser_mapping vloam_tf)
set_target_properties(lidar_odometry_mapping PROPERTIES PUBLIC_HEADER include/${PROJECT_NAME}/lidar_odometry_mapping.h)
install(
DIRECTORY include/${PROJECT_NAME}/
DESTINATION ${CATKIN_PACKAGE_INCLUDE_DESTINATION}
FILES_MATCHING
PATTERN "*.hpp"
PATTERN "*.h"
# PATTERN ".svn" EXCLUDE
)
================================================
FILE: src/lidar_odometry_mapping/LICENSE
================================================
This is an advanced implementation of the algorithm described in the following paper:
J. Zhang and S. Singh. LOAM: Lidar Odometry and Mapping in Real-time.
Robotics: Science and Systems Conference (RSS). Berkeley, CA, July 2014.
Modifier: Tong Qin qintonguav@gmail.com
Shaozu Cao saozu.cao@connect.ust.hk
Copyright 2013, Ji Zhang, Carnegie Mellon University
Further contributions copyright (c) 2016, Southwest Research Institute
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
================================================
FILE: src/lidar_odometry_mapping/include/lidar_odometry_mapping/common.h
================================================
// This is an advanced implementation of the algorithm described in the following paper:
// J. Zhang and S. Singh. LOAM: Lidar Odometry and Mapping in Real-time.
// Robotics: Science and Systems Conference (RSS). Berkeley, CA, July 2014.
// Modifier: Tong Qin qintonguav@gmail.com
// Shaozu Cao saozu.cao@connect.ust.hk
// Copyright 2013, Ji Zhang, Carnegie Mellon University
// Further contributions copyright (c) 2016, Southwest Research Institute
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
//
// 1. Redistributions of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
// 2. Redistributions in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
// 3. Neither the name of the copyright holder nor the names of its
// contributors may be used to endorse or promote products derived from this
// software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
// ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
// LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
// POSSIBILITY OF SUCH DAMAGE.
#pragma once
#include <pcl/point_types.h>
#include <cmath>
typedef pcl::PointXYZI PointType;
inline double rad2deg(double radians)
{
return radians * 180.0 / M_PI;
}
inline double deg2rad(double degrees)
{
return degrees * M_PI / 180.0;
}
================================================
FILE: src/lidar_odometry_mapping/include/lidar_odometry_mapping/laser_mapping.h
================================================
// This is an advanced implementation of the algorithm described in the following paper:
// J. Zhang and S. Singh. LOAM: Lidar Odometry and Mapping in Real-time.
// Robotics: Science and Systems Conference (RSS). Berkeley, CA, July 2014.
// Modifier: Tong Qin qintonguav@gmail.com
// Shaozu Cao saozu.cao@connect.ust.hk
// Copyright 2013, Ji Zhang, Carnegie Mellon University
// Further contributions copyright (c) 2016, Southwest Research Institute
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
//
// 1. Redistributions of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
// 2. Redistributions in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
// 3. Neither the name of the copyright holder nor the names of its
// contributors may be used to endorse or promote products derived from this
// software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
// ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
// LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
// POSSIBILITY OF SUCH DAMAGE.
#pragma once
#include <ceres/ceres.h>
#include <geometry_msgs/PoseStamped.h>
#include <lidar_odometry_mapping/common.h>
#include <math.h>
#include <nav_msgs/Odometry.h>
#include <nav_msgs/Path.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/kdtree/kdtree_flann.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl_conversions/pcl_conversions.h>
#include <ros/ros.h>
#include <sensor_msgs/Imu.h>
#include <sensor_msgs/PointCloud2.h>
#include <tf/transform_broadcaster.h>
#include <tf/transform_datatypes.h>
#include <eigen3/Eigen/Dense>
#include <iostream>
#include <mutex>
#include <queue>
#include <string>
#include <thread>
#include <vector>
// #include "lidarFactor.hpp"
#include <vloam_tf/vloam_tf.h>
#include "lidar_odometry_mapping/common.h"
#include "lidar_odometry_mapping/lidarFactor.hpp"
#include "lidar_odometry_mapping/tic_toc.h"
namespace vloam
{
class LaserMapping
{
public:
LaserMapping()
: laserCloudCenWidth(10)
, laserCloudCenHeight(10)
, laserCloudCenDepth(5)
, q_w_curr(parameters)
, t_w_curr(parameters + 4)
, nh("laser_mapping_node")
{
}
void init(std::shared_ptr<VloamTF>& vloam_tf_);
void reset();
void input(const pcl::PointCloud<PointType>::Ptr& laserCloudCornerLast_,
const pcl::PointCloud<PointType>::Ptr& laserCloudSurfLast_,
const pcl::PointCloud<PointType>::Ptr& laserCloudFullRes_, const Eigen::Quaterniond& q_wodom_curr_,
const Eigen::Vector3d& t_wodom_curr_, const bool& skip_frame_);
void publish();
void solveMapping();
void output();
// set initial guess
void transformAssociateToMap();
void transformUpdate();
void pointAssociateToMap(PointType const* const pi, PointType* const po);
void pointAssociateTobeMapped(PointType const* const pi, PointType* const po);
private:
std::shared_ptr<VloamTF> vloam_tf;
int frameCount = 0;
int laserCloudCenWidth;
int laserCloudCenHeight;
int laserCloudCenDepth;
static const int laserCloudWidth = 21;
static const int laserCloudHeight = 21;
static const int laserCloudDepth = 11;
static const int laserCloudNum = laserCloudWidth * laserCloudHeight * laserCloudDepth; // 4851
int laserCloudValidInd[125];
int laserCloudSurroundInd[125];
// input: from odom
pcl::PointCloud<PointType>::Ptr laserCloudCornerLast;
pcl::PointCloud<PointType>::Ptr laserCloudSurfLast;
// output: all visible cube points
pcl::PointCloud<PointType>::Ptr laserCloudSurround;
// surround points in map to build tree
pcl::PointCloud<PointType>::Ptr laserCloudCornerFromMap;
pcl::PointCloud<PointType>::Ptr laserCloudSurfFromMap;
// input & output: points in one frame. local --> global
pcl::PointCloud<PointType>::Ptr laserCloudFullRes;
// points in every cube
pcl::PointCloud<PointType>::Ptr laserCloudCornerArray[laserCloudNum];
pcl::PointCloud<PointType>::Ptr laserCloudSurfArray[laserCloudNum];
// kd-tree
pcl::KdTreeFLANN<PointType>::Ptr kdtreeCornerFromMap;
pcl::KdTreeFLANN<PointType>::Ptr kdtreeSurfFromMap;
double parameters[7];
Eigen::Map<Eigen::Quaterniond> q_w_curr;
Eigen::Map<Eigen::Vector3d> t_w_curr;
Eigen::Quaterniond q_w_curr_highfreq;
Eigen::Vector3d t_w_curr_highfreq;
// wmap_T_odom * odom_T_curr = wmap_T_curr;
// transformation between odom's world and map's world frame
Eigen::Quaterniond q_wmap_wodom;
Eigen::Vector3d t_wmap_wodom;
Eigen::Quaterniond q_wodom_curr;
Eigen::Vector3d t_wodom_curr;
sensor_msgs::PointCloud2ConstPtr cornerLastBuf;
sensor_msgs::PointCloud2ConstPtr surfLastBuf;
sensor_msgs::PointCloud2ConstPtr fullResBuf;
nav_msgs::Odometry::ConstPtr odometryBuf;
pcl::VoxelGrid<PointType> downSizeFilterCorner;
pcl::VoxelGrid<PointType> downSizeFilterSurf;
std::vector<int> pointSearchInd;
std::vector<float> pointSearchSqDis;
PointType pointOri, pointSel;
ros::Publisher pubLaserCloudSurround, pubLaserCloudMap, pubLaserCloudFullRes, pubOdomAftMapped,
pubOdomAftMappedHighFrec, pubLaserAfterMappedPath;
nav_msgs::Path laserAfterMappedPath;
ros::NodeHandle nh;
int verbose_level;
float lineRes;
float planeRes;
int laserCloudValidNum;
int laserCloudSurroundNum;
TicToc t_whole;
bool skip_frame;
int mapping_skip_frame;
int map_pub_number;
};
} // namespace vloam
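The comment `wmap_T_odom * odom_T_curr = wmap_T_curr` above summarizes what `transformAssociateToMap` computes: the mapping correction between the odometry world frame and the map world frame is composed with the latest odometry pose. A minimal standalone sketch of that composition, using a plain w-first quaternion type instead of `Eigen::Quaterniond` (an assumption for self-containment; this helper is not part of the repository):

```cpp
#include <array>
#include <cmath>

struct Quat { double w, x, y, z; };   // unit quaternion, Hamilton convention
using Vec3 = std::array<double, 3>;

// Hamilton product a * b
inline Quat qmul(const Quat& a, const Quat& b) {
  return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
           a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
           a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
           a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// rotate v by q: v' = v + 2w(u x v) + 2 u x (u x v), with u = (x, y, z)
inline Vec3 qrot(const Quat& q, const Vec3& v) {
  Vec3 t = { 2.0*(q.y*v[2] - q.z*v[1]),
             2.0*(q.z*v[0] - q.x*v[2]),
             2.0*(q.x*v[1] - q.y*v[0]) };
  return { v[0] + q.w*t[0] + (q.y*t[2] - q.z*t[1]),
           v[1] + q.w*t[1] + (q.z*t[0] - q.x*t[2]),
           v[2] + q.w*t[2] + (q.x*t[1] - q.y*t[0]) };
}

// wmap_T_odom * wodom_T_curr -> wmap_T_curr, split into rotation and translation
inline Quat composeRot(const Quat& q_wmap_wodom, const Quat& q_wodom_curr) {
  return qmul(q_wmap_wodom, q_wodom_curr);
}
inline Vec3 composeTrans(const Quat& q_wmap_wodom, const Vec3& t_wmap_wodom,
                         const Vec3& t_wodom_curr) {
  Vec3 r = qrot(q_wmap_wodom, t_wodom_curr);
  return { r[0] + t_wmap_wodom[0], r[1] + t_wmap_wodom[1], r[2] + t_wmap_wodom[2] };
}
```

The inverse update, `transformUpdate`, does the opposite bookkeeping: after the mapping solve it recomputes `wmap_T_odom` from the optimized `wmap_T_curr` and the odometry input.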
================================================
FILE: src/lidar_odometry_mapping/include/lidar_odometry_mapping/laser_odometry.h
================================================
// This is an advanced implementation of the algorithm described in the following paper:
// J. Zhang and S. Singh. LOAM: Lidar Odometry and Mapping in Real-time.
// Robotics: Science and Systems Conference (RSS). Berkeley, CA, July 2014.
// Modifier: Tong Qin qintonguav@gmail.com
// Shaozu Cao saozu.cao@connect.ust.hk
// Copyright 2013, Ji Zhang, Carnegie Mellon University
// Further contributions copyright (c) 2016, Southwest Research Institute
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
//
// 1. Redistributions of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
// 2. Redistributions in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
// 3. Neither the name of the copyright holder nor the names of its
// contributors may be used to endorse or promote products derived from this
// software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
// ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
// LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
// POSSIBILITY OF SUCH DAMAGE.
#pragma once
#include <geometry_msgs/PoseStamped.h>
#include <lidar_odometry_mapping/common.h>
#include <lidar_odometry_mapping/tic_toc.h>
#include <nav_msgs/Odometry.h>
#include <nav_msgs/Path.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/kdtree/kdtree_flann.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl_conversions/pcl_conversions.h>
#include <ros/ros.h>
#include <sensor_msgs/Imu.h>
#include <sensor_msgs/PointCloud2.h>
#include <tf/transform_broadcaster.h>
#include <tf/transform_datatypes.h>
#include <vloam_tf/vloam_tf.h>
#include <cmath>
#include <eigen3/Eigen/Dense>
#include <lidar_odometry_mapping/lidarFactor.hpp>
#include <mutex>
#include <queue>
namespace vloam
{
class LaserOdometry
{
public:
LaserOdometry() : q_last_curr(para_q), t_last_curr(para_t), nh("laser_odometry_node")
{
}
void init(std::shared_ptr<VloamTF>& vloam_tf_);
// void reset ();
void input(const pcl::PointCloud<PointType>::Ptr& laserCloud_,
const pcl::PointCloud<PointType>::Ptr& cornerPointsSharp_,
const pcl::PointCloud<PointType>::Ptr& cornerPointsLessSharp_,
const pcl::PointCloud<PointType>::Ptr& surfPointsFlat_,
const pcl::PointCloud<PointType>::Ptr& surfPointsLessFlat_);
void solveLO();
void publish();
void output(Eigen::Quaterniond& q_w_curr_, Eigen::Vector3d& t_w_curr_,
pcl::PointCloud<PointType>::Ptr& laserCloudCornerLast_,
pcl::PointCloud<PointType>::Ptr& laserCloudSurfLast_, pcl::PointCloud<PointType>::Ptr& laserCloudFullRes_,
bool& skip_frame);
void TransformToStart(PointType const* const pi, PointType* const po);
void TransformToEnd(PointType const* const pi, PointType* const po);
private:
const bool DISTORTION = false;
int corner_correspondence, plane_correspondence;
const double SCAN_PERIOD = 0.1;
const double DISTANCE_SQ_THRESHOLD = 25;
const double NEARBY_SCAN = 2.5;
int mapping_skip_frame;
bool systemInited;
double timeCornerPointsSharp;
double timeCornerPointsLessSharp;
double timeSurfPointsFlat;
double timeSurfPointsLessFlat;
double timeLaserCloudFullRes;
pcl::KdTreeFLANN<pcl::PointXYZI>::Ptr kdtreeCornerLast;
pcl::KdTreeFLANN<pcl::PointXYZI>::Ptr kdtreeSurfLast;
pcl::PointCloud<PointType>::Ptr cornerPointsSharp;
pcl::PointCloud<PointType>::Ptr cornerPointsLessSharp;
pcl::PointCloud<PointType>::Ptr surfPointsFlat;
pcl::PointCloud<PointType>::Ptr surfPointsLessFlat;
pcl::PointCloud<PointType>::Ptr laserCloudCornerLast;
pcl::PointCloud<PointType>::Ptr laserCloudSurfLast;
pcl::PointCloud<PointType>::Ptr laserCloudFullRes;
int laserCloudCornerLastNum;
int laserCloudSurfLastNum;
// Transformation from current frame to world frame
Eigen::Quaterniond q_w_curr;
Eigen::Vector3d t_w_curr;
// q_curr_last(x, y, z, w), t_curr_last
double para_q[4];
double para_t[3];
Eigen::Map<Eigen::Quaterniond> q_last_curr;
Eigen::Map<Eigen::Vector3d> t_last_curr;
std::shared_ptr<VloamTF> vloam_tf;
ros::NodeHandle nh;
int verbose_level;
bool detach_VO_LO;
ros::Publisher pubLaserCloudCornerLast;
ros::Publisher pubLaserCloudSurfLast;
ros::Publisher pubLaserCloudFullRes;
ros::Publisher pubLaserOdometry;
ros::Publisher pubLaserPath;
nav_msgs::Path laserPath;
int frameCount;
};
} // namespace vloam
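`TransformToStart` de-skews a point captured at fraction `s` of the sweep (derived from the point's relative timestamp and `SCAN_PERIOD`) by interpolating the frame-to-frame motion: the translation is scaled by `s`, and the rotation is slerped between identity and `q_last_curr`. A plain-double sketch of that slerp-from-identity, with `Eigen::Quaterniond::slerp` replaced by a hand-rolled version (an assumption for self-containment, not repository code):

```cpp
#include <cmath>

struct Quat { double w, x, y, z; };   // unit quaternion, w first

// slerp(identity, q, s): the rotation a point at sweep fraction s has undergone
inline Quat slerpFromIdentity(const Quat& q, double s) {
  double cosTheta = q.w;                 // dot(identity, q)
  Quat q1 = q;
  if (cosTheta < 0.0) {                 // take the shorter arc
    cosTheta = -cosTheta;
    q1 = { -q.w, -q.x, -q.y, -q.z };
  }
  if (cosTheta > 0.9995) {              // nearly identity: lerp, then renormalize
    Quat r{ (1.0 - s) + s * q1.w, s * q1.x, s * q1.y, s * q1.z };
    double n = std::sqrt(r.w*r.w + r.x*r.x + r.y*r.y + r.z*r.z);
    return { r.w/n, r.x/n, r.y/n, r.z/n };
  }
  double theta = std::acos(cosTheta);
  double a = std::sin((1.0 - s) * theta) / std::sin(theta);  // weight on identity
  double b = std::sin(s * theta) / std::sin(theta);          // weight on q1
  return { a + b * q1.w, b * q1.x, b * q1.y, b * q1.z };
}
```

With `DISTORTION` set to `false` (as in this header, since KITTI clouds are already motion-compensated), `s` is fixed to 1 and the interpolation reduces to applying the full `q_last_curr`, `t_last_curr`.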
================================================
FILE: src/lidar_odometry_mapping/include/lidar_odometry_mapping/lidarFactor.hpp
================================================
// Author: Tong Qin qintonguav@gmail.com
// Shaozu Cao saozu.cao@connect.ust.hk
#pragma once
#include <ceres/ceres.h>
#include <ceres/rotation.h>
#include <pcl/kdtree/kdtree_flann.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl_conversions/pcl_conversions.h>
#include <eigen3/Eigen/Dense>
struct LidarEdgeFactor
{
LidarEdgeFactor(Eigen::Vector3d curr_point_, Eigen::Vector3d last_point_a_, Eigen::Vector3d last_point_b_, double s_)
: curr_point(curr_point_), last_point_a(last_point_a_), last_point_b(last_point_b_), s(s_)
{
}
template <typename T>
bool operator()(const T *q, const T *t, T *residual) const
{
Eigen::Matrix<T, 3, 1> cp{ T(curr_point.x()), T(curr_point.y()), T(curr_point.z()) };
Eigen::Matrix<T, 3, 1> lpa{ T(last_point_a.x()), T(last_point_a.y()), T(last_point_a.z()) };
Eigen::Matrix<T, 3, 1> lpb{ T(last_point_b.x()), T(last_point_b.y()), T(last_point_b.z()) };
// Eigen::Quaternion<T> q_last_curr{q[3], T(s) * q[0], T(s) * q[1], T(s) * q[2]};
Eigen::Quaternion<T> q_last_curr{ q[3], q[0], q[1], q[2] };
Eigen::Quaternion<T> q_identity{ T(1), T(0), T(0), T(0) };
q_last_curr = q_identity.slerp(T(s), q_last_curr);
Eigen::Matrix<T, 3, 1> t_last_curr{ T(s) * t[0], T(s) * t[1], T(s) * t[2] };
Eigen::Matrix<T, 3, 1> lp;
lp = q_last_curr * cp + t_last_curr;
Eigen::Matrix<T, 3, 1> nu = (lp - lpa).cross(lp - lpb);
Eigen::Matrix<T, 3, 1> de = lpa - lpb;
residual[0] = nu.x() / de.norm();
residual[1] = nu.y() / de.norm();
residual[2] = nu.z() / de.norm();
return true;
}
static ceres::CostFunction *Create(const Eigen::Vector3d curr_point_, const Eigen::Vector3d last_point_a_,
const Eigen::Vector3d last_point_b_, const double s_)
{
return (new ceres::AutoDiffCostFunction<LidarEdgeFactor, 3, 4, 3>(
new LidarEdgeFactor(curr_point_, last_point_a_, last_point_b_, s_)));
}
Eigen::Vector3d curr_point, last_point_a, last_point_b;
double s;
};
struct LidarPlaneFactor
{
LidarPlaneFactor(Eigen::Vector3d curr_point_, Eigen::Vector3d last_point_j_, Eigen::Vector3d last_point_l_,
Eigen::Vector3d last_point_m_, double s_)
: curr_point(curr_point_)
, last_point_j(last_point_j_)
, last_point_l(last_point_l_)
, last_point_m(last_point_m_)
, s(s_)
{
ljm_norm = (last_point_j - last_point_l).cross(last_point_j - last_point_m);
ljm_norm.normalize();
}
template <typename T>
bool operator()(const T *q, const T *t, T *residual) const
{
Eigen::Matrix<T, 3, 1> cp{ T(curr_point.x()), T(curr_point.y()), T(curr_point.z()) };
Eigen::Matrix<T, 3, 1> lpj{ T(last_point_j.x()), T(last_point_j.y()), T(last_point_j.z()) };
// Eigen::Matrix<T, 3, 1> lpl{T(last_point_l.x()), T(last_point_l.y()), T(last_point_l.z())};
// Eigen::Matrix<T, 3, 1> lpm{T(last_point_m.x()), T(last_point_m.y()), T(last_point_m.z())};
Eigen::Matrix<T, 3, 1> ljm{ T(ljm_norm.x()), T(ljm_norm.y()), T(ljm_norm.z()) };
// Eigen::Quaternion<T> q_last_curr{q[3], T(s) * q[0], T(s) * q[1], T(s) * q[2]};
Eigen::Quaternion<T> q_last_curr{ q[3], q[0], q[1], q[2] };
Eigen::Quaternion<T> q_identity{ T(1), T(0), T(0), T(0) };
q_last_curr = q_identity.slerp(T(s), q_last_curr);
Eigen::Matrix<T, 3, 1> t_last_curr{ T(s) * t[0], T(s) * t[1], T(s) * t[2] };
Eigen::Matrix<T, 3, 1> lp;
lp = q_last_curr * cp + t_last_curr;
residual[0] = (lp - lpj).dot(ljm);
return true;
}
static ceres::CostFunction *Create(const Eigen::Vector3d curr_point_, const Eigen::Vector3d last_point_j_,
const Eigen::Vector3d last_point_l_, const Eigen::Vector3d last_point_m_,
const double s_)
{
return (new ceres::AutoDiffCostFunction<LidarPlaneFactor, 1, 4, 3>(
new LidarPlaneFactor(curr_point_, last_point_j_, last_point_l_, last_point_m_, s_)));
}
Eigen::Vector3d curr_point, last_point_j, last_point_l, last_point_m;
Eigen::Vector3d ljm_norm;
double s;
};
struct LidarPlaneNormFactor
{
LidarPlaneNormFactor(Eigen::Vector3d curr_point_, Eigen::Vector3d plane_unit_norm_, double negative_OA_dot_norm_)
: curr_point(curr_point_), plane_unit_norm(plane_unit_norm_), negative_OA_dot_norm(negative_OA_dot_norm_)
{
}
template <typename T>
bool operator()(const T *q, const T *t, T *residual) const
{
Eigen::Quaternion<T> q_w_curr{ q[3], q[0], q[1], q[2] };
Eigen::Matrix<T, 3, 1> t_w_curr{ t[0], t[1], t[2] };
Eigen::Matrix<T, 3, 1> cp{ T(curr_point.x()), T(curr_point.y()), T(curr_point.z()) };
Eigen::Matrix<T, 3, 1> point_w;
point_w = q_w_curr * cp + t_w_curr;
Eigen::Matrix<T, 3, 1> norm(T(plane_unit_norm.x()), T(plane_unit_norm.y()), T(plane_unit_norm.z()));
residual[0] = norm.dot(point_w) + T(negative_OA_dot_norm);
return true;
}
static ceres::CostFunction *Create(const Eigen::Vector3d curr_point_, const Eigen::Vector3d plane_unit_norm_,
const double negative_OA_dot_norm_)
{
return (new ceres::AutoDiffCostFunction<LidarPlaneNormFactor, 1, 4, 3>(
new LidarPlaneNormFactor(curr_point_, plane_unit_norm_, negative_OA_dot_norm_)));
}
Eigen::Vector3d curr_point;
Eigen::Vector3d plane_unit_norm;
double negative_OA_dot_norm;
};
struct LidarDistanceFactor
{
LidarDistanceFactor(Eigen::Vector3d curr_point_, Eigen::Vector3d closed_point_)
: curr_point(curr_point_), closed_point(closed_point_)
{
}
template <typename T>
bool operator()(const T *q, const T *t, T *residual) const
{
Eigen::Quaternion<T> q_w_curr{ q[3], q[0], q[1], q[2] };
Eigen::Matrix<T, 3, 1> t_w_curr{ t[0], t[1], t[2] };
Eigen::Matrix<T, 3, 1> cp{ T(curr_point.x()), T(curr_point.y()), T(curr_point.z()) };
Eigen::Matrix<T, 3, 1> point_w;
point_w = q_w_curr * cp + t_w_curr;
residual[0] = point_w.x() - T(closed_point.x());
residual[1] = point_w.y() - T(closed_point.y());
residual[2] = point_w.z() - T(closed_point.z());
return true;
}
static ceres::CostFunction *Create(const Eigen::Vector3d curr_point_, const Eigen::Vector3d closed_point_)
{
return (new ceres::AutoDiffCostFunction<LidarDistanceFactor, 3, 4, 3>(
new LidarDistanceFactor(curr_point_, closed_point_)));
}
Eigen::Vector3d curr_point;
Eigen::Vector3d closed_point;
};
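The 3-vector residual in `LidarEdgeFactor` is the cross-product construction for point-to-line distance: its norm equals the distance from the transformed point `lp` to the line through `lpa` and `lpb`. A standalone check of that geometry with plain doubles in place of Ceres Jets (a hypothetical helper, not part of the repository):

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

inline Vec3 sub(const Vec3& a, const Vec3& b) {
  return { a[0]-b[0], a[1]-b[1], a[2]-b[2] };
}
inline Vec3 cross(const Vec3& a, const Vec3& b) {
  return { a[1]*b[2] - a[2]*b[1],
           a[2]*b[0] - a[0]*b[2],
           a[0]*b[1] - a[1]*b[0] };
}
inline double norm(const Vec3& a) {
  return std::sqrt(a[0]*a[0] + a[1]*a[1] + a[2]*a[2]);
}

// |(lp - lpa) x (lp - lpb)| / |lpa - lpb|
// = the norm of LidarEdgeFactor's residual = distance from lp to line (lpa, lpb)
inline double pointToLineDistance(const Vec3& lp, const Vec3& lpa, const Vec3& lpb) {
  return norm(cross(sub(lp, lpa), sub(lp, lpb))) / norm(sub(lpa, lpb));
}
```

`LidarPlaneFactor` is the analogous scalar case: it projects `lp - lpj` onto the precomputed unit normal of the plane through the three map points, so its single residual is a signed point-to-plane distance.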
================================================
FILE: src/lidar_odometry_mapping/include/lidar_odometry_mapping/lidar_odometry_mapping.h
================================================
// This is an advanced implementation of the algorithm described in the following paper:
// J. Zhang and S. Singh. LOAM: Lidar Odometry and Mapping in Real-time.
// Robotics: Science and Systems Conference (RSS). Berkeley, CA, July 2014.
// Modifier: Tong Qin qintonguav@gmail.com
// Shaozu Cao saozu.cao@connect.ust.hk
// Copyright 2013, Ji Zhang, Carnegie Mellon University
// Further contributions copyright (c) 2016, Southwest Research Institute
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
//
// 1. Redistributions of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
// 2. Redistributions in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
// 3. Neither the name of the copyright holder nor the names of its
// contributors may be used to endorse or promote products derived from this
// software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
// ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
// LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
// POSSIBILITY OF SUCH DAMAGE.
#pragma once
#include <lidar_odometry_mapping/laser_mapping.h>
#include <lidar_odometry_mapping/laser_odometry.h>
#include <lidar_odometry_mapping/scan_registration.h>
#include <vloam_tf/vloam_tf.h>
namespace vloam
{
class LidarOdometryMapping
{
public:
LidarOdometryMapping() : nh("lidar_odometry_mapping_node")
{
}
void init(std::shared_ptr<VloamTF>& vloam_tf_);
void reset();
void scanRegistrationIO(const pcl::PointCloud<pcl::PointXYZ>& laserCloudIn);
void laserOdometryIO();
// void laserOdometryIO(const tf2::Transform& cam0_curr_T_cam0_last);
void laserMappingIO();
private:
std::shared_ptr<VloamTF> vloam_tf;
ros::NodeHandle nh;
int verbose_level;
TicToc loam_timer;
double frame_time;
ScanRegistration scan_registration;
pcl::PointCloud<PointType>::Ptr laserCloud;
pcl::PointCloud<PointType>::Ptr cornerPointsSharp;
pcl::PointCloud<PointType>::Ptr cornerPointsLessSharp;
pcl::PointCloud<PointType>::Ptr surfPointsFlat;
pcl::PointCloud<PointType>::Ptr surfPointsLessFlat;
LaserOdometry laser_odometry;
Eigen::Quaterniond q_wodom_curr, q_w_curr;
Eigen::Vector3d t_wodom_curr, t_w_curr;
pcl::PointCloud<PointType>::Ptr laserCloudCornerLast;
pcl::PointCloud<PointType>::Ptr laserCloudSurfLast;
pcl::PointCloud<PointType>::Ptr laserCloudFullRes;
bool skip_frame;
LaserMapping laser_mapping;
};
} // namespace vloam
================================================
FILE: src/lidar_odometry_mapping/include/lidar_odometry_mapping/scan_registration.h
================================================
// This is an advanced implementation of the algorithm described in the following paper:
// J. Zhang and S. Singh. LOAM: Lidar Odometry and Mapping in Real-time.
// Robotics: Science and Systems Conference (RSS). Berkeley, CA, July 2014.
// Modifier: Tong Qin qintonguav@gmail.com
// Shaozu Cao saozu.cao@connect.ust.hk
// Copyright 2013, Ji Zhang, Carnegie Mellon University
// Further contributions copyright (c) 2016, Southwest Research Institute
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
//
// 1. Redistributions of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
// 2. Redistributions in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
// 3. Neither the name of the copyright holder nor the names of its
// contributors may be used to endorse or promote products derived from this
// software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
// ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
// LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
// POSSIBILITY OF SUCH DAMAGE.
#pragma once
#include <nav_msgs/Odometry.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/kdtree/kdtree_flann.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl_conversions/pcl_conversions.h>
#include <ros/ros.h>
#include <sensor_msgs/Imu.h>
#include <sensor_msgs/PointCloud2.h>
#include <tf/transform_broadcaster.h>
#include <tf/transform_datatypes.h>
#include <cmath>
#include <opencv4/opencv2/opencv.hpp>
#include <string>
#include <vector>
#include "lidar_odometry_mapping/common.h"
#include "lidar_odometry_mapping/tic_toc.h"
using std::atan2;
using std::cos;
using std::sin;
namespace vloam
{
class ScanRegistration
{
public:
ScanRegistration() : nh("scan_registration_node")
{
}
void init();
void reset();
void input(const pcl::PointCloud<pcl::PointXYZ>& laserCloudIn_);
void publish();
void output(pcl::PointCloud<PointType>::Ptr& laserCloud_, pcl::PointCloud<PointType>::Ptr& cornerPointsSharp_,
pcl::PointCloud<PointType>::Ptr& cornerPointsLessSharp_, pcl::PointCloud<PointType>::Ptr& surfPointsFlat_,
pcl::PointCloud<PointType>::Ptr& surfPointsLessFlat_);
// bool comp (int i,int j) { return (cloudCurvature[i]<cloudCurvature[j]); }
template <typename PointT>
void removeClosedPointCloud(const pcl::PointCloud<PointT>& cloud_in, pcl::PointCloud<PointT>& cloud_out, float thres);
private:
const double scanPeriod = 0.1;
const int systemDelay = 0;
int systemInitCount;
bool systemInited;
int N_SCANS;
float cloudCurvature[400000];
int cloudSortInd[400000];
int cloudNeighborPicked[400000];
int cloudLabel[400000];
ros::NodeHandle nh;
int verbose_level;
// ros::Subscriber subLaserCloud;
ros::Publisher pubLaserCloud;
ros::Publisher pubCornerPointsSharp;
ros::Publisher pubCornerPointsLessSharp;
ros::Publisher pubSurfPointsFlat;
ros::Publisher pubSurfPointsLessFlat;
// ros::Publisher pubRemovePoints;
std::vector<ros::Publisher> pubEachScan;
bool PUB_EACH_LINE;
double MINIMUM_RANGE = 0.1;
pcl::PointCloud<PointType>::Ptr laserCloud;
pcl::PointCloud<PointType>::Ptr cornerPointsSharp;
pcl::PointCloud<PointType>::Ptr cornerPointsLessSharp;
pcl::PointCloud<PointType>::Ptr surfPointsFlat;
pcl::PointCloud<PointType>::Ptr surfPointsLessFlat;
std::vector<pcl::PointCloud<PointType>> laserCloudScans;
TicToc t_whole;
TicToc t_prepare;
};
} // namespace vloam
================================================
FILE: src/lidar_odometry_mapping/include/lidar_odometry_mapping/tic_toc.h
================================================
// Author: Tong Qin qintonguav@gmail.com
// Shaozu Cao saozu.cao@connect.ust.hk
#pragma once
#include <chrono>
#include <cstdlib>
#include <ctime>
class TicToc
{
public:
TicToc()
{
tic();
}
void tic()
{
start = std::chrono::system_clock::now();
}
double toc()
{
end = std::chrono::system_clock::now();
std::chrono::duration<double> elapsed_seconds = end - start;
return elapsed_seconds.count() * 1000;
}
private:
std::chrono::time_point<std::chrono::system_clock> start, end;
};
================================================
FILE: src/lidar_odometry_mapping/launch/loam_velodyne_HDL_32.launch
================================================
<launch>
<param name="scan_line" type="int" value="32" />
<!-- if 1, run mapping at 10 Hz; if 2, at 5 Hz. 1 is recommended: the frequency is adjusted automatically -->
<param name="mapping_skip_frame" type="int" value="1" />
<!-- remove points that are too close to the sensor -->
<param name="minimum_range" type="double" value="0.3"/>
<param name="mapping_line_resolution" type="double" value="0.2"/>
<param name="mapping_plane_resolution" type="double" value="0.4"/>
<node pkg="lidar_odometry_mapping" type="ascanRegistration" name="ascanRegistration" output="screen" />
<node pkg="lidar_odometry_mapping" type="alaserOdometry" name="alaserOdometry" output="screen" />
<node pkg="lidar_odometry_mapping" type="alaserMapping" name="alaserMapping" output="screen" />
<arg name="rviz" default="true" />
<group if="$(arg rviz)">
<node launch-prefix="nice" pkg="rviz" type="rviz" name="rviz" args="-d $(find lidar_odometry_mapping)/rviz_cfg/lidar_odometry_mapping.rviz" />
</group>
</launch>
================================================
FILE: src/lidar_odometry_mapping/launch/loam_velodyne_HDL_64.launch
================================================
<launch>
<param name="scan_line" type="int" value="64" />
<!-- if 1, run mapping at 10 Hz; if 2, at 5 Hz. 1 is recommended: the frequency is adjusted automatically -->
<param name="mapping_skip_frame" type="int" value="1" />
<!-- remove points that are too close to the sensor -->
<param name="minimum_range" type="double" value="5"/>
<param name="mapping_line_resolution" type="double" value="0.4"/>
<param name="mapping_plane_resolution" type="double" value="0.8"/>
<node pkg="lidar_odometry_mapping" type="ascanRegistration" name="ascanRegistration" output="screen" />
<node pkg="lidar_odometry_mapping" type="alaserOdometry" name="alaserOdometry" output="screen" />
<node pkg="lidar_odometry_mapping" type="alaserMapping" name="alaserMapping" output="screen" />
<arg name="rviz" default="true" />
<group if="$(arg rviz)">
<node launch-prefix="nice" pkg="rviz" type="rviz" name="rviz" args="-d $(find lidar_odometry_mapping)/rviz_cfg/lidar_odometry_mapping.rviz" />
</group>
</launch>
================================================
FILE: src/lidar_odometry_mapping/launch/loam_velodyne_HDL_64_kitti.launch
================================================
<launch>
<!-- <remap from="/velodyne_points" to="/kitti/velo/pointcloud"/> -->
<param name="scan_line" type="int" value="64" />
<param name="mapping_skip_frame" type="int" value="1" />
<param name="map_pub_number" type="int" value="20" />
<param name="loam_verbose_level" type="int" value="1" />
<!-- if 1, run mapping at 10 Hz; if 2, at 5 Hz. 1 is recommended: the frequency is adjusted automatically -->
<!-- <param name="mapping_skip_frame" type="int" value="1" /> -->
<!-- unused by any code -->
<!-- remove points that are too close to the sensor -->
<param name="minimum_range" type="double" value="5"/>
<param name="mapping_line_resolution" type="double" value="0.4"/>
<param name="mapping_plane_resolution" type="double" value="0.8"/>
<arg name="rviz" default="false" />
<group if="$(arg rviz)">
<node launch-prefix="nice" pkg="rviz" type="rviz" name="rviz" args="-d $(find lidar_odometry_mapping)/rviz_cfg/lidar_odometry_mapping.rviz" />
</group>
</launch>
================================================
FILE: src/lidar_odometry_mapping/launch/loam_velodyne_VLP_16.launch
================================================
<launch>
<param name="scan_line" type="int" value="16" />
<!-- if 1, run mapping at 10 Hz; if 2, at 5 Hz. 1 is recommended: the frequency is adjusted automatically -->
<param name="mapping_skip_frame" type="int" value="1" />
<!-- remove points that are too close to the sensor -->
<param name="minimum_range" type="double" value="0.3"/>
<param name="mapping_line_resolution" type="double" value="0.2"/>
<param name="mapping_plane_resolution" type="double" value="0.4"/>
<node pkg="lidar_odometry_mapping" type="ascanRegistration" name="ascanRegistration" output="screen" />
<node pkg="lidar_odometry_mapping" type="alaserOdometry" name="alaserOdometry" output="screen" />
<node pkg="lidar_odometry_mapping" type="alaserMapping" name="alaserMapping" output="screen" />
<arg name="rviz" default="true" />
<group if="$(arg rviz)">
<node launch-prefix="nice" pkg="rviz" type="rviz" name="rviz" args="-d $(find lidar_odometry_mapping)/rviz_cfg/lidar_odometry_mapping.rviz" />
</group>
</launch>
================================================
FILE: src/lidar_odometry_mapping/package.xml
================================================
<?xml version="1.0"?>
<package>
<name>lidar_odometry_mapping</name>
<version>0.0.0</version>
<description>
This is an advanced implementation of LOAM.
LOAM is described in the following paper:
J. Zhang and S. Singh. LOAM: Lidar Odometry and Mapping in Real-time.
Robotics: Science and Systems Conference (RSS). Berkeley, CA, July 2014.
</description>
<maintainer email="qintonguav@gmail.com">qintong</maintainer>
<license>BSD</license>
<author email="zhangji@cmu.edu">Ji Zhang</author>
<buildtool_depend>catkin</buildtool_depend>
<build_depend>geometry_msgs</build_depend>
<build_depend>nav_msgs</build_depend>
<build_depend>roscpp</build_depend>
<build_depend>rospy</build_depend>
<build_depend>std_msgs</build_depend>
<build_depend>rosbag</build_depend>
<build_depend>sensor_msgs</build_depend>
<build_depend>tf</build_depend>
<build_depend>image_transport</build_depend>
<build_depend>vloam_tf</build_depend>
<run_depend>geometry_msgs</run_depend>
<run_depend>nav_msgs</run_depend>
<run_depend>sensor_msgs</run_depend>
<run_depend>roscpp</run_depend>
<run_depend>rospy</run_depend>
<run_depend>std_msgs</run_depend>
<run_depend>rosbag</run_depend>
<run_depend>tf</run_depend>
<run_depend>image_transport</run_depend>
<run_depend>vloam_tf</run_depend>
<export>
</export>
</package>
================================================
FILE: src/lidar_odometry_mapping/rviz_cfg/lidar_odometry_mapping.rviz
================================================
Panels:
- Class: rviz/Displays
Help Height: 78
Name: Displays
Property Tree Widget:
Expanded:
- /Global Options1
- /Status1
- /odometry1
- /odometry1/odomPath1
- /odometry1/PointCloud21
- /mapping1
- /scan1
- /TF1
- /TF1/Status1
Splitter Ratio: 0.5
Tree Height: 531
- Class: rviz/Selection
Name: Selection
- Class: rviz/Tool Properties
Expanded:
- /2D Pose Estimate1
- /2D Nav Goal1
- /Publish Point1
Name: Tool Properties
Splitter Ratio: 0.5886790156364441
- Class: rviz/Views
Expanded:
- /Current View1
Name: Views
Splitter Ratio: 0.5
- Class: rviz/Time
Experimental: false
Name: Time
SyncMode: 0
SyncSource: surround
Preferences:
PromptSaveOnExit: true
Toolbars:
toolButtonStyle: 2
Visualization Manager:
Class: ""
Displays:
- Alpha: 0.5
Cell Size: 1
Class: rviz/Grid
Color: 160; 160; 164
Enabled: false
Line Style:
Line Width: 0.029999999329447746
Value: Lines
Name: Grid
Normal Cell Count: 0
Offset:
X: 0
Y: 0
Z: 0
Plane: XY
Plane Cell Count: 160
Reference Frame: <Fixed Frame>
Value: false
- Class: rviz/Axes
Enabled: true
Length: 1
Name: Axes
Radius: 0.10000000149011612
Reference Frame: <Fixed Frame>
Value: true
- Class: rviz/Group
Displays:
- Alpha: 1
Buffer Length: 1
Class: rviz/Path
Color: 255; 170; 0
Enabled: false
Head Diameter: 0.30000001192092896
Head Length: 0.20000000298023224
Length: 0.30000001192092896
Line Style: Lines
Line Width: 0.029999999329447746
Name: gtPathlc
Offset:
X: 0
Y: 0
Z: 0
Pose Color: 255; 85; 255
Pose Style: None
Radius: 0.029999999329447746
Shaft Diameter: 0.10000000149011612
Shaft Length: 0.10000000149011612
Topic: /path_gt
Unreliable: false
Value: false
- Alpha: 1
Buffer Length: 1
Class: rviz/Path
Color: 255; 170; 0
Enabled: true
Head Diameter: 0.30000001192092896
Head Length: 0.20000000298023224
Length: 0.30000001192092896
Line Style: Lines
Line Width: 0.029999999329447746
Name: gtPathLidar
Offset:
X: 0
Y: 0
Z: 0
Pose Color: 255; 85; 255
Pose Style: None
Radius: 0.029999999329447746
Shaft Diameter: 0.10000000149011612
Shaft Length: 0.10000000149011612
Topic: /path_gt_lidar
Unreliable: false
Value: true
Enabled: false
Name: GT
- Class: rviz/Group
Displays:
- Alpha: 1
Buffer Length: 1
Class: rviz/Path
Color: 255; 85; 0
Enabled: true
Head Diameter: 0.30000001192092896
Head Length: 0.20000000298023224
Length: 0.30000001192092896
Line Style: Lines
Line Width: 0.10000000149011612
Name: odomPath
Offset:
X: 0
Y: 0
Z: 0
Pose Color: 255; 85; 255
Pose Style: None
Radius: 0.029999999329447746
Shaft Diameter: 0.10000000149011612
Shaft Length: 0.10000000149011612
Topic: /laser_odom_path
Unreliable: false
Value: true
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 255; 255; 255
Color Transformer: FlatColor
Decay Time: 0
Enabled: false
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: 63.10047912597656
Min Color: 0; 0; 0
Min Intensity: -0.0067862942814826965
Name: PointCloud2
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.05000000074505806
Style: Flat Squares
Topic: /velodyne_cloud_2
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: false
Enabled: true
Name: odometry
- Class: rviz/Group
Displays:
- Alpha: 1
Buffer Length: 1
Class: rviz/Path
Color: 25; 255; 0
Enabled: true
Head Diameter: 0.30000001192092896
Head Length: 0.20000000298023224
Length: 0.30000001192092896
Line Style: Lines
Line Width: 0.029999999329447746
Name: mappingPath
Offset:
X: 0
Y: 0
Z: 0
Pose Color: 255; 85; 255
Pose Style: None
Radius: 0.029999999329447746
Shaft Diameter: 0.10000000149011612
Shaft Length: 0.10000000149011612
Topic: /aft_mapped_path
Unreliable: false
Value: true
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 255; 255; 255
Color Transformer: FlatColor
Decay Time: 0
Enabled: true
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: -999999
Min Color: 0; 0; 0
Min Intensity: 999999
Name: allMapCloud
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.05000000074505806
Style: Flat Squares
Topic: /laser_cloud_map
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: true
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 255; 255; 255
Color Transformer: FlatColor
Decay Time: 0
Enabled: true
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: 15
Min Color: 0; 0; 0
Min Intensity: 0
Name: surround
Position Transformer: XYZ
Queue Size: 1
Selectable: true
Size (Pixels): 1
Size (m): 0.05000000074505806
Style: Squares
Topic: /laser_cloud_surround
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: true
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 255; 255; 255
Color Transformer: Intensity
Decay Time: 0
Enabled: true
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: 47.11855697631836
Min Color: 0; 0; 0
Min Intensity: -0.014337587170302868
Name: currPoints
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 2
Size (m): 0.009999999776482582
Style: Points
Topic: /velodyne_cloud_registered
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: true
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 255; 0; 0
Color Transformer: FlatColor
Decay Time: 0
Enabled: false
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: 63.03843307495117
Min Color: 255; 0; 0
Min Intensity: -0.006431261543184519
Name: corner
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.20000000298023224
Style: Flat Squares
Topic: /mapping_corner
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: false
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 255; 0; 0
Color Transformer: FlatColor
Decay Time: 0
Enabled: false
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: 63.0818977355957
Min Color: 0; 0; 0
Min Intensity: -0.005241604056209326
Name: surf
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.20000000298023224
Style: Flat Squares
Topic: /mapping_surf
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: false
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 0; 255; 0
Color Transformer: FlatColor
Decay Time: 0
Enabled: false
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: 17.08800506591797
Min Color: 0; 0; 0
Min Intensity: -0.006790489889681339
Name: used_corner
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.5
Style: Flat Squares
Topic: /mapping_used_corner
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: false
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 0; 255; 0
Color Transformer: FlatColor
Decay Time: 0
Enabled: false
Invert Rainbow: false
Max Color: 0; 255; 0
Max Intensity: 63.0818977355957
Min Color: 0; 0; 0
Min Intensity: -0.005241604056209326
Name: used_surf
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.5
Style: Flat Squares
Topic: /mapping_used_surf
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: false
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 255; 255; 255
Color Transformer: FlatColor
Decay Time: 0
Enabled: false
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: 21.062847137451172
Min Color: 0; 0; 0
Min Intensity: -0.006625724025070667
Name: map_corner
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.10000000149011612
Style: Flat Squares
Topic: /mapping_map_corner
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: false
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 255; 255; 255
Color Transformer: FlatColor
Decay Time: 0
Enabled: false
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: 63.08392333984375
Min Color: 0; 0; 0
Min Intensity: -0.00876065157353878
Name: map_surf
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.10000000149011612
Style: Flat Squares
Topic: /mapping_map_surf
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: false
- Alpha: 1
Buffer Length: 1
Class: rviz/Path
Color: 25; 255; 0
Enabled: false
Head Diameter: 0.30000001192092896
Head Length: 0.20000000298023224
Length: 0.30000001192092896
Line Style: Lines
Line Width: 0.029999999329447746
Name: leftcameraPath
Offset:
X: 0
Y: 0
Z: 0
Pose Color: 255; 85; 255
Pose Style: None
Radius: 0.029999999329447746
Shaft Diameter: 0.10000000149011612
Shaft Length: 0.10000000149011612
Topic: /lc_path
Unreliable: false
Value: false
Enabled: true
Name: mapping
- Class: rviz/Group
Displays:
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 255; 255; 255
Color Transformer: FlatColor
Decay Time: 0
Enabled: false
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: 15
Min Color: 0; 0; 0
Min Intensity: 0
Name: sharp
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.009999999776482582
Style: Points
Topic: /laser_cloud_sharp
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: false
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 255; 255; 255
Color Transformer: Intensity
Decay Time: 0
Enabled: false
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: 63.03987121582031
Min Color: 0; 0; 0
Min Intensity: 0.0263746976852417
Name: flat
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.05000000074505806
Style: Flat Squares
Topic: /laser_cloud_flat
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: false
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 255; 255; 255
Color Transformer: FlatColor
Decay Time: 0
Enabled: true
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: 0.9900000095367432
Min Color: 0; 0; 0
Min Intensity: 0
Name: raw_data
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.05000000074505806
Style: Flat Squares
Topic: /velodyne_points
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: true
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 255; 0; 0
Color Transformer: FlatColor
Decay Time: 0
Enabled: false
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: 0.12361729145050049
Min Color: 0; 0; 0
Min Intensity: -0.00711920578032732
Name: scan
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.20000000298023224
Style: Flat Squares
Topic: /laser_scanid_63
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: false
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/PointCloud2
Color: 255; 85; 0
Color Transformer: FlatColor
Decay Time: 0
Enabled: false
Invert Rainbow: false
Max Color: 255; 255; 255
Max Intensity: 62.07567596435547
Min Color: 0; 0; 0
Min Intensity: -0.007054195739328861
Name: removPoints
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.20000000298023224
Style: Flat Squares
Topic: /laser_remove_points
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: false
Enabled: false
Name: scan
- Class: rviz/Image
Enabled: true
Image Topic: /kitti/image
Max Value: 1
Median window: 5
Min Value: 0
Name: Image
Normalize Range: true
Queue Size: 2
Transport Hint: raw
Unreliable: false
Value: true
- Class: rviz/TF
Enabled: true
Frame Timeout: 15
Frames:
All Enabled: true
aft_mapped:
Value: true
base_link:
Value: true
camera_color_left:
Value: true
camera_color_right:
Value: true
camera_gray_left:
Value: true
camera_gray_right:
Value: true
camera_init:
Value: true
imu_link:
Value: true
velo_link:
Value: true
world:
Value: true
Marker Scale: 1
Name: TF
Show Arrows: true
Show Axes: true
Show Names: true
Tree:
camera_init:
aft_mapped:
{}
Update Interval: 0
Value: true
Enabled: true
Global Options:
Background Color: 0; 0; 0
Default Light: true
Fixed Frame: camera_init
Frame Rate: 30
Name: root
Tools:
- Class: rviz/Interact
Hide Inactive Objects: true
- Class: rviz/MoveCamera
- Class: rviz/Select
- Class: rviz/FocusCamera
- Class: rviz/Measure
- Class: rviz/SetInitialPose
Theta std deviation: 0.2617993950843811
Topic: /initialpose
X std deviation: 0.5
Y std deviation: 0.5
- Class: rviz/SetGoal
Topic: /move_base_simple/goal
- Class: rviz/PublishPoint
Single click: true
Topic: /clicked_point
Value: true
Views:
Current:
Class: rviz/XYOrbit
Distance: 29.97575569152832
Enable Stereo Rendering:
Stereo Eye Separation: 0.05999999865889549
Stereo Focal Distance: 1
Swap Stereo Eyes: false
Value: false
Focal Point:
X: 102.75230407714844
Y: -5.920174598693848
Z: -2.309799909591675
Focal Shape Fixed Size: true
Focal Shape Size: 0.05000000074505806
Invert Z Axis: false
Name: Current View
Near Clip Distance: 0.009999999776482582
Pitch: 0.30479696393013
Target Frame: body
Value: XYOrbit (rviz)
Yaw: 3.1050009727478027
Saved: ~
Window Geometry:
Displays:
collapsed: false
Height: 1025
Hide Left Dock: false
Hide Right Dock: true
Image:
collapsed: false
QMainWindow State: 000000ff00000000fd00000004000000000000021b00000363fc0200000009fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb000000100044006900730070006c006100790073010000003d0000029e000000c900fffffffb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000000a0049006d00610067006501000002e1000000bf0000001600ffffff000000010000010f00000396fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fb0000000a00560069006500770073000000002800000396000000a400fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e100000197000000030000073d0000003efc0100000002fb0000000800540069006d006501000000000000073d000002eb00fffffffb0000000800540069006d006501000000000000045000000000000000000000051c0000036300000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000
Selection:
collapsed: false
Time:
collapsed: false
Tool Properties:
collapsed: false
Views:
collapsed: true
Width: 1853
X: 67
Y: 27
================================================
FILE: src/lidar_odometry_mapping/src/laser_mapping.cpp
================================================
// This is an advanced implementation of the algorithm described in the following paper:
// J. Zhang and S. Singh. LOAM: Lidar Odometry and Mapping in Real-time.
// Robotics: Science and Systems Conference (RSS). Berkeley, CA, July 2014.
// Modifier: Tong Qin qintonguav@gmail.com
// Shaozu Cao saozu.cao@connect.ust.hk
// Copyright 2013, Ji Zhang, Carnegie Mellon University
// Further contributions copyright (c) 2016, Southwest Research Institute
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
//
// 1. Redistributions of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
// 2. Redistributions in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
// 3. Neither the name of the copyright holder nor the names of its
// contributors may be used to endorse or promote products derived from this
// software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
// ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
// LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
// POSSIBILITY OF SUCH DAMAGE.
#include <lidar_odometry_mapping/laser_mapping.h>
namespace vloam
{
void LaserMapping::init(std::shared_ptr<VloamTF> &vloam_tf_)
{
vloam_tf = vloam_tf_;
if (!ros::param::get("loam_verbose_level", verbose_level))
ROS_BREAK();
frameCount = 0;
// input: from odom
laserCloudCornerLast = boost::make_shared<pcl::PointCloud<PointType>>();
laserCloudSurfLast = boost::make_shared<pcl::PointCloud<PointType>>();
// output: all visible cube points
laserCloudSurround = boost::make_shared<pcl::PointCloud<PointType>>();
// surround points in map to build tree
laserCloudCornerFromMap = boost::make_shared<pcl::PointCloud<PointType>>();
laserCloudSurfFromMap = boost::make_shared<pcl::PointCloud<PointType>>();
// input & output: points in one frame. local --> global
laserCloudFullRes = boost::make_shared<pcl::PointCloud<PointType>>();
// points in every cube
for (int j = 0; j < laserCloudNum; ++j)
{
laserCloudCornerArray[j] = boost::make_shared<pcl::PointCloud<PointType>>();
laserCloudSurfArray[j] = boost::make_shared<pcl::PointCloud<PointType>>();
}
// kd-tree
kdtreeCornerFromMap = boost::make_shared<pcl::KdTreeFLANN<PointType>>();
kdtreeSurfFromMap = boost::make_shared<pcl::KdTreeFLANN<PointType>>();
parameters[0] = 0.0;
parameters[1] = 0.0;
parameters[2] = 0.0;
parameters[3] = 1.0;
parameters[4] = 0.0;
parameters[5] = 0.0;
parameters[6] = 0.0;
new (&q_w_curr) Eigen::Map<Eigen::Quaterniond>(parameters);
new (&t_w_curr) Eigen::Map<Eigen::Vector3d>(parameters + 4);
// wmap_T_odom * odom_T_curr = wmap_T_curr;
// transformation between odom's world and map's world frame
q_wmap_wodom = Eigen::Quaterniond(1, 0, 0, 0);
t_wmap_wodom = Eigen::Vector3d(0, 0, 0);
q_wodom_curr = Eigen::Quaterniond(1, 0, 0, 0);
t_wodom_curr = Eigen::Vector3d(0, 0, 0);
lineRes = 0;
planeRes = 0;
if (!ros::param::get("mapping_line_resolution", lineRes))
ROS_BREAK();
if (!ros::param::get("mapping_plane_resolution", planeRes))
ROS_BREAK();
ROS_INFO("line resolution %f plane resolution %f \n", lineRes, planeRes);
downSizeFilterCorner.setLeafSize(lineRes, lineRes, lineRes);
downSizeFilterSurf.setLeafSize(planeRes, planeRes, planeRes);
pubLaserCloudSurround = nh.advertise<sensor_msgs::PointCloud2>("/laser_cloud_surround", 100);
pubLaserCloudMap = nh.advertise<sensor_msgs::PointCloud2>("/laser_cloud_map", 100);
pubLaserCloudFullRes = nh.advertise<sensor_msgs::PointCloud2>("/velodyne_cloud_registered", 100);
pubOdomAftMapped = nh.advertise<nav_msgs::Odometry>("/aft_mapped_to_init", 100);
// pubOdomAftMappedHighFrec = nh->advertise<nav_msgs::Odometry>("/aft_mapped_to_init_high_frec", 100);
pubLaserAfterMappedPath = nh.advertise<nav_msgs::Path>("/aft_mapped_path", 100);
laserAfterMappedPath.poses.clear();
// for (int i = 0; i < laserCloudNum; i++)
// {
// laserCloudCornerArray[i].reset(new pcl::PointCloud<PointType>());
// laserCloudSurfArray[i].reset(new pcl::PointCloud<PointType>());
// } // TODO: check when this is needed
laserCloudValidNum = 0;
laserCloudSurroundNum = 0;
if (!ros::param::get("mapping_skip_frame", mapping_skip_frame))
ROS_BREAK();
if (!ros::param::get("map_pub_number", map_pub_number))
ROS_BREAK();
}
void LaserMapping::reset()
{
laserCloudValidNum = 0;
laserCloudSurroundNum = 0;
}
// // set initial guess
// void LaserMapping::transformAssociateToMap()
// {
// q_w_curr = q_wmap_wodom * q_wodom_curr;
// t_w_curr = q_wmap_wodom * t_wodom_curr + t_wmap_wodom;
// }
void LaserMapping::transformUpdate()
{
q_wmap_wodom = q_w_curr * q_wodom_curr.inverse();
t_wmap_wodom = t_w_curr - q_wmap_wodom * t_wodom_curr;
}
void LaserMapping::pointAssociateToMap(PointType const *const pi, PointType *const po)
{
Eigen::Vector3d point_curr(pi->x, pi->y, pi->z);
Eigen::Vector3d point_w = q_w_curr * point_curr + t_w_curr;
po->x = point_w.x();
po->y = point_w.y();
po->z = point_w.z();
po->intensity = pi->intensity;
// po->intensity = 1.0;
}
void LaserMapping::pointAssociateTobeMapped(PointType const *const pi, PointType *const po)
{
Eigen::Vector3d point_w(pi->x, pi->y, pi->z);
Eigen::Vector3d point_curr = q_w_curr.inverse() * (point_w - t_w_curr);
po->x = point_curr.x();
po->y = point_curr.y();
po->z = point_curr.z();
po->intensity = pi->intensity;
}
void LaserMapping::input(const pcl::PointCloud<PointType>::Ptr &laserCloudCornerLast_,
const pcl::PointCloud<PointType>::Ptr &laserCloudSurfLast_,
const pcl::PointCloud<PointType>::Ptr &laserCloudFullRes_,
const Eigen::Quaterniond &q_wodom_curr_, const Eigen::Vector3d &t_wodom_curr_,
const bool &skip_frame_)
{
skip_frame = skip_frame_;
if (!skip_frame)
{
laserCloudCornerLast = boost::make_shared<pcl::PointCloud<PointType>>(*laserCloudCornerLast_);
laserCloudSurfLast = boost::make_shared<pcl::PointCloud<PointType>>(*laserCloudSurfLast_);
laserCloudFullRes = boost::make_shared<pcl::PointCloud<PointType>>(*laserCloudFullRes_);
}
q_wodom_curr = q_wodom_curr_;
t_wodom_curr = t_wodom_curr_;
// transformAssociateToMap
if (skip_frame)
{
q_w_curr_highfreq = q_wmap_wodom * q_wodom_curr;
t_w_curr_highfreq = q_wmap_wodom * t_wodom_curr + t_wmap_wodom;
}
else
{
q_w_curr = q_wmap_wodom * q_wodom_curr;
t_w_curr = q_wmap_wodom * t_wodom_curr + t_wmap_wodom;
}
}
void LaserMapping::solveMapping()
{
// this->reset();
t_whole.tic();
// transformAssociateToMap();
TicToc t_shift;
int centerCubeI = int((t_w_curr.x() + 25.0) / 50.0) + laserCloudCenWidth;
int centerCubeJ = int((t_w_curr.y() + 25.0) / 50.0) + laserCloudCenHeight;
int centerCubeK = int((t_w_curr.z() + 25.0) / 50.0) + laserCloudCenDepth;
if (t_w_curr.x() + 25.0 < 0)
centerCubeI--;
if (t_w_curr.y() + 25.0 < 0)
centerCubeJ--;
if (t_w_curr.z() + 25.0 < 0)
centerCubeK--;
while (centerCubeI < 3)
{
for (int j = 0; j < laserCloudHeight; j++)
{
for (int k = 0; k < laserCloudDepth; k++)
{
int i = laserCloudWidth - 1;
pcl::PointCloud<PointType>::Ptr laserCloudCubeCornerPointer =
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
pcl::PointCloud<PointType>::Ptr laserCloudCubeSurfPointer =
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
for (; i >= 1; i--)
{
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCornerArray[i - 1 + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudSurfArray[i - 1 + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
}
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCubeCornerPointer;
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCubeSurfPointer;
laserCloudCubeCornerPointer->clear();
laserCloudCubeSurfPointer->clear();
}
}
centerCubeI++;
laserCloudCenWidth++;
}
while (centerCubeI >= laserCloudWidth - 3)
{
for (int j = 0; j < laserCloudHeight; j++)
{
for (int k = 0; k < laserCloudDepth; k++)
{
int i = 0;
pcl::PointCloud<PointType>::Ptr laserCloudCubeCornerPointer =
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
pcl::PointCloud<PointType>::Ptr laserCloudCubeSurfPointer =
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
for (; i < laserCloudWidth - 1; i++)
{
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCornerArray[i + 1 + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudSurfArray[i + 1 + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
}
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCubeCornerPointer;
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCubeSurfPointer;
laserCloudCubeCornerPointer->clear();
laserCloudCubeSurfPointer->clear();
}
}
centerCubeI--;
laserCloudCenWidth--;
}
while (centerCubeJ < 3)
{
for (int i = 0; i < laserCloudWidth; i++)
{
for (int k = 0; k < laserCloudDepth; k++)
{
int j = laserCloudHeight - 1;
pcl::PointCloud<PointType>::Ptr laserCloudCubeCornerPointer =
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
pcl::PointCloud<PointType>::Ptr laserCloudCubeSurfPointer =
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
for (; j >= 1; j--)
{
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCornerArray[i + laserCloudWidth * (j - 1) + laserCloudWidth * laserCloudHeight * k];
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudSurfArray[i + laserCloudWidth * (j - 1) + laserCloudWidth * laserCloudHeight * k];
}
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCubeCornerPointer;
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCubeSurfPointer;
laserCloudCubeCornerPointer->clear();
laserCloudCubeSurfPointer->clear();
}
}
centerCubeJ++;
laserCloudCenHeight++;
}
while (centerCubeJ >= laserCloudHeight - 3)
{
for (int i = 0; i < laserCloudWidth; i++)
{
for (int k = 0; k < laserCloudDepth; k++)
{
int j = 0;
pcl::PointCloud<PointType>::Ptr laserCloudCubeCornerPointer =
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
pcl::PointCloud<PointType>::Ptr laserCloudCubeSurfPointer =
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
for (; j < laserCloudHeight - 1; j++)
{
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCornerArray[i + laserCloudWidth * (j + 1) + laserCloudWidth * laserCloudHeight * k];
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudSurfArray[i + laserCloudWidth * (j + 1) + laserCloudWidth * laserCloudHeight * k];
}
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCubeCornerPointer;
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCubeSurfPointer;
laserCloudCubeCornerPointer->clear();
laserCloudCubeSurfPointer->clear();
}
}
centerCubeJ--;
laserCloudCenHeight--;
}
while (centerCubeK < 3)
{
for (int i = 0; i < laserCloudWidth; i++)
{
for (int j = 0; j < laserCloudHeight; j++)
{
int k = laserCloudDepth - 1;
pcl::PointCloud<PointType>::Ptr laserCloudCubeCornerPointer =
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
pcl::PointCloud<PointType>::Ptr laserCloudCubeSurfPointer =
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
for (; k >= 1; k--)
{
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * (k - 1)];
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * (k - 1)];
}
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCubeCornerPointer;
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCubeSurfPointer;
laserCloudCubeCornerPointer->clear();
laserCloudCubeSurfPointer->clear();
}
}
centerCubeK++;
laserCloudCenDepth++;
}
while (centerCubeK >= laserCloudDepth - 3)
{
for (int i = 0; i < laserCloudWidth; i++)
{
for (int j = 0; j < laserCloudHeight; j++)
{
int k = 0;
pcl::PointCloud<PointType>::Ptr laserCloudCubeCornerPointer =
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
pcl::PointCloud<PointType>::Ptr laserCloudCubeSurfPointer =
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k];
for (; k < laserCloudDepth - 1; k++)
{
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * (k + 1)];
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * (k + 1)];
}
laserCloudCornerArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCubeCornerPointer;
laserCloudSurfArray[i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k] =
laserCloudCubeSurfPointer;
laserCloudCubeCornerPointer->clear();
laserCloudCubeSurfPointer->clear();
}
}
centerCubeK--;
laserCloudCenDepth--;
}
for (int i = centerCubeI - 2; i <= centerCubeI + 2; i++)
{
for (int j = centerCubeJ - 2; j <= centerCubeJ + 2; j++)
{
for (int k = centerCubeK - 1; k <= centerCubeK + 1; k++)
{
if (i >= 0 && i < laserCloudWidth && j >= 0 && j < laserCloudHeight && k >= 0 && k < laserCloudDepth)
{
laserCloudValidInd[laserCloudValidNum] = i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k;
laserCloudValidNum++;
laserCloudSurroundInd[laserCloudSurroundNum] =
i + laserCloudWidth * j + laserCloudWidth * laserCloudHeight * k;
laserCloudSurroundNum++;
}
}
}
}
laserCloudCornerFromMap->clear();
laserCloudSurfFromMap->clear();
for (int i = 0; i < laserCloudValidNum; i++)
{
*laserCloudCornerFromMap += *laserCloudCornerArray[laserCloudValidInd[i]];
*laserCloudSurfFromMap += *laserCloudSurfArray[laserCloudValidInd[i]];
}
int laserCloudCornerFromMapNum = laserCloudCornerFromMap->points.size();
int laserCloudSurfFromMapNum = laserCloudSurfFromMap->points.size();
pcl::PointCloud<PointType>::Ptr laserCloudCornerStack(new pcl::PointCloud<PointType>());
downSizeFilterCorner.setInputCloud(laserCloudCornerLast);
downSizeFilterCorner.filter(*laserCloudCornerStack);
int laserCloudCornerStackNum = laserCloudCornerStack->points.size();
pcl::PointCloud<PointType>::Ptr laserCloudSurfStack(new pcl::PointCloud<PointType>());
downSizeFilterSurf.setInputCloud(laserCloudSurfLast);
downSizeFilterSurf.filter(*laserCloudSurfStack);
int laserCloudSurfStackNum = laserCloudSurfStack->points.size();
if (verbose_level > 1)
{
ROS_INFO("map prepare time %f ms\n", t_shift.toc());
ROS_INFO("map corner num %d surf num %d \n", laserCloudCornerFromMapNum, laserCloudSurfFromMapNum);
}
if (laserCloudCornerFromMapNum > 10 && laserCloudSurfFromMapNum > 50)
{
TicToc t_opt;
TicToc t_tree;
kdtreeCornerFromMap->setInputCloud(laserCloudCornerFromMap);
kdtreeSurfFromMap->setInputCloud(laserCloudSurfFromMap);
if (verbose_level > 1)
ROS_INFO("build tree time %f ms \n", t_tree.toc());
for (int iterCount = 0; iterCount < 2; iterCount++)
{
// ceres::LossFunction *loss_function = NULL;
ceres::LossFunction *loss_function = new ceres::HuberLoss(0.1);
ceres::LocalParameterization *q_parameterization = new ceres::EigenQuaternionParameterization();
ceres::Problem::Options problem_options;
ceres::Problem problem(problem_options);
problem.AddParameterBlock(parameters, 4, q_parameterization);
problem.AddParameterBlock(parameters + 4, 3);
TicToc t_data;
int corner_num = 0;
for (int i = 0; i < laserCloudCornerStackNum; i++)
{
pointOri = laserCloudCornerStack->points[i];
// double sqrtDis = pointOri.x * pointOri.x + pointOri.y * pointOri.y + pointOri.z * pointOri.z;
pointAssociateToMap(&pointOri, &pointSel);
kdtreeCornerFromMap->nearestKSearch(pointSel, 5, pointSearchInd, pointSearchSqDis);
if (pointSearchSqDis[4] < 1.0)
{
std::vector<Eigen::Vector3d> nearCorners;
Eigen::Vector3d center(0, 0, 0);
for (int j = 0; j < 5; j++)
{
Eigen::Vector3d tmp(laserCloudCornerFromMap->points[pointSearchInd[j]].x,
laserCloudCornerFromMap->points[pointSearchInd[j]].y,
laserCloudCornerFromMap->points[pointSearchInd[j]].z);
center = center + tmp;
nearCorners.push_back(tmp);
}
center = center / 5.0;
Eigen::Matrix3d covMat = Eigen::Matrix3d::Zero();
for (int j = 0; j < 5; j++)
{
Eigen::Matrix<double, 3, 1> tmpZeroMean = nearCorners[j] - center;
covMat = covMat + tmpZeroMean * tmpZeroMean.transpose();
}
Eigen::SelfAdjointEigenSolver<Eigen::Matrix3d> saes(covMat);
// check whether this is indeed a line feature
// note: the Eigen library sorts eigenvalues in increasing order
Eigen::Vector3d unit_direction = saes.eigenvectors().col(2);
Eigen::Vector3d curr_point(pointOri.x, pointOri.y, pointOri.z);
if (saes.eigenvalues()[2] > 3 * saes.eigenvalues()[1])
{
Eigen::Vector3d point_on_line = center;
Eigen::Vector3d point_a, point_b;
point_a = 0.1 * unit_direction + point_on_line;
point_b = -0.1 * unit_direction + point_on_line;
ceres::CostFunction *cost_function = LidarEdgeFactor::Create(curr_point, point_a, point_b, 1.0);
problem.AddResidualBlock(cost_function, loss_function, parameters, parameters + 4);
corner_num++;
}
}
/*
else if(pointSearchSqDis[4] < 0.01 * sqrtDis)
{
Eigen::Vector3d center(0, 0, 0);
for (int j = 0; j < 5; j++)
{
Eigen::Vector3d tmp(laserCloudCornerFromMap->points[pointSearchInd[j]].x,
laserCloudCornerFromMap->points[pointSearchInd[j]].y,
laserCloudCornerFromMap->points[pointSearchInd[j]].z);
center = center + tmp;
}
center = center / 5.0;
Eigen::Vector3d curr_point(pointOri.x, pointOri.y, pointOri.z);
ceres::CostFunction *cost_function = LidarDistanceFactor::Create(curr_point, center);
problem.AddResidualBlock(cost_function, loss_function, parameters, parameters + 4);
}
*/
}
int surf_num = 0;
for (int i = 0; i < laserCloudSurfStackNum; i++)
{
pointOri = laserCloudSurfStack->points[i];
// double sqrtDis = pointOri.x * pointOri.x + pointOri.y * pointOri.y + pointOri.z * pointOri.z;
pointAssociateToMap(&pointOri, &pointSel);
kdtreeSurfFromMap->nearestKSearch(pointSel, 5, pointSearchInd, pointSearchSqDis);
Eigen::Matrix<double, 5, 3> matA0;
Eigen::Matrix<double, 5, 1> matB0 = -1 * Eigen::Matrix<double, 5, 1>::Ones();
if (pointSearchSqDis[4] < 1.0)
{
for (int j = 0; j < 5; j++)
{
matA0(j, 0) = laserCloudSurfFromMap->points[pointSearchInd[j]].x;
matA0(j, 1) = laserCloudSurfFromMap->points[pointSearchInd[j]].y;
matA0(j, 2) = laserCloudSurfFromMap->points[pointSearchInd[j]].z;
// ROS_INFO(" pts %f %f %f ", matA0(j, 0), matA0(j, 1), matA0(j, 2));
}
// fit the plane normal
Eigen::Vector3d norm = matA0.colPivHouseholderQr().solve(matB0);
double negative_OA_dot_norm = 1 / norm.norm();
norm.normalize();
// Here n = (pa, pb, pc) is the unit normal of the plane
bool planeValid = true;
for (int j = 0; j < 5; j++)
{
// if |OX . n| > 0.2, the plane does not fit the neighbors well
if (fabs(norm(0) * laserCloudSurfFromMap->points[pointSearchInd[j]].x +
norm(1) * laserCloudSurfFromMap->points[pointSearchInd[j]].y +
norm(2) * laserCloudSurfFromMap->points[pointSearchInd[j]].z + negative_OA_dot_norm) > 0.2)
{
planeValid = false;
break;
}
}
Eigen::Vector3d curr_point(pointOri.x, pointOri.y, pointOri.z);
if (planeValid)
{
ceres::CostFunction *cost_function = LidarPlaneNormFactor::Create(curr_point, norm, negative_OA_dot_norm);
problem.AddResidualBlock(cost_function, loss_function, parameters, parameters + 4);
surf_num++;
}
}
/*
else if(pointSearchSqDis[4] < 0.01 * sqrtDis)
{
Eigen::Vector3d center(0, 0, 0);
for (int j = 0; j < 5; j++)
{
Eigen::Vector3d tmp(laserCloudSurfFromMap->points[pointSearchInd[j]].x,
laserCloudSurfFromMap->points[pointSearchInd[j]].y,
laserCloudSurfFromMap->points[pointSearchInd[j]].z);
center = center + tmp;
}
center = center / 5.0;
Eigen::Vector3d curr_point(pointOri.x, pointOri.y, pointOri.z);
ceres::CostFunction *cost_function = LidarDistanceFactor::Create(curr_point, center);
problem.AddResidualBlock(cost_function, loss_function, parameters, parameters + 4);
}
*/
}
if (verbose_level > 1)
{
// ROS_INFO("corner num %d used corner num %d \n", laserCloudCornerStackNum, corner_num);
// ROS_INFO("surf num %d used surf num %d \n", laserCloudSurfStackNum, surf_num);
ROS_INFO("mapping data association time %f ms \n", t_data.toc());
}
TicToc t_solver;
ceres::Solver::Options options;
options.linear_solver_type = ceres::DENSE_QR;
options.max_num_iterations = 4;
options.minimizer_progress_to_stdout = false;
options.check_gradients = false;
options.gradient_check_relative_precision = 1e-4;
ceres::Solver::Summary summary;
ceres::Solve(options, &problem, &summary);
if (verbose_level > 1)
ROS_INFO("mapping solver time %f ms \n", t_solver.toc());
// ROS_INFO("time %f \n", timeLaserOdometry);
// ROS_INFO("corner factor num %d surf factor num %d\n", corner_num, surf_num);
// ROS_INFO("result q %f %f %f %f result t %f %f %f\n", parameters[3], parameters[0], parameters[1],
// parameters[2], parameters[4], parameters[5], parameters[6]);
}
if (verbose_level > 1)
ROS_INFO("mapping optimization time %f \n", t_opt.toc());
}
else
{
if (verbose_level > 1)
ROS_WARN("Map corner and surf point counts are not enough");
}
transformUpdate();
TicToc t_add;
for (int i = 0; i < laserCloudCornerStackNum; i++)
{
pointAssociateToMap(&laserCloudCornerStack->points[i], &pointSel);
int cubeI = int((pointSel.x + 25.0) / 50.0) + laserCloudCenWidth;
int cubeJ = int((pointSel.y + 25.0) / 50.0) + laserCloudCenHeight;
int cubeK = int((pointSel.z + 25.0) / 50.0) + laserCloudCenDepth;
if (pointSel.x + 25.0 < 0)
cubeI--;
if (pointSel.y + 25.0 < 0)
cubeJ--;
if (pointSel.z + 25.0 < 0)
cubeK--;
if (cubeI >= 0 && cubeI < laserCloudWidth && cubeJ >= 0 && cubeJ < laserCloudHeight && cubeK >= 0 &&
cubeK < laserCloudDepth)
{
int cubeInd = cubeI + laserCloudWidth * cubeJ + laserCloudWidth * laserCloudHeight * cubeK;
laserCloudCornerArray[cubeInd]->push_back(pointSel);
}
}
for (int i = 0; i < laserCloudSurfStackNum; i++)
{
pointAssociateToMap(&laserCloudSurfStack->points[i], &pointSel);
int cubeI = int((pointSel.x + 25.0) / 50.0) + laserCloudCenWidth;
int cubeJ = int((pointSel.y + 25.0) / 50.0) + laserCloudCenHeight;
int cubeK = int((pointSel.z + 25.0) / 50.0) + laserCloudCenDepth;
if (pointSel.x + 25.0 < 0)
cubeI--;
if (pointSel.y + 25.0 < 0)
cubeJ--;
if (pointSel.z + 25.0 < 0)
cubeK--;
if (cubeI >= 0 && cubeI < laserCloudWidth && cubeJ >= 0 && cubeJ < laserCloudHeight && cubeK >= 0 &&
cubeK < laserCloudDepth)
{
int cubeInd = cubeI + laserCloudWidth * cubeJ + laserCloudWidth * laserCloudHeight * cubeK;
laserCloudSurfArray[cubeInd]->push_back(pointSel);
}
}
if (verbose_level > 1)
ROS_INFO("add points time %f ms\n", t_add.toc());
TicToc t_filter;
for (int i = 0; i < laserCloudValidNum; i++)
{
int ind = laserCloudValidInd[i];
pcl::PointCloud<PointType>::Ptr tmpCorner(new pcl::PointCloud<PointType>());
downSizeFilterCorner.setInputCloud(laserCloudCornerArray[ind]);
downSizeFilterCorner.filter(*tmpCorner);
laserCloudCornerArray[ind] = tmpCorner;
pcl::PointCloud<PointType>::Ptr tmpSurf(new pcl::PointCloud<PointType>());
downSizeFilterSurf.setInputCloud(laserCloudSurfArray[ind]);
downSizeFilterSurf.filter(*tmpSurf);
laserCloudSurfArray[ind] = tmpSurf;
}
if (verbose_level > 1)
ROS_INFO("filter time %f ms \n", t_filter.toc());
frameCount++;
}
void LaserMapping::publish()
{ // TODO: take in global skip frame
TicToc t_pub;
nav_msgs::Odometry odomAftMapped;
odomAftMapped.header.frame_id = "map";
odomAftMapped.child_frame_id = "aft_mapped";
odomAftMapped.header.stamp = ros::Time::now(); // TODO: globally config time stamp
if (!skip_frame)
{
odomAftMapped.pose.pose.orientation.x = q_w_curr.x();
odomAftMapped.pose.pose.orientation.y = q_w_curr.y();
odomAftMapped.pose.pose.orientation.z = q_w_curr.z();
odomAftMapped.pose.pose.orientation.w = q_w_curr.w();
odomAftMapped.pose.pose.position.x = t_w_curr.x();
odomAftMapped.pose.pose.position.y = t_w_curr.y();
odomAftMapped.pose.pose.position.z = t_w_curr.z();
vloam_tf->world_MOT_base_last.setOrigin(tf2::Vector3(t_w_curr.x(), t_w_curr.y(), t_w_curr.z()));
vloam_tf->world_MOT_base_last.setRotation(tf2::Quaternion(q_w_curr.x(), q_w_curr.y(), q_w_curr.z(), q_w_curr.w()));
// ROS_INFO(
// "lidar_mapping: q = %.4f, %.4f, %.4f, %.4f; t = %.4f, %.4f, %.4f",
// q_w_curr.x(),
// q_w_curr.y(),
// q_w_curr.z(),
// q_w_curr.w(),
// t_w_curr.x(),
// t_w_curr.y(),
// t_w_curr.z()
// );
}
else
{
odomAftMapped.pose.pose.orientation.x = q_w_curr_highfreq.x();
odomAftMapped.pose.pose.orientation.y = q_w_curr_highfreq.y();
odomAftMapped.pose.pose.orientation.z = q_w_curr_highfreq.z();
odomAftMapped.pose.pose.orientation.w = q_w_curr_highfreq.w();
odomAftMapped.pose.pose.position.x = t_w_curr_highfreq.x();
odomAftMapped.pose.pose.position.y = t_w_curr_highfreq.y();
odomAftMapped.pose.pose.position.z = t_w_curr_highfreq.z();
vloam_tf->world_MOT_base_last.setOrigin(
tf2::Vector3(t_w_curr_highfreq.x(), t_w_curr_highfreq.y(), t_w_curr_highfreq.z()));
vloam_tf->world_MOT_base_last.setRotation(
tf2::Quaternion(q_w_curr_highfreq.x(), q_w_curr_highfreq.y(), q_w_curr_highfreq.z(), q_w_curr_highfreq.w()));
}
pubOdomAftMapped.publish(odomAftMapped);
geometry_msgs::PoseStamped laserAfterMappedPose;
laserAfterMappedPose.header = odomAftMapped.header;
laserAfterMappedPose.pose = odomAftMapped.pose.pose;
laserAfterMappedPath.header.stamp = odomAftMapped.header.stamp;
laserAfterMappedPath.header.frame_id = "map";
laserAfterMappedPath.poses.push_back(laserAfterMappedPose);
pubLaserAfterMappedPath.publish(laserAfterMappedPath);
static tf::TransformBroadcaster br;
tf::Transform transform;
tf::Quaternion q;
transform.setOrigin(tf::Vector3(t_w_curr(0), t_w_curr(1), t_w_curr(2)));
q.setW(q_w_curr.w());
q.setX(q_w_curr.x());
q.setY(q_w_curr.y());
q.setZ(q_w_curr.z());
transform.setRotation(q);
br.sendTransform(tf::StampedTransform(transform, odomAftMapped.header.stamp, "map", "aft_mapped"));
if ((frameCount * mapping_skip_frame) % map_pub_number == 0) // 0.5 Hz?
{
pcl::PointCloud<PointType> laserCloudMap;
for (int i = 0; i < laserCloudNum; i++)  // iterate over all cubes (was hard-coded 4851)
{
laserCloudMap += *laserCloudCornerArray[i];
laserCloudMap += *laserCloudSurfArray[i];
}
sensor_msgs::PointCloud2 laserCloudMsg;
pcl::toROSMsg(laserCloudMap, laserCloudMsg);
laserCloudMsg.header.stamp = ros::Time::now(); // TODO: globally config time stamp
laserCloudMsg.header.frame_id = "velo_origin";
pubLaserCloudMap.publish(laserCloudMsg);
if (verbose_level > 1)
ROS_INFO("publishing the map, with %ld points", laserCloudMap.size());
}
int laserCloudFullResNum = laserCloudFullRes->points.size();
for (int i = 0; i < laserCloudFullResNum; i++)
{
pointAssociateToMap(&laserCloudFullRes->points[i], &laserCloudFullRes->points[i]);
}
sensor_msgs::PointCloud2 laserCloudFullRes3; // Q: what's the difference between laserCloudFullRes 1 2 and 3?
pcl::toROSMsg(*laserCloudFullRes, laserCloudFullRes3);
laserCloudFullRes3.header.stamp = ros::Time::now(); // TODO: globally config time stamp
laserCloudFullRes3.header.frame_id = "map";
pubLaserCloudFullRes.publish(laserCloudFullRes3);
if (verbose_level > 1)
{
ROS_INFO("mapping pub time %f ms \n", t_pub.toc());
ROS_INFO("whole mapping time %f ms +++++\n",
t_whole.toc()); // TODO check if t_whole tictoc obj can be shared in class
}
}
} // namespace vloam
================================================
FILE: src/lidar_odometry_mapping/src/laser_odometry.cpp
================================================
// This is an advanced implementation of the algorithm described in the following paper:
// J. Zhang and S. Singh. LOAM: Lidar Odometry and Mapping in Real-time.
// Robotics: Science and Systems Conference (RSS). Berkeley, CA, July 2014.
// Modifier: Tong Qin qintonguav@gmail.com
// Shaozu Cao saozu.cao@connect.ust.hk
// Copyright 2013, Ji Zhang, Carnegie Mellon University
// Further contributions copyright (c) 2016, Southwest Research Institute
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
//
// 1. Redistributions of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
// 2. Redistributions in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
// 3. Neither the name of the copyright holder nor the names of its
// contributors may be used to endorse or promote products derived from this
// software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
// ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
// LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
// POSSIBILITY OF SUCH DAMAGE.
#include <lidar_odometry_mapping/laser_odometry.h>
namespace vloam
{
// init before a new rosbag comes
void LaserOdometry::init(std::shared_ptr<VloamTF>& vloam_tf_)
{
vloam_tf = vloam_tf_;
if (!ros::param::get("loam_verbose_level", verbose_level))
ROS_BREAK();
if (!ros::param::get("detach_VO_LO", detach_VO_LO))
ROS_BREAK();
corner_correspondence = 0;
plane_correspondence = 0;
if (!ros::param::get("mapping_skip_frame", mapping_skip_frame))
ROS_BREAK();
systemInited = false;
timeCornerPointsSharp = 0;
timeCornerPointsLessSharp = 0;
timeSurfPointsFlat = 0;
timeSurfPointsLessFlat = 0;
timeLaserCloudFullRes = 0;
kdtreeCornerLast = boost::make_shared<pcl::KdTreeFLANN<pcl::PointXYZI>>();
kdtreeSurfLast = boost::make_shared<pcl::KdTreeFLANN<pcl::PointXYZI>>();
cornerPointsSharp = boost::make_shared<pcl::PointCloud<PointType>>();
cornerPointsLessSharp = boost::make_shared<pcl::PointCloud<PointType>>();
surfPointsFlat = boost::make_shared<pcl::PointCloud<PointType>>();
surfPointsLessFlat = boost::make_shared<pcl::PointCloud<PointType>>();
laserCloudCornerLast = boost::make_shared<pcl::PointCloud<PointType>>();
laserCloudSurfLast = boost::make_shared<pcl::PointCloud<PointType>>();
laserCloudFullRes = boost::make_shared<pcl::PointCloud<PointType>>();
laserCloudCornerLastNum = 0;
laserCloudSurfLastNum = 0;
// Transformation from current frame to world frame
q_w_curr = Eigen::Quaterniond(1, 0, 0, 0);
t_w_curr = Eigen::Vector3d(0, 0, 0);
// q_curr_last(x, y, z, w), t_curr_last
para_q[0] = 0.0;
para_q[1] = 0.0;
para_q[2] = 0.0;
para_q[3] = 1.0;
para_t[0] = 0.0;
para_t[1] = 0.0;
para_t[2] = 0.0;
// // q_last_curr = Eigen::Map<Eigen::Quaterniond>(para_q);
// // t_last_curr = Eigen::Map<Eigen::Vector3d>(para_t);
new (&q_last_curr) Eigen::Map<Eigen::Quaterniond>(para_q);
new (&t_last_curr) Eigen::Map<Eigen::Vector3d>(para_t);
// ROS_INFO("INIT raw: q_last_curr: w = %f x = %f, y = %f, z = %f", para_q[3], para_q[0], para_q[1], para_q[2]);
// ROS_INFO("INIT: q_last_curr: w = %f x = %f, y = %f, z = %f", q_last_curr.w(), q_last_curr.x(), q_last_curr.y(),
// q_last_curr.z()); ROS_INFO("INIT: t_last_curr: x = %f, y = %f, z = %f", t_last_curr.x(), t_last_curr.y(),
// t_last_curr.z());
// Eigen::Map<Eigen::Quaterniond> q_last_curr2(para_q);
// Eigen::Map<Eigen::Vector3d> t_last_curr2(para_t);
// ROS_INFO("INIT2: q_last_curr: w = %f x = %f, y = %f, z = %f", q_last_curr2.w(), q_last_curr2.x(), q_last_curr2.y(),
// q_last_curr2.z()); ROS_INFO("INIT2: t_last_curr: x = %f, y = %f, z = %f", t_last_curr2.x(), t_last_curr2.y(),
// t_last_curr2.z());
pubLaserCloudCornerLast = nh.advertise<sensor_msgs::PointCloud2>("/laser_cloud_corner_last", 100);
pubLaserCloudSurfLast = nh.advertise<sensor_msgs::PointCloud2>("/laser_cloud_surf_last", 100);
pubLaserCloudFullRes = nh.advertise<sensor_msgs::PointCloud2>("/velodyne_cloud_3", 100);
pubLaserOdometry = nh.advertise<nav_msgs::Odometry>("/laser_odom_to_init", 100);
pubLaserPath = nh.advertise<nav_msgs::Path>("/laser_odom_path", 100);
laserPath.poses.clear();
frameCount = 0;
}
// // reset after a new message comes
// void LaserOdometry::reset() {
// // LOAM must use previous f2f odom as a prior
// // para_q[0] = 0.0;
// // para_q[1] = 0.0;
// // para_q[2] = 0.0;
// // para_q[3] = 1.0;
// // para_t[0] = 0.0;
// // para_t[1] = 0.0;
// // para_t[2] = 0.0;
// // new (&q_last_curr) Eigen::Map<Eigen::Quaterniond>(para_q);
// // new (&t_last_curr) Eigen::Map<Eigen::Vector3d>(para_t);
// }
void LaserOdometry::input(const pcl::PointCloud<PointType>::Ptr& laserCloud_,
const pcl::PointCloud<PointType>::Ptr& cornerPointsSharp_,
const pcl::PointCloud<PointType>::Ptr& cornerPointsLessSharp_,
const pcl::PointCloud<PointType>::Ptr& surfPointsFlat_,
const pcl::PointCloud<PointType>::Ptr& surfPointsLessFlat_)
{
laserCloudFullRes = boost::make_shared<pcl::PointCloud<PointType>>(*laserCloud_);
cornerPointsSharp = boost::make_shared<pcl::PointCloud<PointType>>(*cornerPointsSharp_);
cornerPointsLessSharp = boost::make_shared<pcl::PointCloud<PointType>>(*cornerPointsLessSharp_);
surfPointsFlat = boost::make_shared<pcl::PointCloud<PointType>>(*surfPointsFlat_);
surfPointsLessFlat = boost::make_shared<pcl::PointCloud<PointType>>(*surfPointsLessFlat_);
}
// undistort lidar point
void LaserOdometry::TransformToStart(PointType const* const pi, PointType* const po)
{
// interpolation ratio
double s;
if (DISTORTION)
s = (pi->intensity - int(pi->intensity)) / SCAN_PERIOD;
else
s = 1.0;
// s = 1;
Eigen::Quaterniond q_point_last = Eigen::Quaterniond::Identity().slerp(s, q_last_curr);
Eigen::Vector3d t_point_last = s * t_last_curr;
Eigen::Vector3d point(pi->x, pi->y, pi->z);
Eigen::Vector3d un_point = q_point_last * point + t_point_last;
po->x = un_point.x();
po->y = un_point.y();
po->z = un_point.z();
po->intensity = pi->intensity;
}
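// A brief sketch of the undistortion math above (assuming constant velocity
// over one sweep, as LOAM does): a point captured at fractional sweep time s
// is mapped back to the sweep start by interpolating the frame-to-frame
// motion (q_last_curr, t_last_curr):
//   q_point_last = slerp(Identity, q_last_curr, s)
//   t_point_last = s * t_last_curr
//   p_start      = q_point_last * p + t_point_last
// With DISTORTION disabled, s = 1 and every point is simply moved by the
// full frame-to-frame transform.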
// transform a lidar point to the end of the current frame (i.e., the start of the next frame)
void LaserOdometry::TransformToEnd(PointType const* const pi, PointType* const po)
{
// undistort point first
pcl::PointXYZI un_point_tmp;
TransformToStart(pi, &un_point_tmp);
Eigen::Vector3d un_point(un_point_tmp.x, un_point_tmp.y, un_point_tmp.z);
Eigen::Vector3d point_end = q_last_curr.inverse() * (un_point - t_last_curr);
po->x = point_end.x();
po->y = point_end.y();
po->z = point_end.z();
// Remove distortion time info
po->intensity = int(pi->intensity);
}
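// TransformToEnd composes TransformToStart with the inverse of the full
// frame motion: first undistort p to the sweep start, then apply
//   p_end = q_last_curr^{-1} * (p_start - t_last_curr)
// so all points end up expressed at the sweep end, which coincides with the
// start of the next sweep.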
void LaserOdometry::solveLO()
{
// this->reset();
// ROS_INFO("LO: Finished reset");
TicToc t_whole;
// initializing
if (!systemInited)
{
systemInited = true;
if (verbose_level > 1)
{
ROS_INFO("Initialization finished \n");
}
}
else
{
int cornerPointsSharpNum = cornerPointsSharp->points.size();
int surfPointsFlatNum = surfPointsFlat->points.size();
TicToc t_opt;
for (size_t opti_counter = 0; opti_counter < 2; ++opti_counter)
{
corner_correspondence = 0;
plane_correspondence = 0;
// ceres::LossFunction *loss_function = NULL;
ceres::LossFunction* loss_function = new ceres::HuberLoss(0.1);
ceres::LocalParameterization* q_parameterization = new ceres::EigenQuaternionParameterization();
ceres::Problem::Options problem_options;
ceres::Problem problem(problem_options);
if (!detach_VO_LO)
{
para_q[0] = vloam_tf->velo_last_VOT_velo_curr.getRotation().x();
para_q[1] = vloam_tf->velo_last_VOT_velo_curr.getRotation().y();
para_q[2] = vloam_tf->velo_last_VOT_velo_curr.getRotation().z();
para_q[3] = vloam_tf->velo_last_VOT_velo_curr.getRotation().w();
para_t[0] = vloam_tf->velo_last_VOT_velo_curr.getOrigin().x();
para_t[1] = vloam_tf->velo_last_VOT_velo_curr.getOrigin().y();
para_t[2] = vloam_tf->velo_last_VOT_velo_curr.getOrigin().z();
// new (&q_last_curr) Eigen::Map<Eigen::Quaterniond>(para_q);
// new (&t_last_curr) Eigen::Map<Eigen::Vector3d>(para_t);
}
if (verbose_level > 1)
{
ROS_INFO("\nq_last_curr.x = %f, velo_last_T_velo_curr.q.x = %f", para_q[0],
vloam_tf->velo_last_VOT_velo_curr.getRotation().x());
ROS_INFO("q_last_curr.y = %f, velo_last_T_velo_curr.q.y = %f", para_q[1],
vloam_tf->velo_last_VOT_velo_curr.getRotation().y());
ROS_INFO("q_last_curr.z = %f, velo_last_T_velo_curr.q.z = %f", para_q[2],
vloam_tf->velo_last_VOT_velo_curr.getRotation().z());
ROS_INFO("q_last_curr.w = %f, velo_last_T_velo_curr.q.w = %f", para_q[3],
vloam_tf->velo_last_VOT_velo_curr.getRotation().w());
ROS_INFO("t_last_curr.x = %f, velo_last_T_velo_curr.t.x = %f", para_t[0],
vloam_tf->velo_last_VOT_velo_curr.getOrigin().x());
ROS_INFO("t_last_curr.y = %f, velo_last_T_velo_curr.t.y = %f", para_t[1],
vloam_tf->velo_last_VOT_velo_curr.getOrigin().y());
ROS_INFO("t_last_curr.z = %f, velo_last_T_velo_curr.t.z = %f", para_t[2],
vloam_tf->velo_last_VOT_velo_curr.getOrigin().z());
}
problem.AddParameterBlock(para_q, 4, q_parameterization);
problem.AddParameterBlock(para_t, 3);
pcl::PointXYZI pointSel;
std::vector<int> pointSearchInd;
std::vector<float> pointSearchSqDis;
TicToc t_data;
// find correspondence for corner features
for (int i = 0; i < cornerPointsSharpNum; ++i)
{
TransformToStart(&(cornerPointsSharp->points[i]), &pointSel);
kdtreeCornerLast->nearestKSearch(pointSel, 1, pointSearchInd, pointSearchSqDis);
int closestPointInd = -1, minPointInd2 = -1;
if (pointSearchSqDis[0] < DISTANCE_SQ_THRESHOLD)
{
closestPointInd = pointSearchInd[0];
int closestPointScanID = int(laserCloudCornerLast->points[closestPointInd].intensity);
double minPointSqDis2 = DISTANCE_SQ_THRESHOLD;
// search in the direction of increasing scan line
for (int j = closestPointInd + 1; j < (int)laserCloudCornerLast->points.size(); ++j)
{
// if in the same scan line, continue
if (int(laserCloudCornerLast->points[j].intensity) <= closestPointScanID)
continue;
// if not in nearby scans, end the loop
if (int(laserCloudCornerLast->points[j].intensity) > (closestPointScanID + NEARBY_SCAN))
break;
double pointSqDis =
(laserCloudCornerLast->points[j].x - pointSel.x) * (laserCloudCornerLast->points[j].x - pointSel.x) +
(laserCloudCornerLast->points[j].y - pointSel.y) * (laserCloudCornerLast->points[j].y - pointSel.y) +
(laserCloudCornerLast->points[j].z - pointSel.z) * (laserCloudCornerLast->points[j].z - pointSel.z);
if (pointSqDis < minPointSqDis2)
{
// find nearer point
minPointSqDis2 = pointSqDis;
minPointInd2 = j;
}
}
// search in the direction of decreasing scan line
for (int j = closestPointInd - 1; j >= 0; --j)
{
// if in the same scan line, continue
if (int(laserCloudCornerLast->points[j].intensity) >= closestPointScanID)
continue;
// if not in nearby scans, end the loop
if (int(laserCloudCornerLast->points[j].intensity) < (closestPointScanID - NEARBY_SCAN))
break;
double pointSqDis =
(laserCloudCornerLast->points[j].x - pointSel.x) * (laserCloudCornerLast->points[j].x - pointSel.x) +
(laserCloudCornerLast->points[j].y - pointSel.y) * (laserCloudCornerLast->points[j].y - pointSel.y) +
(laserCloudCornerLast->points[j].z - pointSel.z) * (laserCloudCornerLast->points[j].z - pointSel.z);
if (pointSqDis < minPointSqDis2)
{
// find nearer point
minPointSqDis2 = pointSqDis;
minPointInd2 = j;
}
}
}
if (minPointInd2 >= 0) // both closestPointInd and minPointInd2 are valid
{
Eigen::Vector3d curr_point(cornerPointsSharp->points[i].x, cornerPointsSharp->points[i].y,
cornerPointsSharp->points[i].z);
Eigen::Vector3d last_point_a(laserCloudCornerLast->points[closestPointInd].x,
laserCloudCornerLast->points[closestPointInd].y,
laserCloudCornerLast->points[closestPointInd].z);
Eigen::Vector3d last_point_b(laserCloudCornerLast->points[minPointInd2].x,
laserCloudCornerLast->points[minPointInd2].y,
laserCloudCornerLast->points[minPointInd2].z);
double s;
if (DISTORTION)
s = (cornerPointsSharp->points[i].intensity - int(cornerPointsSharp->points[i].intensity)) / SCAN_PERIOD;
else
s = 1.0;
ceres::CostFunction* cost_function = LidarEdgeFactor::Create(curr_point, last_point_a, last_point_b, s);
// if (corner_correspondence < 20) {
// std::cout << "Corner feature No." << corner_correspondence << ", curr_point: " << curr_point << ",
// last_point_a: " << last_point_a << ", last_point_b: " << last_point_b;
// } // TODO: remove this later (test only code)
problem.AddResidualBlock(cost_function, loss_function, para_q, para_t);
corner_correspondence++;
}
}
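// Each corner correspondence found above contributes a point-to-line
// residual: with the undistorted point p' = R(s) p + t(s), the cost is the
// distance from p' to the line through last_point_a and last_point_b,
//   r = |(p' - a) x (p' - b)| / |a - b|
// (implemented in LidarEdgeFactor); Ceres minimizes the sum of these
// residuals over para_q and para_t.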
// find correspondence for plane features
for (int i = 0; i < surfPointsFlatNum; ++i)
{
TransformToStart(&(surfPointsFlat->points[i]), &pointSel);
kdtreeSurfLast->nearestKSearch(pointSel, 1, pointSearchInd, pointSearchSqDis);
int closestPointInd = -1, minPointInd2 = -1, minPointInd3 = -1;
if (pointSearchSqDis[0] < DISTANCE_SQ_THRESHOLD)
{
closestPointInd = pointSearchInd[0];
// get closest point's scan ID
int closestPointScanID = int(laserCloudSurfLast->points[closestPointInd].intensity);
double minPointSqDis2 = DISTANCE_SQ_THRESHOLD, minPointSqDis3 = DISTANCE_SQ_THRESHOLD;
// search in the direction of increasing scan line
for (int j = closestPointInd + 1; j < (int)laserCloudSurfLast->points.size(); ++j)
{
// if not in nearby scans, end the loop
if (int(laserCloudSurfLast->points[j].intensity) > (closestPointScanID + NEARBY_SCAN))
break;
double pointSqDis =
(laserCloudSurfLast->points[j].x - pointSel.x) * (laserCloudSurfLast->points[j].x - pointSel.x) +
(laserCloudSurfLast->points[j].y - pointSel.y) * (laserCloudSurfLast->points[j].y - pointSel.y) +
(laserCloudSurfLast->points[j].z - pointSel.z) * (laserCloudSurfLast->points[j].z - pointSel.z);
// if in the same or lower scan line
if (int(laserCloudSurfLast->points[j].intensity) <= closestPointScanID && pointSqDis < minPointSqDis2)
{
minPointSqDis2 = pointSqDis;
minPointInd2 = j;
}
// if in the higher scan line
else if (int(laserCloudSurfLast->points[j].intensity) > closestPointScanID && pointSqDis < minPointSqDis3)
{
minPointSqDis3 = pointSqDis;
minPointInd3 = j;
}
}
// search in the direction of decreasing scan line
for (int j = closestPointInd - 1; j >= 0; --j)
{
// if not in nearby scans, end the loop
if (int(laserCloudSurfLast->points[j].intensity) < (closestPointScanID - NEARBY_SCAN))
break;
double pointSqDis =
(laserCloudSurfLast->points[j].x - pointSel.x) * (laserCloudSurfLast->points[j].x - pointSel.x) +
(laserCloudSurfLast->points[j].y - pointSel.y) * (laserCloudSurfLast->points[j].y - pointSel.y) +
(laserCloudSurfLast->points[j].z - pointSel.z) * (laserCloudSurfLast->points[j].z - pointSel.z);
// if in the same or higher scan line
if (int(laserCloudSurfLast->points[j].intensity) >= closestPointScanID && pointSqDis < minPointSqDis2)
{
minPointSqDis2 = pointSqDis;
minPointInd2 = j;
}
else if (int(laserCloudSurfLast->points[j].intensity) < closestPointScanID && pointSqDis < minPointSqDis3)
{
// find nearer point
minPointSqDis3 = pointSqDis;
minPointInd3 = j;
}
}
if (minPointInd2 >= 0 && minPointInd3 >= 0)
{
Eigen::Vector3d curr_point(surfPointsFlat->points[i].x, surfPointsFlat->points[i].y,
surfPointsFlat->points[i].z);
Eigen::Vector3d last_point_a(laserCloudSurfLast->points[closestPointInd].x,
laserCloudSurfLast->points[closestPointInd].y,
laserCloudSurfLast->points[closestPointInd].z);
Eigen::Vector3d last_point_b(laserCloudSurfLast->points[minPointInd2].x,
laserCloudSurfLast->points[minPointInd2].y,
laserCloudSurfLast->points[minPointInd2].z);
Eigen::Vector3d last_point_c(laserCloudSurfLast->points[minPointInd3].x,
laserCloudSurfLast->points[minPointInd3].y,
laserCloudSurfLast->points[minPointInd3].z);
double s;
if (DISTORTION)
s = (surfPointsFlat->points[i].intensity - int(surfPointsFlat->points[i].intensity)) / SCAN_PERIOD;
else
s = 1.0;
ceres::CostFunction* cost_function =
LidarPlaneFactor::Create(curr_point, last_point_a, last_point_b, last_point_c, s);
problem.AddResidualBlock(cost_function, loss_function, para_q, para_t);
plane_correspondence++;
}
}
}
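// Each planar correspondence contributes a point-to-plane residual: with the
// (normalized) plane normal n = (b - a) x (c - a), the cost is the signed
// distance of the undistorted point to the plane through the three selected
// surface points,
//   r = (p' - a) . n
// (implemented in LidarPlaneFactor).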
if (verbose_level > 1)
{
ROS_INFO("corner_correspondence %d, plane_correspondence %d \n", corner_correspondence, plane_correspondence);
ROS_INFO("data association time %f ms \n", t_data.toc());
}
if ((corner_correspondence + plane_correspondence) < 10)
{
ROS_INFO("too few correspondences! *************************************************\n");
}
TicToc t_solver;
ceres::Solver::Options options;
options.linear_solver_type = ceres::DENSE_QR;
options.max_num_iterations = 4;
options.minimizer_progress_to_stdout = false;
ceres::Solver::Summary summary;
ceres::Solve(options, &problem, &summary);
// ROS_INFO("full report: \n%s", summary.FullReport().c_str());
// ROS_INFO("raw: q_last_curr: w = %f x = %f, y = %f, z = %f", para_q[3], para_q[0], para_q[1], para_q[2]);
if (verbose_level > 1)
{
ROS_INFO("solver time %f ms \n", t_solver.toc());
}
}
if (verbose_level > 1)
{
ROS_INFO("two optimization passes take %f ms \n", t_opt.toc());
}
t_w_curr = t_w_curr + q_w_curr * t_last_curr;
q_w_curr = q_w_curr * q_last_curr;
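// The two lines above accumulate the frame-to-frame estimate into the
// odometry pose, i.e. world_T_curr = world_T_last * last_T_curr:
//   t_w_curr <- t_w_curr + q_w_curr * t_last_curr
//   q_w_curr <- q_w_curr * q_last_curr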
}
// ROS_INFO("raw: q_last_curr: w = %f x = %f, y = %f, z = %f", para_q[3], para_q[0], para_q[1], para_q[2]);
// ROS_INFO("q_last_curr: w = %f x = %f, y = %f, z = %f", q_last_curr.w(), q_last_curr.x(), q_last_curr.y(),
// q_last_curr.z()); ROS_INFO("q_w_curr: w = %f x = %f, y = %f, z = %f", q_w_curr.w(), q_w_curr.x(), q_w_curr.y(),
// q_w_curr.z());
// ROS_INFO("t_last_curr: x = %f, y = %f, z = %f", t_last_curr.x(), t_last_curr.y(), t_last_curr.z());
// ROS_INFO("t_w_curr: x = %f, y = %f, z = %f", t_w_curr.x(), t_w_curr.y(), t_w_curr.z());
// transform corner features and plane features to the scan end point
// (disabled; kept for reference)
if (0)
{
int cornerPointsLessSharpNum = cornerPointsLessSharp->points.size();
for (int i = 0; i < cornerPointsLessSharpNum; i++)
{
TransformToEnd(&cornerPointsLessSharp->points[i], &cornerPointsLessSharp->points[i]);
}
int surfPointsLessFlatNum = surfPointsLessFlat->points.size();
for (int i = 0; i < surfPointsLessFlatNum; i++)
{
TransformToEnd(&surfPointsLessFlat->points[i], &surfPointsLessFlat->points[i]);
}
int laserCloudFullResNum = laserCloudFullRes->points.size();
for (int i = 0; i < laserCloudFullResNum; i++)
{
TransformToEnd(&laserCloudFullRes->points[i], &laserCloudFullRes->points[i]);
}
}
pcl::PointCloud<PointType>::Ptr laserCloudTemp = cornerPointsLessSharp;
cornerPointsLessSharp = laserCloudCornerLast;
laserCloudCornerLast = laserCloudTemp;
laserCloudTemp = surfPointsLessFlat;
surfPointsLessFlat = laserCloudSurfLast;
laserCloudSurfLast = laserCloudTemp;
laserCloudCornerLastNum = laserCloudCornerLast->points.size();
laserCloudSurfLastNum = laserCloudSurfLast->points.size();
// std::cout << "the size of corner last is " << laserCloudCornerLastNum << ", and the size of surf last is " <<
// laserCloudSurfLastNum << '\n';
kdtreeCornerLast->setInputCloud(laserCloudCornerLast);
kdtreeSurfLast->setInputCloud(laserCloudSurfLast);
if (verbose_level > 1)
{
ROS_INFO("whole laserOdometry time %f ms \n \n", t_whole.toc());
if (t_whole.toc() > 100)
ROS_WARN("odometry process over 100ms");
}
frameCount++;
}
void LaserOdometry::publish()
{
TicToc t_pub;
// publish odometry
nav_msgs::Odometry laserOdometry;
laserOdometry.header.frame_id = "map";
laserOdometry.child_frame_id = "laser_odom";
laserOdometry.header.stamp = ros::Time::now();
laserOdometry.pose.pose.orientation.x = q_w_curr.x();
laserOdometry.pose.pose.orientation.y = q_w_curr.y();
laserOdometry.pose.pose.orientation.z = q_w_curr.z();
laserOdometry.pose.pose.orientation.w = q_w_curr.w();
laserOdometry.pose.pose.position.x = t_w_curr.x();
laserOdometry.pose.pose.position.y = t_w_curr.y();
laserOdometry.pose.pose.position.z = t_w_curr.z();
pubLaserOdometry.publish(laserOdometry);
ROS_INFO("lidar_odometry: q = %.4f, %.4f, %.4f, %.4f; t = %.4f, %.4f, %.4f", q_last_curr.x(), q_last_curr.y(),
q_last_curr.z(), q_last_curr.w(), t_last_curr.x(), t_last_curr.y(), t_last_curr.z());
// vloam_tf->cam0_init_eigen_LOT_cam0_last.translation() = t_w_curr;
// vloam_tf->cam0_init_eigen_LOT_cam0_last.linear() = q_w_curr.normalized().toRotationMatrix();
// frame2frame estimation recording
vloam_tf->base_prev_LOT_base_curr.setOrigin(tf2::Vector3(t_last_curr.x(), t_last_curr.y(), t_last_curr.z()));
vloam_tf->base_prev_LOT_base_curr.setRotation(
tf2::Quaternion(q_last_curr.x(), q_last_curr.y(), q_last_curr.z(), q_last_curr.w()));
vloam_tf->cam0_curr_LOT_cam0_prev =
vloam_tf->base_T_cam0.inverse() * vloam_tf->base_prev_LOT_base_curr.inverse() * vloam_tf->base_T_cam0;
// world2frame estimation
vloam_tf->world_LOT_base_last.setOrigin(tf2::Vector3(t_w_curr.x(), t_w_curr.y(), t_w_curr.z()));
vloam_tf->world_LOT_base_last.setRotation(tf2::Quaternion(q_w_curr.x(), q_w_curr.y(), q_w_curr.z(), q_w_curr.w()));
geometry_msgs::PoseStamped laserPose;
laserPose.header = laserOdometry.header;
laserPose.pose = laserOdometry.pose.pose;
laserPath.header.stamp = laserOdometry.header.stamp;
laserPath.poses.push_back(laserPose);
laserPath.header.frame_id = "map";
pubLaserPath.publish(laserPath);
// if (frameCount % mapping_skip_frame == 0) // no need to publish
// {
// frameCount = 0;
// sensor_msgs::PointCloud2 laserCloudCornerLast2;
// pcl::toROSMsg(*laserCloudCornerLast, laserCloudCornerLast2);
// laserCloudCornerLast2.header.stamp = ros::Time::now();
// laserCloudCornerLast2.header.frame_id = "velo";
// pubLaserCloudCornerLast.publish(laserCloudCornerLast2);
// sensor_msgs::PointCloud2 laserCloudSurfLast2;
// pcl::toROSMsg(*laserCloudSurfLast, laserCloudSurfLast2);
// laserCloudSurfLast2.header.stamp = ros::Time::now();
// laserCloudSurfLast2.header.frame_id = "velo";
// pubLaserCloudSurfLast.publish(laserCloudSurfLast2);
// sensor_msgs::PointCloud2 laserCloudFullRes3;
// pcl::toROSMsg(*laserCloudFullRes, laserCloudFullRes3);
// laserCloudFullRes3.header.stamp = ros::Time::now();
// laserCloudFullRes3.header.frame_id = "velo";
// pubLaserCloudFullRes.publish(laserCloudFullRes3);
// }
if (verbose_level > 1)
{
ROS_INFO("publication time %f ms \n", t_pub.toc());
}
}
void LaserOdometry::output(Eigen::Quaterniond& q_w_curr_, Eigen::Vector3d& t_w_curr_,
pcl::PointCloud<PointType>::Ptr& laserCloudCornerLast_,
pcl::PointCloud<PointType>::Ptr& laserCloudSurfLast_,
pcl::PointCloud<PointType>::Ptr& laserCloudFullRes_, bool& skip_frame)
{
q_w_curr_ = q_w_curr;
t_w_curr_ = t_w_curr;
if (frameCount % mapping_skip_frame == 0)
{
laserCloudCornerLast_ = laserCloudCornerLast;
laserCloudSurfLast_ = laserCloudSurfLast;
laserCloudFullRes_ = laserCloudFullRes;
skip_frame = false;
} // TODO: check the meaning of skip frame.
else
{
skip_frame = true;
}
}
} // namespace vloam
================================================
FILE: src/lidar_odometry_mapping/src/lidar_odometry_mapping.cpp
================================================
// This is an advanced implementation of the algorithm described in the following paper:
// J. Zhang and S. Singh. LOAM: Lidar Odometry and Mapping in Real-time.
// Robotics: Science and Systems Conference (RSS). Berkeley, CA, July 2014.
// Modifier: Tong Qin qintonguav@gmail.com
// Shaozu Cao saozu.cao@connect.ust.hk
// Copyright 2013, Ji Zhang, Carnegie Mellon University
// Further contributions copyright (c) 2016, Southwest Research Institute
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
//
// 1. Redistributions of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
// 2. Redistributions in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
// 3. Neither the name of the copyright holder nor the names of its
// contributors may be used to endorse or promote products derived from this
// software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
// ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
// LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
// POSSIBILITY OF SUCH DAMAGE.
#include <lidar_odometry_mapping/lidar_odometry_mapping.h>
namespace vloam
{
void LidarOdometryMapping::init(std::shared_ptr<VloamTF>& vloam_tf_)
{
vloam_tf = vloam_tf_;
if (!ros::param::get("loam_verbose_level", verbose_level))
ROS_BREAK();
scan_registration.init();
laser_odometry.init(vloam_tf);
laser_mapping.init(vloam_tf);
// subLaserCloud = nh->subscribe<sensor_msgs::PointCloud2>("/velodyne_points", 100,
// &LidarOdometryMapping::laserCloudHandler, this);
laserCloud = boost::make_shared<pcl::PointCloud<PointType>>();
cornerPointsSharp = boost::make_shared<pcl::PointCloud<PointType>>();
cornerPointsLessSharp = boost::make_shared<pcl::PointCloud<PointType>>();
surfPointsFlat = boost::make_shared<pcl::PointCloud<PointType>>();
surfPointsLessFlat = boost::make_shared<pcl::PointCloud<PointType>>();
laserCloudCornerLast = boost::make_shared<pcl::PointCloud<PointType>>();
laserCloudSurfLast = boost::make_shared<pcl::PointCloud<PointType>>();
laserCloudFullRes = boost::make_shared<pcl::PointCloud<PointType>>();
}
void LidarOdometryMapping::reset()
{
// prepare: laserCloud, cornerPointsSharp, cornerPointsLessSharp, surfPointsFlat, surfPointsLessFlat
scan_registration.reset();
// laser_odometry.reset();
laser_mapping.reset();
}
void LidarOdometryMapping::scanRegistrationIO(const pcl::PointCloud<pcl::PointXYZ>& laserCloudIn)
{
loam_timer.tic();
frame_time = 0.0;
scan_registration.input(laserCloudIn);
// scan_registration.publish(); // no need to publish the point cloud feature
scan_registration.output(laserCloud, // 10Hz
cornerPointsSharp, // 10Hz
cornerPointsLessSharp, // 10Hz
surfPointsFlat, // 10Hz
surfPointsLessFlat // 10Hz
);
if (verbose_level > 0)
{
ROS_INFO("Scan Registration takes %f ms \n", loam_timer.toc());
}
frame_time += loam_timer.toc();
}
void LidarOdometryMapping::laserOdometryIO()
{
loam_timer.tic();
laser_odometry.input(laserCloud, // 10Hz
cornerPointsSharp, // 10Hz
cornerPointsLessSharp, // 10Hz
surfPointsFlat, // 10Hz
surfPointsLessFlat // 10Hz
);
laser_odometry.solveLO(); // 10Hz
laser_odometry.publish();
laser_odometry.output(q_wodom_curr, // 10Hz
t_wodom_curr, // 10Hz
laserCloudCornerLast, // 2Hz // no change if skip_frame
laserCloudSurfLast, // 2Hz // no change if skip_frame
laserCloudFullRes, // 2Hz // no change if skip_frame
skip_frame);
if (verbose_level > 0)
{
ROS_INFO("Laser Odometry takes %f ms \n", loam_timer.toc());
}
frame_time += loam_timer.toc();
}
void LidarOdometryMapping::laserMappingIO()
{
loam_timer.tic();
laser_mapping.input(laserCloudCornerLast, // 2Hz
laserCloudSurfLast, // 2Hz
laserCloudFullRes, // 2Hz
q_wodom_curr, // 10Hz
t_wodom_curr, // 10Hz
skip_frame);
if (!skip_frame)
laser_mapping.solveMapping(); // 2Hz
laser_mapping.publish();
// laser_odometry.output(
// );
if (verbose_level > 0)
{
ROS_INFO("Laser Mapping takes %f ms \n", loam_timer.toc());
}
frame_time += loam_timer.toc();
if (frame_time > 100)
{
ROS_WARN("LOAM takes %f ms (>100 ms)", frame_time);
}
}
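// Per-frame pipeline (driven from vloam_main): scanRegistrationIO extracts
// features at 10 Hz, laserOdometryIO solves frame-to-frame LO at 10 Hz, and
// laserMappingIO refines against the map at ~2 Hz (every mapping_skip_frame
// frames); frame_time accumulates all three stages against the 100 ms
// real-time budget checked above.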
// void LidarOdometryMapping::solveLOAM() {
// laser_odometry.solveLO();
// laser_mapping.solveMapping();
// // TODO: check the running time constraint: 0.1s
// }
} // namespace vloam
================================================
FILE: src/lidar_odometry_mapping/src/scan_registration.cpp
================================================
// This is an advanced implementation of the algorithm described in the following paper:
// J. Zhang and S. Singh. LOAM: Lidar Odometry and Mapping in Real-time.
// Robotics: Science and Systems Conference (RSS). Berkeley, CA, July 2014.
// Modifier: Tong Qin qintonguav@gmail.com
// Shaozu Cao saozu.cao@connect.ust.hk
// Copyright 2013, Ji Zhang, Carnegie Mellon University
// Further contributions copyright (c) 2016, Southwest Research Institute
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
//
// 1. Redistributions of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
// 2. Redistributions in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
// 3. Neither the name of the copyright holder nor the names of its
// contributors may be used to endorse or promote products derived from this
// software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
// ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
// LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
// POSSIBILITY OF SUCH DAMAGE.
#include <lidar_odometry_mapping/scan_registration.h>
namespace vloam
{
void ScanRegistration::init()
{
ros::param::get("loam_verbose_level", verbose_level);
systemInitCount = 0;
systemInited = false;
N_SCANS = 0;
if (!ros::param::get("scan_line", N_SCANS))
ROS_BREAK();
if (!ros::param::get("minimum_range", MINIMUM_RANGE))
ROS_BREAK();
ROS_INFO("scan line number %d \n", N_SCANS);
if (N_SCANS != 16 && N_SCANS != 32 && N_SCANS != 64)
{
ROS_ERROR("only support velodyne with 16, 32 or 64 scan line!");
}
// // NOTE: publishers and subscribers won't necessarily be used, but are kept for forward compatibility
// subLaserCloud = nh->subscribe<sensor_msgs::PointCloud2>("/velodyne_points", 100,
// &ScanRegistration::laserCloudHandler, this);
pubLaserCloud = nh.advertise<sensor_msgs::PointCloud2>("/velodyne_cloud_2", 100);
pubCornerPointsSharp = nh.advertise<sensor_msgs::PointCloud2>("/laser_cloud_sharp", 100);
pubCornerPointsLessSharp = nh.advertise<sensor_msgs::PointCloud2>("/laser_cloud_less_sharp", 100);
pubSurfPointsFlat = nh.advertise<sensor_msgs::PointCloud2>("/laser_cloud_flat", 100);
pubSurfPointsLessFlat = nh.advertise<sensor_msgs::PointCloud2>("/laser_cloud_less_flat", 100);
// pubRemovePoints = nh->advertise<sensor_msgs::PointCloud2>("/laser_remove_points", 100);
PUB_EACH_LINE = false;
// // std::vector<ros::Publisher> pubEachScan;
// // if(PUB_EACH_LINE)
// // {
// // for(int i = 0; i < N_SCANS; i++)
// // {
// // ros::Publisher tmp = nh.advertise<sensor_msgs::PointCloud2>("/laser_scanid_" + std::to_string(i), 100);
// // pubEachScan.push_back(tmp);
// // }
// // }
laserCloud = boost::make_shared<pcl::PointCloud<PointType>>();
cornerPointsSharp = boost::make_shared<pcl::PointCloud<PointType>>();
cornerPointsLessSharp = boost::make_shared<pcl::PointCloud<PointType>>();
surfPointsFlat = boost::make_shared<pcl::PointCloud<PointType>>();
surfPointsLessFlat = boost::make_shared<pcl::PointCloud<PointType>>();
}
void ScanRegistration::reset()
{
laserCloud->clear();
cornerPointsSharp->clear();
cornerPointsLessSharp->clear();
surfPointsFlat->clear();
surfPointsLessFlat->clear();
laserCloudScans = std::vector<pcl::PointCloud<PointType>>(N_SCANS); // N_SCANS = 64 for kitti
}
template <typename PointT>
void ScanRegistration::removeClosedPointCloud(const pcl::PointCloud<PointT>& cloud_in,
pcl::PointCloud<PointT>& cloud_out, float thres)
{
if (&cloud_in != &cloud_out)
{
cloud_out.header = cloud_in.header;
cloud_out.points.resize(cloud_in.points.size());
}
size_t j = 0;
for (size_t i = 0; i < cloud_in.points.size(); ++i)
{
if (cloud_in.points[i].x * cloud_in.points[i].x + cloud_in.points[i].y * cloud_in.points[i].y +
cloud_in.points[i].z * cloud_in.points[i].z <
thres * thres)
continue;
cloud_out.points[j] = cloud_in.points[i];
j++;
}
if (j != cloud_in.points.size())
{
cloud_out.points.resize(j);
}
cloud_out.height = 1;
cloud_out.width = static_cast<uint32_t>(j);
cloud_out.is_dense = true;
}
void ScanRegistration::input(const pcl::PointCloud<pcl::PointXYZ>& laserCloudIn_)
{
if (!systemInited)
{
systemInitCount++;
if (systemInitCount >= systemDelay)
{
systemInited = true;
}
else
return;
}
if (verbose_level > 1)
{
ROS_INFO("scan registration starts");
t_whole.tic();
}
std::vector<int> scanStartInd(N_SCANS, 0);
std::vector<int> scanEndInd(N_SCANS, 0);
pcl::PointCloud<pcl::PointXYZ> laserCloudIn = laserCloudIn_;
// pcl::fromROSMsg(laserCloudMsg, laserCloudIn); // TODO: see if the input of the function should have another type
// from vloam_main
std::vector<int> indices;
pcl::removeNaNFromPointCloud(laserCloudIn, laserCloudIn, indices);
removeClosedPointCloud(laserCloudIn, laserCloudIn, MINIMUM_RANGE);
if (verbose_level > 1)
{
ROS_INFO("end of closed point removal");
}
int cloudSize = laserCloudIn.points.size();
float startOri = -atan2(laserCloudIn.points[0].y, laserCloudIn.points[0].x);
float endOri = -atan2(laserCloudIn.points[cloudSize - 1].y, laserCloudIn.points[cloudSize - 1].x) + 2 * M_PI;
if (endOri - startOri > 3 * M_PI)
{
endOri -= 2 * M_PI;
}
else if (endOri - startOri < M_PI)
{
endOri += 2 * M_PI;
}
if (verbose_level > 1)
{
ROS_INFO("end Ori %f\n", endOri);
}
bool halfPassed = false;
int count = cloudSize;
PointType point;
for (int i = 0; i < cloudSize; i++)
{
point.x = laserCloudIn.points[i].x;
point.y = laserCloudIn.points[i].y;
point.z = laserCloudIn.points[i].z;
float angle = atan(point.z / sqrt(point.x * point.x + point.y * point.y)) * 180 / M_PI;
int scanID = 0;
if (N_SCANS == 16)
{
scanID = int((angle + 15) / 2 + 0.5);
if (scanID > (N_SCANS - 1) || scanID < 0)
{
count--;
continue;
}
}
else if (N_SCANS == 32)
{
scanID = int((angle + 92.0 / 3.0) * 3.0 / 4.0);
if (scanID > (N_SCANS - 1) || scanID < 0)
{
count--;
continue;
}
}
else if (N_SCANS == 64)
{
if (angle >= -8.83)
scanID = int((2 - angle) * 3.0 + 0.5);
else
scanID = N_SCANS / 2 + int((-8.83 - angle) * 2.0 + 0.5);
// use scan IDs in [0, 50]; points with scanID > 50 are treated as outliers and removed
if (angle > 2 || angle < -24.33 || scanID > 50 || scanID < 0)
{
count--;
continue;
}
}
else
{
ROS_ERROR("wrong scan number\n");
ROS_BREAK();
}
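// The branches above map the point's elevation angle to a Velodyne ring
// index. E.g. for the HDL-64E the upper block (angle >= -8.83 deg) has rings
// spaced at roughly 1/3 deg and the lower block at roughly 1/2 deg, hence
// the 3.0 and 2.0 factors (approximate; exact ring angles vary with each
// unit's calibration).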
// ROS_INFO("angle %f scanID %d \n", angle, scanID);
float ori = -atan2(point.y, point.x);
if (!halfPassed)
{
if (ori < startOri - M_PI / 2)
{
ori += 2 * M_PI;
}
else if (ori > startOri + M_PI * 3 / 2)
{
ori -= 2 * M_PI;
}
if (ori - startOri > M_PI)
{
halfPassed = true;
}
}
else
{
ori += 2 * M_PI;
if (ori < endOri - M_PI * 3 / 2)
{
ori += 2 * M_PI;
}
else if (ori > endOri + M_PI / 2)
{
ori -= 2 * M_PI;
}
}
float relTime = (ori - startOri) / (endOri - startOri);
point.intensity = scanID + scanPeriod * relTime;
laserCloudScans[scanID].push_back(point);
}
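// Note the intensity-field encoding used throughout the pipeline:
//   intensity = scanID + scanPeriod * relTime
// so int(intensity) recovers the ring index and the fractional part the
// within-sweep timestamp (this is what TransformToStart later decodes for
// undistortion).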
cloudSize = count;
if (verbose_level > 1)
{
ROS_INFO("points size %d \n", cloudSize);
}
for (int i = 0; i < N_SCANS; i++)
{
scanStartInd[i] = laserCloud->size() + 5;
*laserCloud += laserCloudScans[i];
scanEndInd[i] = laserCloud->size() - 6;
}
if (verbose_level > 1)
{
ROS_INFO("prepare time %f \n", t_prepare.toc());
}
for (int i = 5; i < cloudSize - 5; i++)
{
float diffX = laserCloud->points[i - 5].x + laserCloud->points[i - 4].x + laserCloud->points[i - 3].x +
laserCloud->points[i - 2].x + laserCloud->points[i - 1].x - 10 * laserCloud->points[i].x +
laserCloud->points[i + 1].x + laserCloud->points[i + 2].x + laserCloud->points[i + 3].x +
laserCloud->points[i + 4].x + laserCloud->points[i + 5].x;
float diffY = laserCloud->points[i - 5].y + laserCloud->points[i - 4].y + laserCloud->points[i - 3].y +
laserCloud->points[i - 2].y + laserCloud->points[i - 1].y - 10 * laserCloud->points[i].y +
laserCloud->points[i + 1].y + laserCloud->points[i + 2].y + laserCloud->points[i + 3].y +
laserCloud->points[i + 4].y + laserCloud->points[i + 5].y;
float diffZ = laserCloud->points[i - 5].z + laserCloud->points[i - 4].z + laserCloud->points[i - 3].z +
laserCloud->points[i - 2].z + laserCloud->points[i - 1].z - 10 * laserCloud->points[i].z +
laserCloud->points[i + 1].z + laserCloud->points[i + 2].z + laserCloud->points[i + 3].z +
laserCloud->points[i + 4].z + laserCloud->points[i + 5].z;
cloudCurvature[i] = diffX * diffX + diffY * diffY + diffZ * diffZ;
cloudSortInd[i] = i;
cloudNeighborPicked[i] = 0;
cloudLabel[i] = 0;
}
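// The loop above evaluates the LOAM smoothness (curvature) term per point:
//   c_i = || sum over 5 neighbors on each side of (p_j - p_i) ||^2
// (the -10 * p_i term realizes the sum of differences). Large c_i marks
// edge (sharp) candidates; small c_i marks planar (flat) candidates.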
TicToc t_pts;
float t_q_sort = 0;
for (int i = 0; i < N_SCANS; i++)
{
if (scanEndInd[i] - scanStartInd[i] < 6)
continue;
pcl::PointCloud<PointType>::Ptr surfPointsLessFlatScan(new pcl::PointCloud<PointType>);
for (int j = 0; j < 6; j++)
{
int sp = scanStartInd[i] + (scanEndInd[i] - scanStartInd[i]) * j / 6;
int ep = scanStartInd[i] + (scanEndInd[i] - scanStartInd[i]) * (j + 1) / 6 - 1;
TicToc t_tmp;
std::sort(cloudSortInd + sp, cloudSortInd + ep + 1,
[&](const int& i, const int& j) { return cloudCurvature[i] < cloudCurvature[j]; });
t_q_sort += t_tmp.toc();
int largestPickedNum = 0;
for (int k = ep; k >= sp; k--)
{
int ind = cloudSortInd[k];
if (cloudNeighborPicked[ind] == 0 && cloudCurvature[ind] > 0.1)
{
largestPickedNum++;
if (largestPickedNum <= 2)
{
cloudLabel[ind] = 2;
cornerPointsSharp->push_back(laserCloud->points[ind]);
cornerPointsLessSharp->push_back(laserCloud->points[ind]);
}
else if (largestPickedNum <= 20)
{
cloudLabel[ind] = 1;
cornerPointsLessSharp->push_back(laserCloud->points[ind]);
}
else
{
break;
}
cloudNeighborPicked[ind] = 1;
for (int l = 1; l <= 5; l++)
{
float diffX = laserCloud->points[ind + l].x - laserCloud->points[ind + l - 1].x;
float diffY = laserCloud->points[ind + l].y - laserCloud->points[ind + l - 1].y;
float diffZ = laserCloud->points[ind + l].z - laserCloud->points[ind + l - 1].z;
if (diffX * diffX + diffY * diffY + diffZ * diffZ > 0.05)
{
break;
}
cloudNeighborPicked[ind + l] = 1;
}
for (int l = -1; l >= -5; l--)
{
float diffX = laserCloud->points[ind + l].x - laserCloud->points[ind + l + 1].x;
float diffY = laserCloud->points[ind + l].y - laserCloud->points[ind + l + 1].y;
float diffZ = laserCloud->points[ind + l].z - laserCloud->points[ind + l + 1].z;
if (diffX * diffX + diffY * diffY + diffZ * diffZ > 0.05)
{
break;
}
cloudNeighborPicked[ind + l] = 1;
}
}
}
int smallestPickedNum = 0;
for (int k = sp; k <= ep; k++)
{
int ind = cloudSortInd[k];
if (cloudNeighborPicked[ind] == 0 && cloudCurvature[ind] < 0.1)
{
cloudLabel[ind] = -1;
surfPointsFlat->push_back(laserCloud->points[ind]);
smallestPickedNum++;
if (smallestPickedNum >= 4)
{
break;
}
cloudNeighborPicked[ind] = 1;
for (int l = 1; l <= 5; l++)
{
float diffX = laserCloud->points[ind + l].x - laserCloud->points[ind + l - 1].x;
float diffY = laserCloud->points[ind + l].y - laserCloud->points[ind + l - 1].y;
float diffZ = laserCloud->points[ind + l].z - laserCloud->points[ind + l - 1].z;
if (diffX * diffX + diffY * diffY + diffZ * diffZ > 0.05)
{
break;
}
cloudNeighborPicked[ind + l] = 1;
}
for (int l = -1; l >= -5; l--)
{
float diffX = laserCloud->points[ind + l].x - laserCloud->points[ind + l + 1].x;
float diffY = laserCloud->points[ind + l].y - laserCloud->points[ind + l + 1].y;
float diffZ = laserCloud->points[ind + l].z - laserCloud->points[ind + l + 1].z;
if (diffX * diffX + diffY * diffY + diffZ * diffZ > 0.05)
{
break;
}
cloudNeighborPicked[ind + l] = 1;
}
}
}
for (int k = sp; k <= ep; k++)
{
if (cloudLabel[k] <= 0)
{
surfPointsLessFlatScan->push_back(laserCloud->points[k]);
}
}
}
pcl::PointCloud<PointType> surfPointsLessFlatScanDS;
pcl::VoxelGrid<PointType> downSizeFilter;
downSizeFilter.setInputCloud(surfPointsLessFlatScan);
downSizeFilter.setLeafSize(0.2, 0.2, 0.2);
downSizeFilter.filter(surfPointsLessFlatScanDS);
*surfPointsLessFlat += surfPointsLessFlatScanDS;
}
if (verbose_level > 1)
{
ROS_INFO("sort q time %f \n", t_q_sort);
ROS_INFO("separate points time %f \n", t_pts.toc());
ROS_INFO("scan registration time %f ms *************\n", t_whole.toc());
}
}
void ScanRegistration::publish()
{
sensor_msgs::PointCloud2 laserCloudOutMsg;
pcl::toROSMsg(*laserCloud, laserCloudOutMsg);
laserCloudOutMsg.header.stamp = ros::Time::now();
laserCloudOutMsg.header.frame_id = "velo";
pubLaserCloud.publish(laserCloudOutMsg);
sensor_msgs::PointCloud2 cornerPointsSharpMsg;
pcl::toROSMsg(*cornerPointsSharp, cornerPointsSharpMsg);
cornerPointsSharpMsg.header.stamp = ros::Time::now();
cornerPointsSharpMsg.header.frame_id = "velo";
pubCornerPointsSharp.publish(cornerPointsSharpMsg);
sensor_msgs::PointCloud2 cornerPointsLessSharpMsg;
pcl::toROSMsg(*cornerPointsLessSharp, cornerPointsLessSharpMsg);
cornerPointsLessSharpMsg.header.stamp = ros::Time::now();
cornerPointsLessSharpMsg.header.frame_id = "velo";
pubCornerPointsLessSharp.publish(cornerPointsLessSharpMsg);
sensor_msgs::PointCloud2 surfPointsFlat2;
pcl::toROSMsg(*surfPointsFlat, surfPointsFlat2);
surfPointsFlat2.header.stamp = ros::Time::now();
surfPointsFlat2.header.frame_id = "velo";
pubSurfPointsFlat.publish(surfPointsFlat2);
sensor_msgs::PointCloud2 surfPointsLessFlat2;
pcl::toROSMsg(*surfPointsLessFlat, surfPointsLessFlat2);
surfPointsLessFlat2.header.stamp = ros::Time::now();
surfPointsLessFlat2.header.frame_id = "velo";
pubSurfPointsLessFlat.publish(surfPointsLessFlat2);
// publish each scan
if (PUB_EACH_LINE)
{
for (int i = 0; i < N_SCANS; i++)
{
sensor_msgs::PointCloud2 scanMsg;
pcl::toROSMsg(laserCloudScans[i], scanMsg);
scanMsg.header.stamp = ros::Time::now();
scanMsg.header.frame_id = "map";
pubEachScan[i].publish(scanMsg);
}
}
// ROS_INFO("scan registration time %f ms *************\n", t_whole.toc());
// if(t_whole.toc() > 100)
// ROS_WARN("scan registration process over 100ms");
}
void ScanRegistration::output(pcl::PointCloud<PointType>::Ptr& laserCloud_,
pcl::PointCloud<PointType>::Ptr& cornerPointsSharp_,
pcl::PointCloud<PointType>::Ptr& cornerPointsLessSharp_,
pcl::PointCloud<PointType>::Ptr& surfPointsFlat_,
pcl::PointCloud<PointType>::Ptr& surfPointsLessFlat_)
{
laserCloud_ = this->laserCloud;
cornerPointsSharp_ = this->cornerPointsSharp;
cornerPointsLessSharp_ = this->cornerPointsLessSharp;
surfPointsFlat_ = this->surfPointsFlat;
surfPointsLessFlat_ = this->surfPointsLessFlat;
}
} // namespace vloam
================================================
FILE: src/visual_odometry/CMakeLists.txt
================================================
cmake_minimum_required(VERSION 3.0.2)
project(visual_odometry)
## Compile as C++14 (required by Ceres 2.0)
add_compile_options(-std=c++14 -O3)
# TODO: check if -O3 is needed -> Yes: preprocessing 2 frames took 1000 ms without -O3, but 63 ms with it
## Find catkin macros and libraries
## if COMPONENTS list like find_package(catkin REQUIRED COMPONENTS xyz)
## is used, also find other catkin packages
find_package(OpenCV 4.5.1)
find_package(PCL 1.2 REQUIRED)
include_directories(${PCL_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS})
# find_package(Eigen3 3.3 REQUIRED NO_MODULE)
find_package(Ceres REQUIRED PATHS "/usr/local/lib/cmake/Ceres/")
find_package(catkin REQUIRED COMPONENTS
cv_bridge
roscpp
sensor_msgs
std_msgs
roslib
tf2
vloam_tf
lidar_odometry_mapping
)
## System dependencies are found with CMake's conventions
# find_package(Boost REQUIRED COMPONENTS system)
## Uncomment this if the package has a setup.py. This macro ensures
## modules and global scripts declared therein get installed
## See http://ros.org/doc/api/catkin/html/user_guide/setup_dot_py.html
# catkin_python_setup()
################################################
## Declare ROS messages, services and actions ##
################################################
## To declare and build messages, services or actions from within this
## package, follow these steps:
## * Let MSG_DEP_SET be the set of packages whose message types you use in
## your messages/services/actions (e.g. std_msgs, actionlib_msgs, ...).
## * In the file package.xml:
## * add a build_depend tag for "message_generation"
## * add a build_depend and a exec_depend tag for each package in MSG_DEP_SET
## * If MSG_DEP_SET isn't empty the following dependency has been pulled in
## but can be declared for certainty nonetheless:
## * add a exec_depend tag for "message_runtime"
## * In this file (CMakeLists.txt):
## * add "message_generation" and every package in MSG_DEP_SET to
## find_package(catkin REQUIRED COMPONENTS ...)
## * add "message_runtime" and every package in MSG_DEP_SET to
## catkin_package(CATKIN_DEPENDS ...)
## * uncomment the add_*_files sections below as needed
## and list every .msg/.srv/.action file to be processed
## * uncomment the generate_messages entry below
## * add every package in MSG_DEP_SET to generate_messages(DEPENDENCIES ...)
## Generate messages in the 'msg' folder
# add_message_files(
# FILES
# Message1.msg
# Message2.msg
# )
## Generate services in the 'srv' folder
# add_service_files(
# FILES
# Service1.srv
# Service2.srv
# )
## Generate actions in the 'action' folder
# add_action_files(
# FILES
# Action1.action
# Action2.action
# )
## Generate added messages and services with any dependencies listed here
# generate_messages(
# DEPENDENCIES
# sensor_msgs# std_msgs
# )
################################################
## Declare ROS dynamic reconfigure parameters ##
################################################
## To declare and build dynamic reconfigure parameters within this
## package, follow these steps:
## * In the file package.xml:
## * add a build_depend and a exec_depend tag for "dynamic_reconfigure"
## * In this file (CMakeLists.txt):
## * add "dynamic_reconfigure" to
## find_package(catkin REQUIRED COMPONENTS ...)
## * uncomment the "generate_dynamic_reconfigure_options" section below
## and list every .cfg file to be processed
## Generate dynamic reconfigure parameters in the 'cfg' folder
# generate_dynamic_reconfigure_options(
# cfg/DynReconf1.cfg
# cfg/DynReconf2.cfg
# )
###################################
## catkin specific configuration ##
###################################
## The catkin_package macro generates cmake config files for your package
## Declare things to be passed to dependent projects
## INCLUDE_DIRS: uncomment this if your package contains header files
## LIBRARIES: libraries you create in this project that dependent projects also need
## CATKIN_DEPENDS: catkin_packages dependent projects also need
## DEPENDS: system dependencies of this project that dependent projects also need
catkin_package(
INCLUDE_DIRS include
# LIBRARIES visual_odometry
CATKIN_DEPENDS cv_bridge roscpp sensor_msgs std_msgs
DEPENDS EIGEN3 PCL
# DEPENDS system_lib
)
###########
## Build ##
###########
## Specify additional locations of header files
## Your package locations should be listed before other locations
include_directories(
include
${catkin_INCLUDE_DIRS}
${OpenCV_INCLUDE_DIRS}
# ${EIGEN3_INCLUDE_DIR}
)
# Declare a C++ library
add_library(image_util SHARED
src/image_util.cpp
)
target_link_libraries(image_util ${catkin_LIBRARIES})
set_target_properties(image_util PROPERTIES PUBLIC_HEADER include/${PROJECT_NAME}/image_util.h)
add_library(point_cloud_util SHARED
src/point_cloud_util.cpp
)
target_link_libraries(point_cloud_util ${catkin_LIBRARIES})
set_target_properties(point_cloud_util PROPERTIES PUBLIC_HEADER include/${PROJECT_NAME}/point_cloud_util.h)
add_library(visual_odometry SHARED
src/visual_odometry.cpp
)
target_link_libraries(visual_odometry
${catkin_LIBRARIES}
${OpenCV_LIBS}
Eigen3::Eigen
${CERES_LIBRARIES}
image_util
point_cloud_util
vloam_tf
)
set_target_properties(visual_odometry PROPERTIES PUBLIC_HEADER include/${PROJECT_NAME}/visual_odometry.h)
## Add cmake target dependencies of the library
## as an example, code may need to be generated before libraries
## either from message generation or dynamic reconfigure
# add_dependencies(${PROJECT_NAME} ${${PROJECT_NAME}_EXPORTED_TARGETS} ${catkin_EXPORTED_TARGETS})
## Declare a C++ executable
## With catkin_make all packages are built within a single CMake context
## The recommended prefix ensures that target names across packages don't collide
## Rename C++ executable without prefix
## The above recommended prefix causes long target names, the following renames the
## target back to the shorter version for ease of user use
## e.g. "rosrun someones_pkg node" instead of "rosrun someones_pkg someones_pkg_node"
# set_target_properties(${PROJECT_NAME}_node PROPERTIES OUTPUT_NAME node PREFIX "")
## Add cmake target dependencies of the executable
## same as for the library above
# add_dependencies(${PROJECT_NAME}_node ${${PROJECT_NAME}_EXPORTED_TARGETS} ${catkin_EXPORTED_TARGETS})
## Specify libraries to link a library or executable target against
#############
## Install ##
#############
# all install targets should use catkin DESTINATION variables
# See http://ros.org/doc/api/catkin/html/adv_user_guide/variables.html
## Mark executable scripts (Python etc.) for installation
## in contrast to setup.py, you can choose the destination
# catkin_install_python(PROGRAMS
# scripts/my_python_script
# DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION}
# )
## Mark executables for installation
## See http://docs.ros.org/melodic/api/catkin/html/howto/format1/building_executables.html
# install(TARGETS ${PROJECT_NAME}_node
# RUNTIME DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION}
# )
## Mark libraries for installation
## See http://docs.ros.org/melodic/api/catkin/html/howto/format1/building_libraries.html
# install(TARGETS ${PROJECT_NAME}
# ARCHIVE DESTINATION ${CATKIN_PACKAGE_LIB_DESTINATION}
# LIBRARY DESTINATION ${CATKIN_PACKAGE_LIB_DESTINATION}
# RUNTIME DESTINATION ${CATKIN_GLOBAL_BIN_DESTINATION}
# )
# Mark cpp header files for installation
install(
DIRECTORY include/${PROJECT_NAME}/
DESTINATION ${CATKIN_PACKAGE_INCLUDE_DESTINATION}
FILES_MATCHING
PATTERN "*.hpp"
PATTERN "*.h"
# PATTERN ".svn" EXCLUDE
)
## Mark other files for installation (e.g. launch and bag files, etc.)
# install(FILES
# # myfile1
# # myfile2
# DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION}
# )
#############
## Testing ##
#############
## Add gtest based cpp test target and link libraries
# catkin_add_gtest(${PROJECT_NAME}-test test/test_visual_odometry.cpp)
# if(TARGET ${PROJECT_NAME}-test)
# target_link_libraries(${PROJECT_NAME}-test ${PROJECT_NAME})
# endif()
## Add folders to be run by python nosetests
# catkin_add_nosetests(test)
================================================
FILE: src/visual_odometry/README.md
================================================
# Prerequisites
OpenCV 4.5.1
Eigen3 3.3
Ceres 2.0
# Troubleshooting
## 1. catkin_make fails with cv_bridge not found
Solution: modify `_include_dirs` in cv_bridgeConfig.cmake (at line 96).
The default `_include_dirs` is set by `set(_include_dirs "include;/usr/include/opencv4")`, while my OpenCV is installed under `/usr/local/include/opencv4`.
Reference: https://github.com/cipherdev/src/issues/13
## 2. Installing Ceres
I installed the [latest stable release](http://ceres-solver.org/ceres-solver-2.0.0.tar.gz) as described in http://ceres-solver.org/installation.html. At the "make test" step, my computer got stuck at test 31, "dynamic_sparsity_test". This happened with both the latest stable release and the latest GitHub repository. I ignored this failure and proceeded to "make install".
## 3. Using catkin_make with Ceres
- Bug 0: CMake error, `Could not find a package configuration file provided by "Ceres" with any of the following names: CeresConfig.cmake ceres-config.cmake`
- Reason: the Ceres path is not recognized by CMake
- Fix: add `set(Ceres_DIR "/usr/local/lib/cmake/Ceres/")` to CMakeLists.txt
- Bug 1: fatal error: Eigen/Core: No such file or directory
- Reason: the installation folder of Eigen is under .../eigen3/Eigen/...
- Fix: `sudo ln -s /usr/local/include/eigen3/Eigen /usr/include/Eigen`
- Reference: https://github.com/opencv/opencv/issues/14868
- Bug 2: `integer_sequence` is not a member of `std`
- Reason: Ceres 2.0 requires C++14
- Fix: change `add_compile_options(-std=c++11)` to `add_compile_options(-std=c++14)`
- Bug 3: test_ceres.cpp:(.text+0xe4): undefined reference to `ceres::Problem::Problem()'
- Reason: the Ceres library was not linked
- Fix: add `${CERES_LIBRARIES}` to target_link_libraries, e.g. `target_link_libraries(YOUR_TARGET_EXEC ${catkin_LIBRARIES} ${CERES_LIBRARIES})`
The hello-world file then returns the following results:
>rosrun visual_odometry test_ceres
>
>iter cost cost_change |gradient| |step| tr_ratio tr_radius ls_iter iter_time total_time
>
> 0 4.512500e+01 0.00e+00 9.50e+00 0.00e+00 0.00e+00 1.00e+04 0 1.72e-05 1.09e-04
>
> 1 4.511598e-07 4.51e+01 9.50e-04 9.50e+00 1.00e+00 3.00e+04 1 4.20e-05 1.95e-04
>
> 2 5.012552e-16 4.51e-07 3.17e-08 9.50e-04 1.00e+00 9.00e+04 1 7.87e-06 2.11e-04
>
>Ceres Solver Report: Iterations: 3, Initial cost: 4.512500e+01, Final cost: 5.012552e-16, Termination: CONVERGENCE
>
>x : 0.5 -> 10
# Kitti
## Data in ROS BAG version
Data converted to ROS BAG can be downloaded from [google drive](https://drive.google.com/drive/folders/1Za3Csvq2ljI_88RoNnltZ1Dx4OWRlb8j?usp=sharing)
Right now it contains `2011_09_26_drive_{0001, 0002, 0005}`.
## Raw data usage
Reference: http://www.cvlibs.net/datasets/kitti/raw_data.php
There are 4 image folders under each timestamped dataset: "image_00", "image_01", "image_02", "image_03". They correspond to the grayscale left, grayscale right, RGB left, and RGB right cameras.
NOTE: Currently, running the test code requires that catkin_ws follow the tree structure below, and the code must be run from the root of catkin_ws.
- catkin_ws
- build
- data
- 2011_09_26
- 2011_09_26_drive_0001_sync
- calib_cam_to_cam.txt
- calib_velo_to_cam.txt
- devel
- src
## Calibration file
Reference: https://github.com/yanii/kitti-pcl/blob/master/KITTI_README.TXT
>calib_cam_to_cam.txt: Camera-to-camera calibration
>--------------------------------------------------
>
> - S_xx: 1x2 size of image xx before rectification
> - K_xx: 3x3 calibration matrix of camera xx before rectification
> - D_xx: 1x5 distortion vector of camera xx before rectification
> - R_xx: 3x3 rotation matrix of camera xx (extrinsic)
> - T_xx: 3x1 translation vector of camera xx (extrinsic)
> - S_rect_xx: 1x2 size of image xx after rectification
> - R_rect_xx: 3x3 rectifying rotation to make image planes co-planar
> - P_rect_xx: 3x4 projection matrix after rectification
>
>Note: When using this dataset you will most likely need to access only
>P_rect_xx, as this matrix is valid for the rectified image sequences.
>
>calib_velo_to_cam.txt: Velodyne-to-camera registration
>------------------------------------------------------
>
> - R: 3x3 rotation matrix
> - T: 3x1 translation vector
> - delta_f: deprecated
> - delta_c: deprecated
>
>R|T takes a point in Velodyne coordinates and transforms it into the
>coordinate system of the left video camera. Likewise it serves as a
>representation of the Velodyne coordinate frame in camera coordinates.
>
>calib_imu_to_velo.txt: GPS/IMU-to-Velodyne registration
>-------------------------------------------------------
>
> - R: 3x3 rotation matrix
> - T: 3x1 translation vector
>
>R|T takes a point in GPS/IMU coordinates and transforms it into the
>coordinate system of the Velodyne scanner. Likewise it serves as a
>representation of the GPS/IMU coordinate frame in Velodyne coordinates.
From the paper "Vision meets robotics: The KITTI dataset", `P_rect` is defined with elements (1,3) and (2,3) equal to 0, but in the real calibration files these two entries are not necessarily exactly zero, especially element (1,3). In any case, they are about two orders of magnitude smaller than element (0,3).
Eq. 7 in the original paper:
$$y = P_{rect}^{(i)} R_{rect}^{(0)} T_{velo}^{cam} x$$
$P_{rect}^{(i)}$ is 3x4 and named a projection matrix, but it actually contains the intrinsics K.
$R_{rect}^{(0)}$ is 4x4 and named a rotation, but it is a 3x3 rectifying rotation padded with a unit diagonal element. It transforms from unrectified camera i to rectified camera i.
<!-- Left to $T_{velo}^{cam}$, there should be one more transformation if $i \ne 0$, $\begin{bmatrix} R_{0i} & T_{0i} \\ 0 & 1 \end{bmatrix}$, transforming points from camera 0 to camera i. (Or in the inverse direction? This should be further examined.) -->
<!-- use R_rect_00 everywhere, reference https://github.com/utiasSTARS/pykitti/issues/34 -->
$T_{velo}^{cam}$ is 4x4, consistent with its name. It transforms points from the Velodyne coordinate frame to the (unrectified) camera 0 frame.
## Odometry data
From http://www.cvlibs.net/datasets/kitti/eval_odometry.php:
- Download odometry data set (grayscale, 22 GB)
- Download odometry data set (color, 65 GB)
- Download odometry data set (velodyne laser data, 80 GB)
- Download odometry data set (calibration files, 1 MB)
- Download odometry ground truth poses (4 MB)
- Download odometry development kit (1 MB)
OpenCV load dataset: https://docs.opencv.org/3.4/dc/dfb/group__datasets__slam.html
## Processing KITTI data
### Image with Keypoints


Performance:
- Image0: Shi-Tomasi detection with n=2767 keypoints in 15.5039 ms
- Image1: Shi-Tomasi detection with n=3062 keypoints in 12.8016 ms
- Image0: ORB descriptor extraction in 6.35671 ms
- Image1: ORB descriptor extraction in 5.10074 ms
### Image with Point Cloud

### Image with Downsampled Point Cloud (grid size = 5)

### Depth Image Estimated by averaging the 3 Nearest Neighbors' Depth in the Downsampled Point Cloud

### Depth Image Estimated by Fitting a Plane from the 3 Nearest Neighbors in the Downsampled Point Cloud

This method is recommended in the original DEMO paper, but with the current downsampling method and hyperparameter settings, its estimation quality is visually lower than that of the naive averaging estimate.
================================================
FILE: src/visual_odometry/include/visual_odometry/ceres_cost_function.h
================================================
#include <ceres/ceres.h>
#include <ceres/loss_function.h>
#include <ceres/rotation.h>
#ifndef CERES_COST_FUNCTION_H
#define CERES_COST_FUNCTION_H
namespace vloam
{
struct CostFunctor33
{ // 33 means 3d - 3d observation pair
CostFunctor33(double observed_x0, double observed_y0, double observed_z0, double observed_x1, double observed_y1,
double observed_z1)
: // TODO: check if const & is necessary
observed_x0(observed_x0)
, observed_y0(observed_y0)
, observed_z0(observed_z0)
, // 3
observed_x1(observed_x1)
, observed_y1(observed_y1)
, observed_z1(observed_z1)
{
} // 3
template <typename T>
bool operator()(const T* const angles, const T* const t, T* residuals) const
{
T point3d_0to1[3];
T point3d_0[3] = { T(observed_x0), T(observed_y0), T(observed_z0) };
ceres::AngleAxisRotatePoint(angles, point3d_0, point3d_0to1);
point3d_0to1[0] += t[0];
point3d_0to1[1] += t[1];
point3d_0to1[2] += t[2];
residuals[0] = point3d_0to1[0] - T(observed_x1);
residuals[1] = point3d_0to1[1] - T(observed_y1);
residuals[2] = point3d_0to1[2] - T(observed_z1);
return true;
}
static ceres::CostFunction* Create(const double observed_x0, const double observed_y0, const double observed_z0,
const double observed_x1, const double observed_y1, const double observed_z1)
{
return (new ceres::AutoDiffCostFunction<CostFunctor33, 3, 3, 3>(
new CostFunctor33(observed_x0, observed_y0, observed_z0, observed_x1, observed_y1, observed_z1)));
}
double observed_x0, observed_y0, observed_z0, observed_x1, observed_y1,
observed_z1; // in rectified camera 0 coordinate
// TODO: check whether repeatedly creating cost functions degrades performance
};
struct CostFunctor32
{ // 32 means 3d - 2d observation pair
CostFunctor32(double observed_x0, double observed_y0, double observed_z0, double observed_x1_bar,
double observed_y1_bar)
: // TODO: check if const & is necessary
observed_x0(observed_x0)
, observed_y0(observed_y0)
, observed_z0(observed_z0)
, // 3
observed_x1_bar(observed_x1_bar)
, observed_y1_bar(observed_y1_bar)
{
} // 2
template <typename T>
bool operator()(const T* const angles, const T* const t, T* residuals) const
{
T X0[3] = { T(observed_x0), T(observed_y0), T(observed_z0) };
T observed_x1_bar_T = T(observed_x1_bar);
T observed_y1_bar_T = T(observed_y1_bar);
T R_dot_X0[3];
ceres::AngleAxisRotatePoint(angles, X0, R_dot_X0);
R_dot_X0[0] += t[0];
R_dot_X0[1] += t[1];
R_dot_X0[2] += t[2];
residuals[0] = R_dot_X0[0] - R_dot_X0[2] * observed_x1_bar_T;
residuals[1] = R_dot_X0[1] - R_dot_X0[2] * observed_y1_bar_T;
return true;
}
static ceres::CostFunction* Create(const double observed_x0, const double observed_y0, const double observed_z0,
const double observed_x1_bar, const double observed_y1_bar)
{
return (new ceres::AutoDiffCostFunction<CostFunctor32, 2, 3, 3>(
new CostFunctor32(observed_x0, observed_y0, observed_z0, observed_x1_bar, observed_y1_bar)));
}
double observed_x0, observed_y0, observed_z0, observed_x1_bar, observed_y1_bar;
// TODO: check whether repeatedly creating cost functions degrades performance
};
struct CostFunctor23
{ // 23 means 2d - 3d observation pair
CostFunctor23(double observed_x0_bar, double observed_y0_bar, double observed_x1, double observed_y1,
double observed_z1)
: // TODO: check if const & is necessary
observed_x0_bar(observed_x0_bar)
, observed_y0_bar(observed_y0_bar)
, // 2
observed_x1(observed_x1)
, observed_y1(observed_y1)
, observed_z1(observed_z1)
{
} // 3
template <typename T>
bool operator()(const T* const angles, const T* const t, T* residuals) const
{
T observed_x0_bar_T = T(observed_x0_bar);
T observed_y0_bar_T = T(observed_y0_bar);
T angles_inv[3] = { -angles[0], -angles[1], -angles[2] };
T X1[3] = { T(observed_x1), T(observed_y1), T(observed_z1) };
T RT_dot_X1[3];
ceres::AngleAxisRotatePoint(angles_inv, X1, RT_dot_X1);
T RT_dot_t[3];
ceres::AngleAxisRotatePoint(angles_inv, t, RT_dot_t);
T RT_dot_X1_minus_RT_dot_t[3] = { RT_dot_X1[0] - RT_dot_t[0], RT_dot_X1[1] - RT_dot_t[1],
RT_dot_X1[2] - RT_dot_t[2] };
residuals[0] = RT_dot_X1_minus_RT_dot_t[0] - RT_dot_X1_minus_RT_dot_t[2] * observed_x0_bar_T;
residuals[1] = RT_dot_X1_minus_RT_dot_t[1] - RT_dot_X1_minus_RT_dot_t[2] * observed_y0_bar_T;
return true;
}
static ceres::CostFunction* Create(const double observed_x0_bar, const double observed_y0_bar,
const double observed_x1, const double observed_y1, const double observed_z1)
{
return (new ceres::AutoDiffCostFunction<CostFunctor23, 2, 3, 3>(
new CostFunctor23(observed_x0_bar, observed_y0_bar, observed_x1, observed_y1, observed_z1)));
}
double observed_x0_bar, observed_y0_bar, observed_x1, observed_y1, observed_z1;
// TODO: check whether repeatedly creating cost functions degrades performance
};
struct CostFunctor22
{ // 22 means 2d - 2d observation pair
CostFunctor22(double observed_x0_bar, double observed_y0_bar, double observed_x1_bar, double observed_y1_bar)
: // TODO: check if const & is necessary
observed_x0_bar(observed_x0_bar)
, observed_y0_bar(observed_y0_bar)
, // 2
observed_x1_bar(observed_x1_bar)
, observed_y1_bar(observed_y1_bar)
{
} // 2
template <typename T>
bool operator()(const T* const angles, const T* const t, T* residuals) const
{
T observed_X0_bar_T[3] = { T(observed_x0_bar), T(observed_y0_bar), T(1.0) };
T observed_X1_bar_T[3] = { T(observed_x1_bar), T(observed_y1_bar), T(1.0) };
T observed_X0_bar_T_to1[3];
ceres::AngleAxisRotatePoint(angles, observed_X0_bar_T, observed_X0_bar_T_to1);
T t_cross_observed_X0_bar_T_to1[3];
ceres::CrossProduct(t, observed_X0_bar_T_to1, t_cross_observed_X0_bar_T_to1);
residuals[0] = ceres::DotProduct(observed_X1_bar_T, t_cross_observed_X0_bar_T_to1);
return true;
}
static ceres::CostFunction* Create(const double observed_x0_bar, const double observed_y0_bar,
const double observed_x1_bar, const double observed_y1_bar)
{
return (new ceres::AutoDiffCostFunction<CostFunctor22, 1, 3, 3>(
new CostFunctor22(observed_x0_bar, observed_y0_bar, observed_x1_bar, observed_y1_bar)));
}
double observed_x0_bar, observed_y0_bar, observed_x1_bar, observed_y1_bar;
// TODO: check whether repeatedly creating cost functions degrades performance
};
} // namespace vloam
#endif
================================================
FILE: src/visual_odometry/include/visual_odometry/image_util.h
================================================
#include <ros/package.h>
#include <ros/ros.h>
#include <algorithm>
#include <iostream>
#include <iterator>
#include <opencv4/opencv2/opencv.hpp>
#include <string>
#include <vector>
#ifndef IMAGE_UTIL_H
#define IMAGE_UTIL_H
namespace vloam
{
enum class DetectorType
{
ShiTomasi,
BRISK,
FAST,
ORB,
AKAZE,
SIFT
};
static const std::string DetectorType_str[] = { "ShiTomasi", "BRISK", "FAST", "ORB", "AKAZE", "SIFT" };
enum class DescriptorType
{
BRISK,
ORB,
BRIEF,
AKAZE,
FREAK,
SIFT
};
static const std::string DescriptorType_str[] = { "BRISK", "ORB", "BRIEF", "AKAZE", "FREAK", "SIFT" };
enum class MatcherType
{
BF,
FLANN
};
enum class SelectType
{
NN,
KNN
};
class ImageUtil
{
public:
ImageUtil()
{
print_result = false;
visualize_result = false;
detector_type = DetectorType::ShiTomasi;
descriptor_type = DescriptorType::ORB;
matcher_type = MatcherType::BF;
selector_type = SelectType::NN;
}
std::vector<cv::KeyPoint> detKeypoints(cv::Mat &img);
std::vector<cv::KeyPoint> keyPointsNMS(std::vector<cv::KeyPoint> &&keypoints,
const int bucket_width = 100, // width for horizontal direction in image plane
// => x, col
const int bucket_height = 100, // height for vertical direction in image plane
// => y, row
const int max_total_keypoints = 400);
void saveKeypointsImage(const std::string file_name);
cv::Mat descKeypoints(std::vector<cv::KeyPoint> &keypoints, cv::Mat &img);
std::vector<cv::DMatch> matchDescriptors(cv::Mat &desc_source, cv::Mat &desc_ref);
void visualizeMatchesCallBack(int event, int x, int y);
cv::Mat visualizeMatches(const cv::Mat &image0, const cv::Mat &image1, const std::vector<cv::KeyPoint> &keypoints0,
const std::vector<cv::KeyPoint> &keypoints1, const std::vector<cv::DMatch> &matches);
std::tuple<std::vector<cv::Point2f>, std::vector<cv::Point2f>, std::vector<uchar>>
calculateOpticalFlow(const cv::Mat &image0, const cv::Mat &image1, const std::vector<cv::KeyPoint> &keypoints0);
cv::Mat visualizeOpticalFlow(const cv::Mat &image1, const std::vector<cv::Point2f> &keypoints0_2f,
const std::vector<cv::Point2f> &keypoints1_2f,
const std::vector<uchar> &optical_flow_status);
cv::Mat visualizeOpticalFlow(const cv::Mat &image1, const std::vector<cv::KeyPoint> &keypoints0,
const std::vector<cv::KeyPoint> &keypoints1, const std::vector<cv::DMatch> &matches);
// std::string path_prefix;
bool print_result;
bool visualize_result;
DetectorType detector_type;
DescriptorType descriptor_type;
MatcherType matcher_type;
SelectType selector_type;
int remove_VO_outlier;
bool optical_flow_match;
private:
double time;
cv::Mat img_keypoints;
const int IMG_HEIGHT = 375;
const int IMG_WIDTH = 1242;
};
} // namespace vloam
#endif
================================================
FILE: src/visual_odometry/include/visual_odometry/point_cloud_util.h
================================================
#include <ros/package.h>
#include <ros/ros.h>
#include <boost/lexical_cast.hpp>
#include <eigen3/Eigen/Dense>
#include <fstream>
#include <iostream>
#include <opencv4/opencv2/opencv.hpp>
#include <string>
#include <vector>
#ifndef POINT_CLOUD_UTIL_H
#define POINT_CLOUD_UTIL_H
namespace vloam
{
class PointCloudUtil
{
public:
PointCloudUtil()
{
cam_T_velo = Eigen::Matrix4f::Zero();
rect0_T_cam = Eigen::Matrix4f::Zero();
P_rect0 = Eigen::MatrixXf::Zero(3, 4);
print_result = false;
downsample_grid_size = 5;
}
void loadTransformations(const std::string& calib_cam_to_cam_file_path,
const std::string& calib_velo_to_cam_file_path);
void loadPointCloud(const std::string& bin_file_path);
void projectPointCloud();
void printPointCloud2dStats() const;
void downsamplePointCloud();
void visualizePointCloud(const std::string image_file_path, const bool select_downsampled = true);
float queryDepth(const float x, const float y, const int searching_radius = 2) const;
void visualizeDepthCallBack(int event, int x, int y);
void visualizeDepth(const std::string image_file_path);
cv::Mat visualizeDepth(const cv::Mat& gray_image);
// private:
const int IMG_HEIGHT = 375;
const int IMG_WIDTH = 1242;
Eigen::Matrix4f cam_T_velo; // T_velo^cam or cam_T_velo // TODO: rewrite related functions using Eigen::Isometry or
// Eigen::Transform; https://eigen.tuxfamily.org/dox/classEigen_1_1Transform.html
Eigen::Matrix4f rect0_T_cam; // T_cam0^rect0 or rect0_T_cam0 // NOTE: typedef Transform<float,3,Isometry> Isometry3f
Eigen::MatrixXf P_rect0; // TODO: combine the 3 transformations into 1 3x4 matrix
Eigen::MatrixXf point_cloud_3d_tilde; // row count is dynamic, decided when loading the point cloud; column
// count is fixed at 4
Eigen::MatrixXf point_cloud_2d;
Eigen::MatrixXf point_cloud_2d_dnsp;
int downsample_grid_size;
Eigen::MatrixXf bucket_x;
Eigen::MatrixXf bucket_y;
Eigen::MatrixXf bucket_depth;
Eigen::MatrixXi bucket_count;
cv::Mat image_with_point_cloud;
cv::Mat image_with_depth;
bool print_result;
};
} // namespace vloam
#endif
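The TODO in the header above suggests pre-multiplying the three calibration transforms (`cam_T_velo`, `rect0_T_cam`, `P_rect0`) into a single 3x4 projection matrix. A minimal standalone sketch of that composition with plain arrays, using hypothetical calibration values rather than the ones `loadTransformations` reads from KITTI files:

```cpp
#include <array>
#include <cassert>

using Mat34 = std::array<std::array<double, 4>, 3>;
using Mat44 = std::array<std::array<double, 4>, 4>;

// P_combined = P_rect0 (3x4) * rect0_T_cam (4x4) * cam_T_velo (4x4),
// so that a homogeneous velodyne point maps to the image in one multiply.
Mat34 composeProjection(const Mat34& P_rect0, const Mat44& rect0_T_cam,
                        const Mat44& cam_T_velo)
{
  Mat44 T{};  // rect0_T_cam * cam_T_velo, zero-initialized
  for (int i = 0; i < 4; ++i)
    for (int j = 0; j < 4; ++j)
      for (int k = 0; k < 4; ++k)
        T[i][j] += rect0_T_cam[i][k] * cam_T_velo[k][j];

  Mat34 P{};  // P_rect0 * T
  for (int i = 0; i < 3; ++i)
    for (int j = 0; j < 4; ++j)
      for (int k = 0; k < 4; ++k)
        P[i][j] += P_rect0[i][k] * T[k][j];
  return P;
}
```

With the combined matrix, `projectPointCloud` would need only one matrix product per frame instead of three.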
================================================
FILE: src/visual_odometry/include/visual_odometry/visual_odometry.h
================================================
#pragma once
#include <ceres/ceres.h>
#include <ceres/loss_function.h>
#include <ceres/rotation.h>
#include <cv_bridge/cv_bridge.h>
#include <geometry_msgs/PoseStamped.h>
#include <geometry_msgs/TransformStamped.h>
#include <image_transport/image_transport.h>
#include <lidar_odometry_mapping/tic_toc.h>
#include <message_filters/subscriber.h>
#include <message_filters/sync_policies/approximate_time.h>
#include <message_filters/synchronizer.h>
#include <nav_msgs/Odometry.h>
#include <nav_msgs/Path.h>
#include <pcl/point_types.h>
#include <pcl_conversions/pcl_conversions.h>
#include <ros/ros.h>
#include <sensor_msgs/CameraInfo.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/PointCloud2.h>
#include <tf2/LinearMath/Transform.h>
#include <visual_odometry/ceres_cost_function.h>
#include <visual_odometry/image_util.h>
#include <visual_odometry/point_cloud_util.h>
#include <vloam_tf/vloam_tf.h>
#include <eigen3/Eigen/Dense>
#include <opencv4/opencv2/core/eigen.hpp>
#include <opencv4/opencv2/opencv.hpp>
#ifndef VISUAL_ODOMETRY_H
#define VISUAL_ODOMETRY_H
namespace vloam
{
// typedef message_filters::sync_policies::ApproximateTime<sensor_msgs::Image, sensor_msgs::CameraInfo,
// sensor_msgs::PointCloud2> VO_policy;
class VisualOdometry
{
public:
VisualOdometry();
void init(std::shared_ptr<VloamTF>& vloam_tf_);
void reset();
void processImage(const cv::Mat& img00);
// void setUpPointCloud(const Eigen::Isometry3f& imu_eigen_T_cam0, const Eigen::Isometry3f& imu_eigen_T_velo, const
// sensor_msgs::CameraInfoConstPtr& camera_info_msg);
void setUpPointCloud(const sensor_msgs::CameraInfoConstPtr& camera_info_msg);
void processPointCloud(const sensor_msgs::PointCloud2ConstPtr& point_cloud_msg,
const pcl::PointCloud<pcl::PointXYZ>& point_cloud_pcl, const bool& visualize_depth,
const bool& publish_point_cloud);
void solveRANSAC();
void solveNlsAll();
void solveNls2dOnly();
void publish();
// private:
std::shared_ptr<VloamTF> vloam_tf;
int i, j, count;
vloam::ImageUtil image_util;
std::vector<cv::Mat> images;
std::vector<std::vector<cv::KeyPoint>> keypoints;
std::vector<cv::Mat> descriptors;
std::vector<cv::DMatch> matches;
std::vector<std::vector<cv::Point2f>> keypoints_2f;
std::vector<uchar> optical_flow_status;
std::vector<vloam::PointCloudUtil> point_cloud_utils;
Eigen::MatrixXf point_cloud_3d_tilde;
pcl::PointCloud<pcl::PointXYZ> point_cloud_pcl;
float depth0, depth1;
double angles_0to1[3];
double t_0to1[3];
Eigen::Vector3f point_3d_image0_0;
Eigen::Vector3f point_3d_image0_1;
Eigen::Vector3f point_3d_rect0_0;
Eigen::Vector3f point_3d_rect0_1;
ros::Publisher pub_point_cloud;
// ceres::LossFunction *loss_function = new ceres::CauchyLoss(0.5);
ceres::Solver::Options options;
ceres::Solver::Summary summary;
float angle;
tf2::Transform cam0_curr_T_cam0_last;
tf2::Quaternion cam0_curr_q_cam0_last;
private:
ros::NodeHandle nh;
int verbose_level;
bool reset_VO_to_identity;
int remove_VO_outlier;
bool keypoint_NMS;
bool CLAHE;
cv::Ptr<cv::CLAHE> clahe;
bool visualize_optical_flow;
bool optical_flow_match;
nav_msgs::Odometry visualOdometry;
ros::Publisher pubvisualOdometry;
nav_msgs::Path visualPath;
ros::Publisher pubvisualPath;
image_transport::Publisher pub_matches_viz;
cv_bridge::CvImage matches_viz_cvbridge;
image_transport::Publisher pub_depth_viz;
cv_bridge::CvImage depth_viz_cvbridge;
image_transport::Publisher pub_optical_flow_viz;
cv_bridge::CvImage optical_flow_viz_cvbridge;
};
} // namespace vloam
#endif
================================================
FILE: src/visual_odometry/package.xml
================================================
<?xml version="1.0"?>
<package format="2">
<name>visual_odometry</name>
<version>0.0.0</version>
<description>The visual_odometry package</description>
<maintainer email="yukun@cs.cmu.edu">yukun</maintainer>
<license>MIT</license>
<buildtool_depend>catkin</buildtool_depend>
<build_depend>cv_bridge</build_depend>
<build_depend>roscpp</build_depend>
<build_depend>sensor_msgs</build_depend>
<build_depend>std_msgs</build_depend>
<build_depend>roslib</build_depend>
<build_depend>Ceres</build_depend>
<build_depend>vloam_tf</build_depend>
<build_export_depend>cv_bridge</build_export_depend>
<build_export_depend>roscpp</build_export_depend>
<build_export_depend>sensor_msgs</build_export_depend>
<build_export_depend>std_msgs</build_export_depend>
<exec_depend>cv_bridge</exec_depend>
<exec_depend>roscpp</exec_depend>
<exec_depend>sensor_msgs</exec_depend>
<exec_depend>std_msgs</exec_depend>
<exec_depend>roslib</exec_depend>
<exec_depend>Ceres</exec_depend>
<exec_depend>vloam_tf</exec_depend>
<!-- The export tag contains other, unspecified, tags -->
<export>
<!-- Other tools can request additional information be placed here -->
</export>
</package>
================================================
FILE: src/visual_odometry/src/image_util.cpp
================================================
#include <visual_odometry/image_util.h>
namespace vloam
{
std::vector<cv::KeyPoint> ImageUtil::detKeypoints(cv::Mat& img)
{
std::vector<cv::KeyPoint> keypoints;
if (print_result)
time = (double)cv::getTickCount();
if (detector_type == DetectorType::ShiTomasi)
{
int block_size =
5; // size of an average block for computing a derivative covariation matrix over each pixel neighborhood
// double max_overlap = 0.0; // max. permissible overlap between two features in %
// double min_distance = (1.0 - max_overlap) * block_size;
double min_distance = block_size * 1.5;
// int maxCorners = img.rows * img.cols / std::max(1.0, min_distance); // max. num. of keypoints
int maxCorners = 1024;
double quality_level = 0.03; // minimal accepted quality of image corners
double k = 0.04;
std::vector<cv::Point2f> corners;
cv::goodFeaturesToTrack(img, corners, maxCorners, quality_level, min_distance, cv::Mat(), block_size, false, k);
// add corners to result vector
for (auto it = corners.begin(); it != corners.end(); ++it)
{
cv::KeyPoint new_keypoint;
new_keypoint.pt = cv::Point2f((*it).x, (*it).y);
new_keypoint.size = block_size;
keypoints.push_back(new_keypoint);
}
}
else if (detector_type == DetectorType::FAST)
{
int threshold = 100;
cv::FAST(img, keypoints, threshold, true);
}
else
{
cv::Ptr<cv::FeatureDetector> detector;
if (detector_type == DetectorType::BRISK)
detector = cv::BRISK::create();
else if (detector_type == DetectorType::ORB)
{
int num_features = 2000;
float scaleFactor = 1.2f;
int nlevels = 8;
int edgeThreshold = 31;
int firstLevel = 0;
int WTA_K = 2;
cv::ORB::ScoreType scoreType = cv::ORB::FAST_SCORE;
int patchSize = 31;
int fastThreshold = 20;
detector = cv::ORB::create(num_features, scaleFactor, nlevels, edgeThreshold, firstLevel, WTA_K, scoreType,
patchSize, fastThreshold);
}
else if (detector_type == DetectorType::AKAZE)
detector = cv::AKAZE::create();
else if (detector_type == DetectorType::SIFT)
detector = cv::SIFT::create();
else
{
std::cerr << "Detector is not implemented" << std::endl;
exit(EXIT_FAILURE);
}
detector->detect(img, keypoints);
}
if (print_result)
{
time = ((double)cv::getTickCount() - time) / cv::getTickFrequency();
std::cout << DetectorType_str[static_cast<int>(detector_type)] + " detection with n=" << keypoints.size()
<< " keypoints in " << 1000 * time / 1.0 << " ms" << std::endl;
}
if (visualize_result)
{
// std::vector<cv::KeyPoint> fake_keypoints;
// fake_keypoints.push_back(keypoints[0]);
// std::cout << "fake keypoints 0: " << keypoints[0].pt.x << ", " << keypoints[0].pt.y << std::endl;
img_keypoints = img.clone();
// cv::drawKeypoints(img, fake_keypoints, img_keypoints, cv::Scalar::all(-1),
// cv::DrawMatchesFlags::DRAW_RICH_KEYPOINTS);
cv::drawKeypoints(img, keypoints, img_keypoints, cv::Scalar::all(-1), cv::DrawMatchesFlags::DRAW_RICH_KEYPOINTS);
std::string window_name = DetectorType_str[static_cast<int>(detector_type)] + " Detector Results";
cv::namedWindow(window_name, 6);
cv::imshow(window_name, img_keypoints);
cv::waitKey(0);
}
return keypoints;
}
std::vector<cv::KeyPoint> ImageUtil::keyPointsNMS( // TODO: check if opencv detector minDistance helps here
std::vector<cv::KeyPoint>&& keypoints,
const int bucket_width, // width for horizontal direction in image plane => x, col
const int bucket_height, // height for vertical direction in image plane => y, row
const int max_total_keypoints)
{
const int bucket_shape_x = std::ceil(static_cast<float>(IMG_WIDTH) / static_cast<float>(bucket_width)); // 13
const int bucket_shape_y = std::ceil(static_cast<float>(IMG_HEIGHT) / static_cast<float>(bucket_height)); // 4
const int max_bucket_keypoints = max_total_keypoints / (bucket_shape_x * bucket_shape_y); // 7
std::vector<std::vector<std::vector<cv::KeyPoint>>> bucket(
bucket_shape_x, std::vector<std::vector<cv::KeyPoint>>(bucket_shape_y, std::vector<cv::KeyPoint>()));
// put all keypoints into buckets
for (const auto& keypoint : keypoints)
{
bucket[static_cast<int>(keypoint.pt.x / static_cast<float>(bucket_width))]
[static_cast<int>(keypoint.pt.y / static_cast<float>(bucket_height))]
.push_back(keypoint);
}
std::vector<cv::KeyPoint> keypoints_after_NMS;
keypoints_after_NMS.reserve(max_total_keypoints);
auto keypoints_sort = [](const cv::KeyPoint& kp0, const cv::KeyPoint& kp1) { return kp0.response > kp1.response; };
// iterate all bucket, sort and put keypoints with top response to the return
int col, row;
for (col = 0; col < bucket_shape_x; ++col)
{
for (row = 0; row < bucket_shape_y; ++row)
{
if (bucket[col][row].empty())
continue;
if (static_cast<int>(bucket[col][row].size()) <= max_bucket_keypoints)
{
std::copy(bucket[col][row].begin(), bucket[col][row].end(), std::back_inserter(keypoints_after_NMS));
continue;
}
std::sort(bucket[col][row].begin(), bucket[col][row].end(),
keypoints_sort); // descending order of keypoint response
std::copy(bucket[col][row].begin(), bucket[col][row].begin() + max_bucket_keypoints,
std::back_inserter(keypoints_after_NMS));
}
}
return keypoints_after_NMS;
}
void ImageUtil::saveKeypointsImage(const std::string file_name)
{
if (!img_keypoints.data)
{
printf("No keypoints data \n");
return;
}
cv::imwrite(ros::package::getPath("visual_odometry") + "/figures/" + file_name + ".png", img_keypoints);
}
cv::Mat ImageUtil::descKeypoints(std::vector<cv::KeyPoint>& keypoints, cv::Mat& img)
{
cv::Mat descriptors;
cv::Ptr<cv::DescriptorExtractor> extractor;
if (descriptor_type == DescriptorType::BRISK)
{
int threshold = 30; // FAST/AGAST detection threshold score.
int octaves = 3; // detection octaves (use 0 to do single scale)
float pattern_scale = 1.0f; // apply this scale to the pattern used for sampling the neighbourhood of a keypoint.
extractor = cv::BRISK::create(threshold, octaves, pattern_scale);
}
// else if (descriptor_type == DescriptorType::BRIEF) {
// extractor = cv::xfeatures2d::BriefDescriptorExtractor::create();
// }
else if (descriptor_type == DescriptorType::ORB)
{
extractor = cv::ORB::create();
}
// else if (descriptor_type == DescriptorType::FREAK) {
// extractor = cv::xfeatures2d::FREAK::create();
// }
else if (descriptor_type == DescriptorType::AKAZE)
{
extractor = cv::AKAZE::create();
}
else if (descriptor_type == DescriptorType::SIFT)
{
extractor = cv::SIFT::create();
}
else
{
std::cerr << "Descriptor is not implemented" << std::endl;
exit(EXIT_FAILURE);
}
if (print_result)
time = (double)cv::getTickCount();
extractor->compute(img, keypoints, descriptors);
if (print_result)
{
time = ((double)cv::getTickCount() - time) / cv::getTickFrequency();
std::cout << DescriptorType_str[static_cast<int>(descriptor_type)] << " descriptor extraction in "
<< 1000 * time / 1.0 << " ms" << std::endl;
}
return descriptors;
}
std::vector<cv::DMatch> ImageUtil::matchDescriptors(cv::Mat& descriptors0, cv::Mat& descriptors1)
{
std::vector<cv::DMatch> matches;
bool crossCheck = (selector_type == SelectType::NN);
cv::Ptr<cv::DescriptorMatcher> matcher;
// Reference: https://docs.opencv.org/master/dc/dc3/tutorial_py_matcher.html
if (matcher_type == MatcherType::BF)
{
int normType;
if (descriptor_type == DescriptorType::AKAZE or descriptor_type == DescriptorType::BRISK or
descriptor_type == DescriptorType::ORB)
{
normType = cv::NORM_HAMMING;
}
else if (descriptor_type == DescriptorType::SIFT)
{
normType = cv::NORM_L2;
}
else
{
std::cerr << "Descriptor is not implemented" << std::endl;
exit(EXIT_FAILURE); // normType would otherwise be used uninitialized below
}
matcher = cv::BFMatcher::create(normType, crossCheck);
}
else if (matcher_type == MatcherType::FLANN)
{
if (descriptors0.type() != CV_32F)
{ // OpenCV bug workaround : convert binary descriptors to floating point due to a bug in current OpenCV
// implementation
descriptors0.convertTo(descriptors0, CV_32F);
}
if (descriptors1.type() != CV_32F)
{ // OpenCV bug workaround : convert binary descriptors to floating point due to a bug in current OpenCV
// implementation
descriptors1.convertTo(descriptors1, CV_32F);
}
matcher = cv::DescriptorMatcher::create(cv::DescriptorMatcher::FLANNBASED);
}
if (print_result)
time = (double)cv::getTickCount();
if (selector_type == SelectType::NN)
{
matcher->match(descriptors0, descriptors1, matches);
if (print_result)
{
time = ((double)cv::getTickCount() - time) / cv::getTickFrequency();
std::cout << " (NN) with n=" << matches.size() << " matches in " << 1000 * time / 1.0 << " ms" << std::endl;
}
}
else if (selector_type == SelectType::KNN)
{ // k nearest neighbors (k=2)
// double t = (double)cv::getTickCount();
std::vector<std::vector<cv::DMatch>> knn_matches;
matcher->knnMatch(descriptors0, descriptors1, knn_matches, 2);
for (const auto& knn_match : knn_matches)
{
if (knn_match[0].distance < 0.8 * knn_match[1].distance)
{
matches.push_back(knn_match[0]);
}
}
if (print_result)
{
time = ((double)cv::getTickCount() - time) / cv::getTickFrequency();
std::cout << " (KNN) with n=" << matches.size() << " matches in " << 1000 * time / 1.0 << " ms" << std::endl;
std::cout << "KNN matches = " << knn_matches.size() << ", qualified matches = " << matches.size()
<< ", discard ratio = " << (float)(knn_matches.size() - matches.size()) / (float)knn_matches.size()
<< std::endl;
}
}
if (print_result)
std::cout << "MATCH KEYPOINT DESCRIPTORS done, and the number of matches is " << matches.size() << std::endl;
return matches;
}
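The KNN branch above applies Lowe's ratio test with a 0.8 threshold: a match survives only when its best distance is clearly smaller than the second-best, discarding ambiguous correspondences. A standalone sketch of that filter, with a hypothetical `Match` struct standing in for `cv::DMatch`:

```cpp
#include <array>
#include <cassert>
#include <vector>

struct Match { int query_idx; float distance; };

// knn[i] holds the two nearest matches for query descriptor i,
// ordered best-first (as a kNN matcher with k=2 returns them).
std::vector<Match> ratioTest(const std::vector<std::array<Match, 2>>& knn,
                             float ratio = 0.8f)
{
  std::vector<Match> good;
  for (const auto& pair : knn)
    if (pair[0].distance < ratio * pair[1].distance)
      good.push_back(pair[0]);  // unambiguous best match: keep it
  return good;
}
```

Lowering the ratio keeps fewer but more reliable matches; 0.7 to 0.8 is the commonly used range.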
void ImageUtil::visualizeMatchesCallBack(int event, int x, int y)
{
if (event == cv::EVENT_LBUTTONDOWN)
{
std::cout << "Left button of the mouse is clicked - position (" << x << ", " << y << ")" << std::endl;
}
}
void visualizeMatchesOnMouse(int ev, int x, int y, int, void* obj)
{
ImageUtil* iu = static_cast<ImageUtil*>(obj);
if (iu)
iu->visualizeMatchesCallBack(ev, x, y);
}
cv::Mat ImageUtil::visualizeMatches(const cv::Mat& image0, const cv::Mat& image1,
const std::vector<cv::KeyPoint>& keypoints0,
const std::vector<cv::KeyPoint>& keypoints1, const std::vector<cv::DMatch>& matches)
{
std::vector<cv::DMatch> matches_dnsp;
const int stride = std::ceil(static_cast<float>(matches.size()) / 100.0f); // at most 100 points
int prev_pt_x, prev_pt_y, curr_pt_x, curr_pt_y;
for (int i = 0; i < matches.size(); i += stride)
{
prev_pt_x = keypoints0[matches[i].queryIdx].pt.x;
prev_pt_y = keypoints0[matches[i].queryIdx].pt.y;
curr_pt_x = keypoints1[matches[i].trainIdx].pt.x;
curr_pt_y = keypoints1[matches[i].trainIdx].pt.y;
if (remove_VO_outlier > 0)
{
if (std::pow(prev_pt_x - curr_pt_x, 2) + std::pow(prev_pt_y - curr_pt_y, 2) >
remove_VO_outlier * remove_VO_outlier)
continue;
}
matches_dnsp.push_back(matches[i]);
}
cv::Mat matchImg = image1.clone();
cv::drawMatches(image0, keypoints0, image1, keypoints1, matches_dnsp, matchImg, cv::Scalar::all(-1),
cv::Scalar::all(-1), std::vector<char>(), cv::DrawMatchesFlags::DEFAULT);
// std::string windowName = "Matching keypoints between two camera images";
// cv::namedWindow(windowName, 7);
// cv::setMouseCallback(windowName, visualizeMatchesOnMouse, this);
// cv::imshow(windowName, matchImg);
// std::cout << "Press key to continue to next image" << std::endl;
// cv::waitKey(0); // wait for key to be pressed
// ROS_INFO("image showed");
return matchImg;
}
std::tuple<std::vector<cv::Point2f>, std::vector<cv::Point2f>, std::vector<uchar>> ImageUtil::calculateOpticalFlow(
const cv::Mat& image0, const cv::Mat& image1, const std::vector<cv::KeyPoint>& keypoints0)
{
std::vector<cv::Point2f> keypoints0_2f;
std::vector<cv::Point2f> keypoints1_2f;
std::ve
},
{
"path": "src/vloam_main/results/2011_09_26_drive_0002/VO0.txt",
"chars": 1219,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n1.000000 -0."
},
{
"path": "src/vloam_main/results/2011_09_26_drive_0005/LO0.txt",
"chars": 1230,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999884 -0."
},
{
"path": "src/vloam_main/results/2011_09_26_drive_0005/LO1.txt",
"chars": 1233,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999908 0.0"
},
{
"path": "src/vloam_main/results/2011_09_26_drive_0005/MO0.txt",
"chars": 1218,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n1.000000 0.0"
},
{
"path": "src/vloam_main/results/2011_09_26_drive_0005/MO1.txt",
"chars": 1237,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999891 0.0"
},
{
"path": "src/vloam_main/results/2011_09_26_drive_0005/VO0.txt",
"chars": 1228,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999684 0.0"
},
{
"path": "src/vloam_main/results/2011_09_26_drive_0005/VO1.txt",
"chars": 1218,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999871 0.0"
},
{
"path": "src/vloam_main/results/2011_09_30_drive_0016/LO0.txt",
"chars": 5794,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n1.000000 -0."
},
{
"path": "src/vloam_main/results/2011_09_30_drive_0016/MO0.txt",
"chars": 5801,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n1.000000 -0."
},
{
"path": "src/vloam_main/results/2011_09_30_drive_0016/VO0.txt",
"chars": 5774,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n1.000000 -0."
},
{
"path": "src/vloam_main/results/2011_09_30_drive_0027/LO1.txt",
"chars": 4279,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999981 0.0"
},
{
"path": "src/vloam_main/results/2011_09_30_drive_0027/MO1.txt",
"chars": 4280,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999981 0.0"
},
{
"path": "src/vloam_main/results/2011_09_30_drive_0027/VO1.txt",
"chars": 4278,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999977 0.0"
},
{
"path": "src/vloam_main/results/2011_10_03_drive_0027/LO0.txt",
"chars": 4883,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999993 -0."
},
{
"path": "src/vloam_main/results/2011_10_03_drive_0027/MO0.txt",
"chars": 4883,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999991 -0."
},
{
"path": "src/vloam_main/results/2011_10_03_drive_0027/VO0.txt",
"chars": 4885,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999991 -0."
},
{
"path": "src/vloam_main/results/2011_10_03_drive_0042/LO0.txt",
"chars": 6914,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999029 -0."
},
{
"path": "src/vloam_main/results/2011_10_03_drive_0042/LO1.txt",
"chars": 63243,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999756 0.0"
},
{
"path": "src/vloam_main/results/2011_10_03_drive_0042/MO0.txt",
"chars": 6919,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999040 -0."
},
{
"path": "src/vloam_main/results/2011_10_03_drive_0042/MO1.txt",
"chars": 62944,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999572 -0."
},
{
"path": "src/vloam_main/results/2011_10_03_drive_0042/VO0.txt",
"chars": 6928,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999001 -0."
},
{
"path": "src/vloam_main/results/2011_10_03_drive_0042/VO1.txt",
"chars": 63693,
"preview": "1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000\n0.999001 -0."
},
{
"path": "src/vloam_main/rviz/vloam.rviz",
"chars": 10078,
"preview": "Panels:\n - Class: rviz/Displays\n Help Height: 138\n Name: Displays\n Property Tree Widget:\n Expanded:\n "
},
{
"path": "src/vloam_main/rviz/vloam_combined.rviz",
"chars": 13246,
"preview": "Panels:\n - Class: rviz/Displays\n Help Height: 138\n Name: Displays\n Property Tree Widget:\n Expanded:\n "
},
{
"path": "src/vloam_main/scripts/plotTrajectory.py",
"chars": 388,
"preview": "import numpy as np\nimport matplotlib.pyplot as plt\n\nlo = np.loadtxt('../results/seq1/lo.txt')\nvo = np.loadtxt('../result"
},
{
"path": "src/vloam_main/src/vloam_main_node.cpp",
"chars": 9207,
"preview": "#include <sstream>\n#include <string>\n// #include <fstream>\n#include <actionlib/server/simple_action_server.h>\n#include <"
},
{
"path": "src/vloam_tf/CMakeLists.txt",
"chars": 7300,
"preview": "cmake_minimum_required(VERSION 3.0.2)\nproject(vloam_tf)\n\n## Compile as C++11, supported in ROS Kinetic and newer\n# add_c"
},
{
"path": "src/vloam_tf/include/vloam_tf/vloam_tf.h",
"chars": 2037,
"preview": "#pragma once\n\n#include <geometry_msgs/TransformStamped.h>\n#include <tf2/LinearMath/Transform.h>\n#include <tf2_eigen/tf2_"
},
{
"path": "src/vloam_tf/package.xml",
"chars": 3297,
"preview": "<?xml version=\"1.0\"?>\n<package format=\"2\">\n <name>vloam_tf</name>\n <version>0.0.0</version>\n <description>The vloam_t"
},
{
"path": "src/vloam_tf/src/vloam_tf.cpp",
"chars": 7059,
"preview": "#include <vloam_tf/vloam_tf.h>\n\nnamespace vloam\n{\nvoid VloamTF::init()\n{\n tf_buffer_ptr = std::make_shared<tf2_ros::Buf"
}
]
About this extraction
This page contains the full source code of the YukunXia/VLOAM-CMU-16833 GitHub repository, extracted and formatted as plain text. The extraction includes 72 files (564.9 KB, approximately 230.5k tokens) and a symbol index with 45 extracted functions, classes, methods, constants, and types.
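The LO*/MO*/VO* result files listed above each store one pose per line as 12 space-separated floats — the row-major 3x4 rigid transform [R | t] in KITTI odometry style, with the first line being the identity (matching the previews, which all start with `1.000000 0.000000 0.000000 0.000000 ...`). As a sketch of how such a file can be parsed (this is an assumption about the format inferred from the previews and from `plotTrajectory.py`, not code from the repository; the two-line `sample` string here is made up):

```python
import numpy as np
from io import StringIO

# Hypothetical sample in the same layout as the LO*/MO*/VO* result files:
# 12 floats per line = row-major 3x4 pose [R | t], first line identity.
sample = (
    "1.0 0.0 0.0 0.0  0.0 1.0 0.0 0.0  0.0 0.0 1.0 0.0\n"
    "1.0 0.0 0.0 0.5  0.0 1.0 0.0 0.0  0.0 0.0 1.0 1.2\n"
)

def load_poses(text):
    """Parse KITTI-style pose lines into an (N, 3, 4) array of [R | t] matrices."""
    flat = np.loadtxt(StringIO(text))      # shape (N, 12)
    return flat.reshape(-1, 3, 4)          # shape (N, 3, 4)

poses = load_poses(sample)
translations = poses[:, :, 3]              # the t column of each pose, shape (N, 3)
print(translations[1])
```

In a real script one would pass a file path to `np.loadtxt` instead of a `StringIO`, as `plotTrajectory.py` does, and plot e.g. the x/z translation components to visualize the trajectory.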