Full Code of StephLin/LIO-SEGMOT for AI

master 9d8ad1398d44 cached
48 files
381.1 KB
102.7k tokens
160 symbols
1 requests
Download .txt
Showing preview only (399K chars total). Download the full file or copy to clipboard to get everything.
Repository: StephLin/LIO-SEGMOT
Branch: master
Commit: 9d8ad1398d44
Files: 48
Total size: 381.1 KB

Directory structure:
gitextract_j229x4hx/

├── .clang-format
├── .github/
│   └── stale.yml
├── .gitignore
├── CITATION.bib
├── CMakeLists.txt
├── LICENSE
├── README.md
├── config/
│   ├── doc/
│   │   └── kitti2bag/
│   │       ├── README.md
│   │       └── kitti2bag.py
│   ├── params.yaml
│   ├── params_hsinchu.yaml
│   └── params_kitti.yaml
├── include/
│   ├── factor.h
│   ├── solver.h
│   └── utility.h
├── launch/
│   ├── debug.launch
│   ├── gdbserver_debug.launch
│   ├── include/
│   │   ├── config/
│   │   │   ├── robot.urdf.xacro
│   │   │   └── rviz.rviz
│   │   ├── module_loam.launch
│   │   ├── module_loam_debug.launch
│   │   ├── module_loam_gdbserver_debug.launch
│   │   ├── module_navsat.launch
│   │   ├── module_robot_state_publisher.launch
│   │   ├── module_rviz.launch
│   │   └── rosconsole/
│   │       ├── rosconsole_error.conf
│   │       ├── rosconsole_info.conf
│   │       └── rosconsole_warn.conf
│   ├── run.launch
│   ├── run_hsinchu.launch
│   └── run_kitti.launch
├── msg/
│   ├── Diagnosis.msg
│   ├── ObjectState.msg
│   ├── ObjectStateArray.msg
│   ├── cloud_info.msg
│   └── flags.msg
├── package.xml
├── scripts/
│   └── diagnose.py
├── src/
│   ├── factor.cpp
│   ├── featureExtraction.cpp
│   ├── imageProjection.cpp
│   ├── imuPreintegration.cpp
│   ├── mapOptimization.cpp
│   ├── offlineBagPlayer.cpp
│   └── solver.cpp
└── srv/
    ├── detection.srv
    ├── save_estimation_result.srv
    └── save_map.srv

================================================
FILE CONTENTS
================================================

================================================
FILE: .clang-format
================================================
---
Language:        Cpp
# BasedOnStyle:  Google
AccessModifierOffset: -1
AlignAfterOpenBracket: Align
AlignConsecutiveAssignments: true
AlignConsecutiveDeclarations: false
AlignEscapedNewlines: Left
AlignOperands:   true
AlignTrailingComments: true
AllowAllParametersOfDeclarationOnNextLine: true
AllowShortBlocksOnASingleLine: false
AllowShortCaseLabelsOnASingleLine: false
AllowShortFunctionsOnASingleLine: All
AllowShortIfStatementsOnASingleLine: true
AllowShortLoopsOnASingleLine: true
AlwaysBreakAfterDefinitionReturnType: None
AlwaysBreakAfterReturnType: None
AlwaysBreakBeforeMultilineStrings: true
AlwaysBreakTemplateDeclarations: true
BinPackArguments: true
BinPackParameters: true
BraceWrapping:   
  AfterClass:      false
  AfterControlStatement: false
  AfterEnum:       false
  AfterFunction:   false
  AfterNamespace:  false
  AfterObjCDeclaration: false
  AfterStruct:     false
  AfterUnion:      false
  AfterExternBlock: false
  BeforeCatch:     false
  BeforeElse:      false
  IndentBraces:    false
  SplitEmptyFunction: true
  SplitEmptyRecord: true
  SplitEmptyNamespace: true
BreakBeforeBinaryOperators: None
BreakBeforeBraces: Attach
BreakBeforeInheritanceComma: false
BreakBeforeTernaryOperators: true
BreakConstructorInitializersBeforeComma: false
BreakConstructorInitializers: BeforeColon
BreakAfterJavaFieldAnnotations: false
BreakStringLiterals: true
ColumnLimit:     0
CommentPragmas:  '^ IWYU pragma:'
CompactNamespaces: false
ConstructorInitializerAllOnOneLineOrOnePerLine: true
ConstructorInitializerIndentWidth: 4
ContinuationIndentWidth: 4
Cpp11BracedListStyle: true
DerivePointerAlignment: true
DisableFormat:   false
ExperimentalAutoDetectBinPacking: false
FixNamespaceComments: true
ForEachMacros:   
  - foreach
  - Q_FOREACH
  - BOOST_FOREACH
IncludeBlocks:   Preserve
IncludeCategories: 
  - Regex:           '^<ext/.*\.h>'
    Priority:        2
  - Regex:           '^<.*\.h>'
    Priority:        1
  - Regex:           '^<.*'
    Priority:        2
  - Regex:           '.*'
    Priority:        3
IncludeIsMainRegex: '([-_](test|unittest))?$'
IndentCaseLabels: true
IndentPPDirectives: None
IndentWidth:     2
IndentWrappedFunctionNames: false
JavaScriptQuotes: Leave
JavaScriptWrapImports: true
KeepEmptyLinesAtTheStartOfBlocks: false
MacroBlockBegin: ''
MacroBlockEnd:   ''
MaxEmptyLinesToKeep: 1
NamespaceIndentation: None
ObjCBlockIndentWidth: 2
ObjCSpaceAfterProperty: false
ObjCSpaceBeforeProtocolList: false
PenaltyBreakAssignment: 2
PenaltyBreakBeforeFirstCallParameter: 1
PenaltyBreakComment: 300
PenaltyBreakFirstLessLess: 120
PenaltyBreakString: 1000
PenaltyExcessCharacter: 1000000
PenaltyReturnTypeOnItsOwnLine: 200
PointerAlignment: Left
RawStringFormats: 
  - Delimiters:       [pb]
    Language:        TextProto
    BasedOnStyle:    google
ReflowComments:  true
SortIncludes:    true
SortUsingDeclarations: true
SpaceAfterCStyleCast: false
SpaceAfterTemplateKeyword: true
SpaceBeforeAssignmentOperators: true
SpaceBeforeParens: ControlStatements
SpaceInEmptyParentheses: false
SpacesBeforeTrailingComments: 2
SpacesInAngles:  false
SpacesInContainerLiterals: true
SpacesInCStyleCastParentheses: false
SpacesInParentheses: false
SpacesInSquareBrackets: false
Standard:        Auto
TabWidth:        8
UseTab:          Never
...



================================================
FILE: .github/stale.yml
================================================
# Number of days of inactivity before an issue becomes stale
daysUntilStale: 14
# Number of days of inactivity before a stale issue is closed
daysUntilClose: 1
# Issues with these labels will never be considered stale
exemptLabels:
  - pinned
  - security
# Label to use when marking an issue as stale
staleLabel: stale
# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: >
  This issue has been automatically marked as stale because it has not had
  recent activity. It will be closed if no further activity occurs. Thank you
  for your contributions.
# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: false


================================================
FILE: .gitignore
================================================
devel/
logs/
build/
bin/
lib/
msg_gen/
srv_gen/
msg/*Action.msg
msg/*ActionFeedback.msg
msg/*ActionGoal.msg
msg/*ActionResult.msg
msg/*Feedback.msg
msg/*Goal.msg
msg/*Result.msg
msg/_*.py
build_isolated/
devel_isolated/

# Generated by dynamic reconfigure
*.cfgc
/cfg/cpp/
/cfg/*.py

# Ignore generated docs
*.dox
*.wikidoc

# eclipse stuff
.project
.cproject

# qcreator stuff
CMakeLists.txt.user

srv/_*.py
*.pcd
*.pyc
qtcreator-*
*.user

/planning/cfg
/planning/docs
/planning/src

*~

# Emacs
.#*

# Catkin custom files
CATKIN_IGNORE

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
#  Usually these files are written by a python script from a template
#  before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
#   For a library or package, you might want to ignore these files since the code is
#   intended to run in multiple environments; otherwise, check them in:
# .python-version

# pipenv
#   According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
#   However, in case of collaboration, if having platform-specific dependencies or dependencies
#   having no cross-platform support, pipenv may install dependencies that don't work, or not
#   install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# pytype static type analyzer
.pytype/

# Cython debug symbols
cython_debug/

# PyCharm
#  JetBrains specific template is maintainted in a separate JetBrains.gitignore that can
#  be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
#  and can be added to the global gitignore or merged into this file.  For a more nuclear
#  option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/

# VSCode
.vscode

================================================
FILE: CITATION.bib
================================================
@article{lin2023lio-segmot,
  title={Asynchronous State Estimation of Simultaneous Ego-motion Estimation and Multiple Object Tracking for LiDAR-Inertial Odometry},
  author={Lin, Yu-Kai and Lin, Wen-Chieh and Wang, Chieh-Chih},
  booktitle = {2023 International Conference on Robotics and Automation, {ICRA} 2023,
               London, UK, May 2023},
  pages     = {10616--10622},
  year      = {2023},
}

================================================
FILE: CMakeLists.txt
================================================
cmake_minimum_required(VERSION 2.8.3)
project(lio_segmot)

set(CMAKE_BUILD_TYPE "Release")
set(CMAKE_CXX_FLAGS "-std=c++11")
set(CMAKE_CXX_FLAGS_RELEASE "-O3 -Wall -g0 -pthread")

find_package(catkin REQUIRED COMPONENTS
  tf
  roscpp
  rospy
  cv_bridge
  rosbag
  # pcl library
  pcl_conversions
  # msgs
  std_msgs
  sensor_msgs
  geometry_msgs
  nav_msgs
  message_generation
  visualization_msgs
  jsk_recognition_msgs
  jsk_topic_tools
)

find_package(OpenMP REQUIRED)
find_package(PCL REQUIRED QUIET)
find_package(OpenCV REQUIRED QUIET)
find_package(GTSAM REQUIRED QUIET)

add_message_files(
  DIRECTORY msg
  FILES
  cloud_info.msg
  ObjectState.msg
  ObjectStateArray.msg
  Diagnosis.msg
  flags.msg
)

add_service_files(
  DIRECTORY srv
  FILES
  save_map.srv
  detection.srv
  save_estimation_result.srv
)

generate_messages(
  DEPENDENCIES
  geometry_msgs
  std_msgs
  nav_msgs
  sensor_msgs
  jsk_recognition_msgs
)

catkin_package(
  INCLUDE_DIRS include
  DEPENDS PCL GTSAM

  CATKIN_DEPENDS
  std_msgs
  nav_msgs
  geometry_msgs
  sensor_msgs
  message_runtime
  message_generation
  visualization_msgs
  jsk_recognition_msgs
  jsk_topic_tools
)

# include directories
include_directories(
	include
	${catkin_INCLUDE_DIRS}
	${PCL_INCLUDE_DIRS}
  ${OpenCV_INCLUDE_DIRS}
	${GTSAM_INCLUDE_DIR}
)

# link directories
link_directories(
	include
	${PCL_LIBRARY_DIRS}
  ${OpenCV_LIBRARY_DIRS}
  ${GTSAM_LIBRARY_DIRS}
)

###########
## Build ##
###########

# Range Image Projection
add_executable(${PROJECT_NAME}_imageProjection src/imageProjection.cpp)
add_dependencies(${PROJECT_NAME}_imageProjection ${catkin_EXPORTED_TARGETS} ${PROJECT_NAME}_generate_messages_cpp)
target_link_libraries(${PROJECT_NAME}_imageProjection ${catkin_LIBRARIES} ${PCL_LIBRARIES} ${OpenCV_LIBRARIES})

# Feature Association
add_executable(${PROJECT_NAME}_featureExtraction src/featureExtraction.cpp)
add_dependencies(${PROJECT_NAME}_featureExtraction ${catkin_EXPORTED_TARGETS} ${PROJECT_NAME}_generate_messages_cpp)
target_link_libraries(${PROJECT_NAME}_featureExtraction ${catkin_LIBRARIES} ${PCL_LIBRARIES} ${OpenCV_LIBRARIES})

# Mapping Optimization
add_executable(${PROJECT_NAME}_mapOptimization src/mapOptimization.cpp src/factor.cpp src/solver.cpp)
add_dependencies(${PROJECT_NAME}_mapOptimization ${catkin_EXPORTED_TARGETS} ${PROJECT_NAME}_generate_messages_cpp)
target_compile_options(${PROJECT_NAME}_mapOptimization PRIVATE ${OpenMP_CXX_FLAGS})
target_link_libraries(${PROJECT_NAME}_mapOptimization ${catkin_LIBRARIES} ${PCL_LIBRARIES} ${OpenCV_LIBRARIES} ${OpenMP_CXX_FLAGS} gtsam)

# IMU Preintegration
add_executable(${PROJECT_NAME}_imuPreintegration src/imuPreintegration.cpp)
target_link_libraries(${PROJECT_NAME}_imuPreintegration ${catkin_LIBRARIES} ${PCL_LIBRARIES} ${OpenCV_LIBRARIES} gtsam)

# Offline ROSBag Player
add_executable(${PROJECT_NAME}_offlineBagPlayer src/offlineBagPlayer.cpp)
target_link_libraries(${PROJECT_NAME}_offlineBagPlayer ${catkin_LIBRARIES} ${PCL_LIBRARIES} ${OpenCV_LIBRARIES} gtsam)


================================================
FILE: LICENSE
================================================
BSD 3-Clause License

Copyright (c) 2020, Tixiao Shan
Copyright (c) 2022-2023, Yu-Kai Lin
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

- Redistributions of source code must retain the above copyright notice, this
  list of conditions and the following disclaimer.

- Redistributions in binary form must reproduce the above copyright notice,
  this list of conditions and the following disclaimer in the documentation
  and/or other materials provided with the distribution.

- Neither the name of the copyright holder nor the names of its
  contributors may be used to endorse or promote products derived from
  this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.


================================================
FILE: README.md
================================================
# LIO-SEGMOT

The official implementation of LIO-SEGMOT (**L**iDAR-**I**nertial **O**dometry
via **S**imultaneous **Eg**o-motion Estimation and **M**ultiple **O**bject
**T**racking), an optimization-based odometry approach targeted for dynamic
environments. LIO-SEGMOT can provide continuous object tracking results while
preserving the keyframe selection mechanism in the odometry system. This work is
accepted for publication in [ICRA 2023](https://www.icra2023.org/).

![TEASER_GIF](./docs/_static/images/LIO-SEGMOT_KITTI_teaser.GIF)

You can check out [our video](https://youtu.be/5HtnDFPerVo) to understand the
main idea of LIO-SEGMOT. For more, please refer to our paper:

- Yu-Kai Lin, Wen-Chieh Lin, Chieh-Chih Wang, **Asynchronous State Estimation of Simultaneous Ego-motion Estimation and Multiple Object Tracking for LiDAR-Inertial Odometry**. _2023 International Conference on Robotics and Automation (ICRA)_, pp. 10616--10622, May 2023. ([paper](https://doi.org/10.1109/ICRA48891.2023.10161269)) ([preprint](https://gpl.cs.nctu.edu.tw/Steve-Lin/LIO-SEGMOT/preprint.pdf)) ([code](https://github.com/StephLin/LIO-SEGMOT)) ([video](https://youtu.be/5HtnDFPerVo))

If you use this project in your research, please cite:

```bibtex
@article{lin2023lio-segmot,
  title={Asynchronous State Estimation of Simultaneous Ego-motion Estimation and Multiple Object Tracking for LiDAR-Inertial Odometry},
  author={Lin, Yu-Kai and Lin, Wen-Chieh and Wang, Chieh-Chih},
  booktitle = {2023 International Conference on Robotics and Automation, {ICRA} 2023,
               London, UK, May 2023},
  pages     = {10616--10622},
  year      = {2023},
}
```

- [:newspaper: Poster](#newspaper-poster)
- [:gear: Installation](#gear-installation)
  - [Step 1. Preparing the Dependencies](#step-1-preparing-the-dependencies)
  - [Step 2. Building the LIO-SEGMOT Project](#step-2-building-the-lio-segmot-project)
  - [Step 3. Preparing Object Detection Services](#step-3-preparing-object-detection-services)
- [:card_file_box: Sample Datasets](#card_file_box-sample-datasets)
- [:running_man: Run](#running_man-run)
- [:wheelchair: Services of LIO-SEGMOT](#wheelchair-services-of-lio-segmot)
  - [`/lio_segmot/save_map`](#lio_segmotsave_map)
  - [`/lio_segmot/save_estimation_result`](#lio_segmotsave_estimation_result)
- [:memo: Remarks](#memo-remarks)
  - [Hyperparameters](#hyperparameters)
    - [Hierarchical Criterion](#hierarchical-criterion)
    - [Factor Graph Optimization](#factor-graph-optimization)
    - [High-speed Moving Object Supports in Early Steps](#high-speed-moving-object-supports-in-early-steps)
    - [Lifecycle Management of Tracking Objects](#lifecycle-management-of-tracking-objects)
  - [Limitations of LIO-SEGMOT](#limitations-of-lio-segmot)
  - [Possible Future Research Directions](#possible-future-research-directions)
- [:gift: Acknowledgement](#gift-acknowledgement)

## :newspaper: Poster

![Poster](./docs/_static/images/poster.png)

## :gear: Installation

The project is originally developed in **Ubuntu 18.04**, and the following
instruction supposes that you are using Ubuntu 18.04 as well. I am not sure if
it also works with other Ubuntu versions or other Linux distributions, but maybe
you can give it a try :+1:

Also, please feel free to open an issue if you encounter any problems of the
following instruction.

### Step 1. Preparing the Dependencies

Please prepare the following packages or libraries used in LIO-SEGMOT:

1. [ROS Melodic](http://wiki.ros.org/melodic/Installation/Ubuntu) (w/ Desktop-Full
   Install) and the following dependencies:
   ```bash
   #!/bin/bash
   sudo apt update
   sudo apt install -y "ros-${ROS_DISTRO}-navigation" \
                       "ros-${ROS_DISTRO}-robot-localization" \
                       "ros-${ROS_DISTRO}-robot-state-publisher" \
                       "ros-${ROS_DISTRO}-jsk-recognition-msgs" \
                       "ros-${ROS_DISTRO}-jsk-rviz-plugins"
   ```
2. [gtsam 4.0.3](https://github.com/borglab/gtsam/tree/4.0.3)
   ```bash
   #!/bin/bash
   cd ~
   git clone -b 4.0.3 https://github.com/borglab/gtsam && cd gtsam
   mkdir build && cd build
   cmake ..
   sudo make install
   ```

### Step 2. Building the LIO-SEGMOT Project

You can use the following command to build the project:

```bash
#!/bin/bash
source "/opt/ros/${ROS_DISTRO}/setup.bash"
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
git clone git@github.com:StephLin/LIO-SEGMOT.git
cd ..
catkin_make
```

### Step 3. Preparing Object Detection Services

We provide two object detection services for LIO-SEGMOT:

- [StephLin/SE-SSD-ROS (Apache-2.0 license)](https://github.com/StephLin/SE-SSD-ROS)
  (based on [Vegeta2020/SE-SSD](https://github.com/Vegeta2020/SE-SSD))
- [StephLin/livox_detection_lio_segmot (GPL-3.0 license)](https://github.com/StephLin/livox_detection_lio_segmot)
  (based on [Livox-SDK/livox_detection](https://github.com/Livox-SDK/livox_detection))

Please refer to their installation instructions accordingly.

## :card_file_box: Sample Datasets

We provide the following pre-built bag files for KITTI raw sequences and the Hsinchu
dataset (GuangfuRoad sequence):

| Dataset | Sequence    | Bag File                                             | Ground Truth Trajectory                              |
| ------- | ----------- | ---------------------------------------------------- | ---------------------------------------------------- |
| KITTI   | 0926-0009   | [bag](http://140.113.150.180:5000/sharing/BrHtqyElq) | [tum](http://140.113.150.180:5000/sharing/HGebyDSCR) |
| KITTI   | 0926-0013   | [bag](http://140.113.150.180:5000/sharing/AQQa3kMSH) | [tum](http://140.113.150.180:5000/sharing/gREpr4xdI) |
| KITTI   | 0926-0014   | [bag](http://140.113.150.180:5000/sharing/HOgV5T79H) | [tum](http://140.113.150.180:5000/sharing/POuFRJBQI) |
| KITTI   | 0926-0015   | [bag](http://140.113.150.180:5000/sharing/XnoNLKSUQ) | [tum](http://140.113.150.180:5000/sharing/RBw1BeftU) |
| KITTI   | 0926-0032   | [bag](http://140.113.150.180:5000/sharing/ikbtkpWve) | [tum](http://140.113.150.180:5000/sharing/aQdaEnVjc) |
| KITTI   | 0926-0051   | [bag](http://140.113.150.180:5000/sharing/N1o9NcgU4) | [tum](http://140.113.150.180:5000/sharing/Wzu8QWoEC) |
| KITTI   | 0926-0101   | [bag](http://140.113.150.180:5000/sharing/lhhohwfJT) | [tum](http://140.113.150.180:5000/sharing/JCOaJHw04) |
| Hsinchu | GuangfuRoad | [bag](http://gofile.me/56KxB/mz6HOYOMG)              | [tum](http://gofile.me/56KxB/t5smaOAZk)              |

If you cannot download sample datasets from above links, please refer to
[this alternative link (Google Drive)](https://drive.google.com/drive/folders/17QwfTMCv-2XdgOLxGdrnCwybcSYQhs7S?usp=drive_link).

Ground truth robot trajectories (based on GPS data provided by KITTI) are stored
as [the TUM format](https://vision.in.tum.de/data/datasets/rgbd-dataset/file_formats#ground-truth_trajectories).
Each row has 8 components containing timestamps (sec), xyz-position (meter), and
xyzw-orientation (quaternion):

```
timestamp x y z qx qy qz qw
```

## :running_man: Run

Please follow the steps to execute LIO-SEGMOT properly:

1. (Optional) Launch the ROS core:

   ```bash
   roscore
   ```

2. Launch the core LIO-SEGMOT service:

   ```bash
   #!/bin/bash
   # Please select one of the following configs to launch the service properly:
   # 1. With KITTI configuration
   roslaunch lio_segmot run_kitti.launch

   # 2. With Hsinchu configuration
   roslaunch lio_segmot run_hsinchu.launch

   # 3. Undefined (same as KITTI configuration)
   roslaunch lio_segmot run.launch
   ```

3. Launch the selected object detection service:

   ```bash
   #!/bin/bash
   # SE-SSD-ROS & livox_detection_lio_segmot
   # Please check their documentation to see how they are launched
   ```

4. Start the customized ROS bag player:

   ```bash
   #!/bin/bash
   rosrun lio_segmot lio_segmot_offlineBagPlayer _bag_filename:="path/to/your/sequence.bag"
   ```

   The default registered LiDAR and IMU topics are `/points_raw` and `/imu_raw`,
   respectively. If you want to register other LiDAR/IMU topics, please add
   additional options `_lidar_topic` and `_imu_topic`. For example, if you are
   using the GuangfuRoad sequence (`/velodyne_points` and `/imu/data` for LiDAR
   and IMU topics, respectively):

   ```bash
   rosrun lio_segmot lio_segmot_offlineBagPlayer _bag_filename:="GuangfuRoad-06-13.bag" \
                                                 _lidar_topic:="/velodyne_points" \
                                                 _imu_topic:="/imu/data"
   ```

## :wheelchair: Services of LIO-SEGMOT

### `/lio_segmot/save_map`

```txt
Usage: rosservice call /lio_segmot/save_map [RESOLUTION] [OUTPUT_DIR]
Example: rosservice call /lio_segmot/save_map 0.2 /path/to/a/directory/
```

This service saves LiDAR map to the local machine.

### `/lio_segmot/save_estimation_result`

```txt
Usage: rosservice call /lio_segmot/save_estimation_result
```

This service outputs current estimation results including

- `nav_msgs::Path robotTrajectory`: The robot trajectory
- $\color{gray}\textsf{(INTERNAL USE)}$ `nav_msgs::Path[] objectTrajectories`: Trajectories for each object (indexed by the factor graph)
- $\color{gray}\textsf{(INTERNAL USE)}$ `nav_msgs::Path[] objectVelocities`: Linear and angular velocities for each object (indexed by the factor graph)
- `nav_msgs::Path[] trackingObjectTrajectories`: Trajectories for each object (indexed by LIO-SEGMOT)
- `nav_msgs::Path[] trackingObjectVelocities`: Linear and angular velocities for each object (indexed by LIO-SEGMOT)
- `lio_segmot::ObjectStateArray[] trackingObjectStates`: States for each object during its lifetime (indexed by LIO-SEGMOT)
- $\color{gray}\textsf{(INTERNAL USE)}$ `lio_segmot::flags[] objectFlags`: Flags for each object during its lifetime (indexed by the factor graph)
- `lio_segmot::flags[] trackingObjectFlags`: Flags for each object during its lifetime (indexed by LIO-SEGMOT)

in which custom types `lio_segmot::ObjectStateArray` (underlying `lio_segmot::ObjectState`) and `lio_segmot::flags` are given by

- `lio_segmot::ObjectStateArray`

  ```cpp
  Header header
  lio_segmot::ObjectState[] objects
  ```

- `lio_segmot::ObjectState`

  ```cpp
  Header header

  // The corresponding detection (measurement)
  jsk_recognition_msgs::BoundingBox detection

  // States of object pose and velocity in the factor graph
  geometry_msgs::Pose pose
  geometry_msgs::Pose velocity

  // Residual and innovation of the tightly-coupled detection factor
  bool hasTightlyCoupledDetectionError
  float64 tightlyCoupledDetectionError         // Residual
  float64 initialTightlyCoupledDetectionError  // Innovation

  // Residual and innovation of the loosely-coupled detection factor
  bool hasLooselyCoupledDetectionError
  float64 looselyCoupledDetectionError         // Residual
  float64 initialLooselyCoupledDetectionError  // Innovation

  // Residual and innovation of the smooth movement factor
  bool hasMotionError
  float64 motionError         // Residual
  float64 initialMotionError  // Innovation

  int32 index            // Object index
  int32 lostCount        // Counter of losing detections
  float64 confidence     // Detection's confidence score (given by detection methods)
  bool isTightlyCoupled  // Is the object tightly-coupled at this moment?
  bool isFirst           // Is the object just initialized at this moment?
  ```

- `lio_segmot/flags`

  ```cpp
  // Flags of the object in its lifetime
  int32[] flags  // 1: the object is tightly-coupled
                 // 0: the object is loosely-coupled
  ```

## :memo: Remarks

### Hyperparameters

There are various covariance matrices designed for the hirarchical criterion
(innovation filtering) and factor graph optimization (increments of LIO-SEGMOT
w.r.t. LIO-SAM) in LIO-SEGMOT. This section is going to explain them.

> All covariance matrices in settings are expressed as diagonal vectors.

Taking [the KITTI configuration](./config/params_kitti.yaml) for example, we have the following settings:

#### Hierarchical Criterion

This section collects settings for the hierarchical criterion. It can be viewed
as a kind of innovation filtering. In brief, the criterion is designed to
progressively make the following decisions when a new detection
$\boldsymbol{z}\in SE(3)$ is coming into the system:

| ID       | Description                                                                 |
| -------- | --------------------------------------------------------------------------- |
| **(Q1)** | Does the detection belong to any existing object $\boldsymbol{x}_{t,i}$?    |
| **(Q2)** | If Q1 holds, does $\boldsymbol{z}$ follows the $i$-th object's motion?      |
| **(Q3)** | If Q1 and Q2 holds, should the tightly-coupled detection factor be applied? |

The first two questions **(Q1)** and **(Q2)** are determined by using the
Mahalanobis distance of the error vector,

$$
\Big\Vert\text{ detection error of }\boldsymbol{z}\text{ and the }i\text{-th object } \boldsymbol{x} _{t,i}\text{ }\Big\Vert _{\Sigma}
\leq \varepsilon,
$$

with given covariance with given covariance matrices
$\Sigma\in\{\Sigma_ {\text{Q}_ 1},\Sigma_ {\text{Q}_ 2}\}\subsetneq\mathbb{R}^{6\times 6}$
and a threshold $\varepsilon>0$. We assume that
$\Sigma_ {\text{Q}_ 2}-\Sigma_ {\text{Q}_ 1}$
is positive semidefinite (PSD), i.e.,
$\Sigma_ {\text{Q}_ 2}-\Sigma_{\text{Q}_ 1} \succeq 0$,
to prevent ambiguity of the hierarchical criterion that **(Q2)** holds but
**(Q1)** does not hold.

Two spatial information-based tests are conducted to determine **(Q3)**, which
are the detection constraint and the velocity constraint:

- **(Detection Constraint)** The above equation holds with another given
  covariance matrix
  $\Sigma_ {\text{Q}_ {3,1}}$
  that satisfies
  $\Sigma_ {\text{Q}_ {3,1}}-\Sigma_ {\text{Q}_ {2}} \succeq 0$.

- **(Velocity constraint)** The variance of velocities in previous steps is
  small enough. That is,

  $$\frac{1}{N}\sum_ {s=1}^{N} \Big\Vert \text{Log}(\boldsymbol{v}_ {t-s,i}) - \text{Log}(\bar{\boldsymbol{v}}_ {t,i}) \Big\Vert_ {\Sigma_{Q_ {3,2}}}^2 \leq \varepsilon$$

  with a given covariance matrix $\Sigma_{Q_{3,2}}$, where $N$ is the fixed
  number of previous velocities of object states and
  $\bar{\boldsymbol{v}}_{t,i}\in SE(3)$ is the mean of the $N$ previous
  velocities.

If **(Q1)** holds for the detection $\boldsymbol{z}$ and the corresponding
$i$-th object, the new state of the $i$-th object along with a loosely-coupled
detection factor would be added to the factor graph.

Furthermore, if **(Q2)**
holds, a constant velocity factor and a smooth movement factor would be also
added to the factor graph.

Finally, if **(Q3)** holds, the loosely-coupled
detection factor would be replaced with a tightly-coupled detection factor. It
means that the $i$-th object are regarded as a reliable object that are suitable
to refine the odometry.

| Notation                           | Setting                                             | Description                                                                                                                                                                                                                                   | Default Value                                      |
| ---------------------------------- | --------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------------------------------------------------- |
| $\varepsilon$                      | `detectionMatchThreshold`                           | The threshold to classify all Mahalanobis distances in the hirarchical criterion (except for the tightly-coupled detection factor).                                                                                                           | 19.5                                               |
| $\Sigma_{\text{Q}_1}$              | `dataAssociationVarianceVector`                     | The covariance matrix to determine if a detection belongs to a given object. This covariance is used to maintain tracking ID.                                                                                                                 | `[3.0e-4, 3.0e-4, 3.0e-4, 5.0e-2, 3.0e-2, 3.0e-2]` |
| $\Sigma_{\text{Q}_2}$              | `looselyCoupledMatchingVarianceVector`              | The covariance matrix to determine if a detection follows the object's motion.                                                                                                                                                                | `[1.0e-4, 1.0e-4, 1.0e-4, 2.0e-3, 2.0e-3, 2.0e-3]` |
| $\varepsilon^\prime$               | `tightCouplingDetectionErrorThreshold`              | The threshold to classify the Mahalanobis distance in the detection constraint of the tightly-coupled detection factor checks.                                                                                                                | 26.0                                               |
| $\Sigma_{\text{Q}_{3,1}}^\prime$   | `tightlyCoupledMatchingVarianceVector`              | The covariance to determine if a detection satisfies the detection constraint in the tightly-coupled detection factor checks.                                                                                                                 | `[8.0e-6, 8.0e-6, 8.0e-6, 1.0e-4, 1.0e-4, 1.0e-4]` |
| $N$                                | `numberOfVelocityConsistencySteps`                  | The number of samples used in velocity constraint of the tightly-coupled detection checks.                                                                                                                                                    | 4                                                  |
| $N^\prime$                         | `numberOfPreLooseCouplingSteps`                     | The number of steps that objects should only use loosely-coupled detection factors (due to velocity constraint of the tightly-coupled detection checks, see [below](#high-speed-moving-object-supports-in-early-steps) for more information). | 6                                                  |
| $\sigma_{\text{Q}_{3,2}}^\text{A}$ | `objectAngularVelocityConsistencyVarianceThreshold` | The angular part of the covariance matrix to determine if the object satisfies the velocity constraint in the tightly-coupled detection factor checks.                                                                                        | 1.0e-5                                             |
| $\sigma_{\text{Q}_{3,2}}^\text{L}$ | `objectLinearVelocityConsistencyVarianceThreshold`  | The linear part of the covariance matrix to determine if the object satisfies the velocity constraint in the tightly-coupled detection factor checks.                                                                                         | 1.0e-2                                             |

For engineering purposes, we use two thresholds $\varepsilon$ and
$\varepsilon^\prime$ in the implementation. In addition, we decouple the angular
part and the linear part of $\Sigma_{\text{Q}_{3,2}}$. The following equations
are shown to coincide with the expression used in our paper:

$$
\begin{aligned}
\displaystyle\Sigma _{\text{Q} _{3,1}} &= \left(\frac{\varepsilon^\prime}{\varepsilon}\right)^2 \cdot \displaystyle\Sigma _{\text{Q} _{3,1}}^\prime, \\
\displaystyle\Sigma _{\text{Q} _{3,2}}^\prime &= \begin{bmatrix}\sigma _{\text{Q} _{3,2}}^\text{A} \\
& \sigma _{\text{Q} _{3,2}}^\text{A} \\
&& \sigma _{\text{Q} _{3,2}}^\text{A} \\
&&& \sigma _{\text{Q} _{3,2}}^\text{L} \\
&&&& \sigma _{\text{Q} _{3,2}}^\text{L} \\
&&&&& \sigma _{\text{Q} _{3,2}}^\text{L}
\end{bmatrix}, \\
\displaystyle\Sigma _{\text{Q} _{3,2}} &= \frac{1}{\varepsilon^2} \cdot \displaystyle\Sigma _{\text{Q} _{3,2}}^\prime.
\end{aligned}
$$

#### Factor Graph Optimization

Covariance matrices used in factor graph optimization. Different from the above
section, they are essential to "balance" different types of measurements in a
unified factor graph. Those matrices are relatively rare to be modified.

| Notation             | Setting                                  | Description                                                                                   | Default Value                                      |
| -------------------- | ---------------------------------------- | --------------------------------------------------------------------------------------------- | -------------------------------------------------- |
| $\Sigma_{\text{C}}$  | `constantVelocityDiagonalVarianceVector` | The covariance matrix of constant velocity factors used in factor graph optimization.         | `[2.0e-4, 2.0e-4, 1.0e-3, 2.0e-1, 1.0e-1, 1.0e-1]` |
| $\Sigma_{\text{M}}$  | `motionDiagonalVarianceVector`           | The covariance matrix of smooth movement factors used in factor graph optimization.           | `[2.0e-4, 2.0e-4, 1.0e-3, 1.0e-1, 1.0e-2, 1.0e-2]` |
| $\Sigma_{\text{LC}}$ | `looselyCoupledDetectionVarianceVector`  | The covariance matrix of loosely-coupled detection factors used in factor graph optimization. | `[2.0e-4, 2.0e-4, 2.0e-4, 1.5e-3, 1.5e-3, 1.5e-3]` |
| $\Sigma_{\text{TC}}$ | `tightlyCoupledDetectionVarianceVector`  | The covariance matrix of tightly-coupled detection factors used in factor graph optimization. | `[2.0e-4, 2.0e-4, 2.0e-4, 1.5e-3, 1.5e-3, 1.5e-3]` |

#### High-speed Moving Object Supports in Early Steps

As all tracking objects' velocities are initialized with zero-speed, it may be
hard to associate detections in different moments for high-speed moving objects
in the early stage. To mitigate this issue, a larger covariance matrix for
**(Q2)** in the hierarchical criterion is used in the first few steps to
accommodate objects that are moving fast.

Since those steps are used to figure out the inital speed of an object, we do
not account them for the velocity constraint in the tightly-coupled detection
factor checks. Therefore, we have $N^\prime = N + N^\text{E}$.

| Notation                       | Setting                                       | Description                                                                                   | Default Value                                      |
| ------------------------------ | --------------------------------------------- | --------------------------------------------------------------------------------------------- | -------------------------------------------------- |
| $N^\text{E}$                   | `numberOfEarlySteps`                          | Number of the first steps to accommodate high-speed moving objects.                           | 2                                                  |
| $\Sigma_{\text{Q}_2}^\text{E}$ | `earlyLooselyCoupledMatchingVarianceVector`   | The covariance matrix to determine if a detection follows the object's motion.                | `[3.0e-4, 3.0e-4, 3.0e-4, 5.0e-2, 5.0e-3, 5.0e-3]` |
| $\Sigma_{\text{C}}^\text{E}$   | `earlyConstantVelocityDiagonalVarianceVector` | The covariance matrix of loosely-coupled detection factors used in factor graph optimization. | `[2.0e-4, 2.0e-4, 1.0e-3, 2.0e-1, 1.0e-1, 1.0e-1]` |

#### Lifecycle Management of Tracking Objects

In real world applications, it's likely to lose detections of tracking objects
due to occlusion or other complicate environment circumstances. To mitigate
frequent ID-switching in multiple object tracking due to the above issue, we
still track each object for a little while, even though they do not have any
corresponding detections.

| Notation     | Setting                      | Description                                                                                | Default Value |
| ------------ | ---------------------------- | ------------------------------------------------------------------------------------------ | ------------- |
| $N^\text{L}$ | `trackingStepsForLostObject` | Number of steps that LIO-SEGMOT still keeps an object of missing detections in the system. | 3             |

### Limitations of LIO-SEGMOT

Currently, most state-of-the-art object detection approaches (e.g.,
[SE-SSD](https://arxiv.org/abs/2104.09804),
[PointPillars](https://arxiv.org/abs/1812.05784),
[PointRCNN](https://arxiv.org/abs/1812.04244),
[PV-RCNN](https://arxiv.org/abs/1912.13192),
[SPG](https://arxiv.org/abs/2108.06709), and
[ST3D](https://arxiv.org/abs/2103.05346)) are constructed under machine
learning-based neural network architectures, while the issue of domain
adaptation for data-driven object detection approaches still remains a
challenging open problem targeted by recent researches (e.g,
[SPG](https://arxiv.org/abs/2108.06709) and
[ST3D](https://arxiv.org/abs/2103.05346)). That is, the performance of object
detection models change varying under different geographic appearances or
weather conditions.

[The model weight of SE-SSD](https://github.com/Vegeta2020/SE-SSD) used in
LIO-SEGMOT is trained under a subset of the KITTI dataset, and thus it might
work well in other subsets of the KITTI dataset. However, we can observe that
there are numerous false positive detections in the Hsinchu dataset. Therefore,
it forces us to choose different detection models (i.e.,
[PointPillars w/ Livox's model weights](https://github.com/Livox-SDK/livox_detection))
when experimenting LIO-SEGMOT in different real world datasets. In addition,
since the model still cannot perform as good detection results in the Hsinchu
dataset as SE-SSD does in the KITTI dataset, we need to use a more strict
criterion for the detection constraint in the tightly-coupled detection factor
checks (by decreasing $\varepsilon^\prime$ from 26.0 to 19.0). This points out
the first limitation of LIO-SEGMOT. That is, covariance matrices and thresholds
related to object detections (mainly hyperparameters in the hierarchical
criterion) are required to be adjusted according to the stability of object
detections. Despite it affects generalization capability of the proposed method,
we believe that the problem can be mitigated with the breakthrough of the domain
adaptation for 3-D object detection.

The second limitation of LIO-SEGMOT is related to the motion model of tracking
objects. If an object does not move at constant velocity, LIO-SEGMOT may
miscalculate the object velocity, leading to inaccurate predicted object pose in
the subsequent state. The main reason is that objects' velocities are supposed
to be steady and constant in LIO-SEGMOT. It is possible to be resolved by
introducing multiple motion models to LIO-SEGMOT and adaptively selecting proper
models during the factor graph optimization, in which the concept is similar to
an interacting multiple model (IMM) in filtering-based object tracking
approaches.

### Possible Future Research Directions

There are two possible future research directions of LIO-SEGMOT:

- The optimization problem of LIO-SEGMOT produces a multi-robot architecturethat
  may break the efficiency of maintaining single root Bayes trees in iSAM2. It
  causes an unignorable computational cost, especially when there are lots of
  dynamic objects coupled in the system. In forthcoming researches, we would
  like to overcome the bottleneck by introducing the
  [multi-robot iSAM2 (MR-iSAM2)](http://doi.org/10.1109/iros51168.2021.9636687)
  algorithm.
- In addition, rule-based coupling conditions make the proposed method lack the
  ability to explore global optimality when considering combinatorial ambiguity
  as unknown integer variables. It raises a complicated mix-integer programming
  (MIP) problem, whereas a recent work called
  [multi-hypothesis iSAM (MH-iSAM2)](https://www.cs.cmu.edu/~kaess/pub/Hsiao19icra.pdf)
  still promotes us to investigate the challenging problem in the future.

## :gift: Acknowledgement

The project is mainly developed based on [Tixiao
Shan](https://github.com/TixiaoShan)'s excellent work
[LIO-SAM](https://github.com/TixiaoShan/LIO-SAM), which helps me a lot in
constructing LIO-SEGMOT. I would like to express my sincere thanks first. In
addition, I would like to thank [Wu Zheng](https://github.com/Vegeta2020) and
[Livox](https://github.com/Livox-SDK) for developing and releasing their awesome
object detection modules [SE-SSD](https://github.com/Vegeta2020/SE-SSD) and
[Livox Detection](https://github.com/Livox-SDK/livox_detection) respectively.


================================================
FILE: config/doc/kitti2bag/README.md
================================================
# kitti2bag

## How to run it?

```bash
wget https://s3.eu-central-1.amazonaws.com/avg-kitti/raw_data/2011_09_26_drive_0084/2011_09_26_drive_0084_sync.zip
wget https://s3.eu-central-1.amazonaws.com/avg-kitti/raw_data/2011_09_26_drive_0084/2011_09_26_drive_0084_extract.zip
wget https://s3.eu-central-1.amazonaws.com/avg-kitti/raw_data/2011_09_26_calib.zip
unzip 2011_09_26_drive_0084_sync.zip
unzip 2011_09_26_drive_0084_extract.zip
unzip 2011_09_26_calib.zip
python kitti2bag.py -t 2011_09_26 -r 0084 raw_synced .
```

That's it. You have a bag that contains your data.

Other source files can be found at [KITTI raw data](http://www.cvlibs.net/datasets/kitti/raw_data.php) page.

================================================
FILE: config/doc/kitti2bag/kitti2bag.py
================================================
#!env python
# -*- coding: utf-8 -*-

import sys

try:
    import pykitti
except ImportError as e:
    print('Could not load module \'pykitti\'. Please run `pip install pykitti`')
    sys.exit(1)

import argparse
import os
from datetime import datetime

import cv2
import numpy as np
import rosbag
import rospy
import sensor_msgs.point_cloud2 as pcl2
import tf
from cv_bridge import CvBridge
from geometry_msgs.msg import Transform, TransformStamped, TwistStamped
from sensor_msgs.msg import CameraInfo, Imu, NavSatFix, PointField
from std_msgs.msg import Header
from tf2_msgs.msg import TFMessage
from tqdm import tqdm


def save_imu_data(bag, kitti, imu_frame_id, topic):
    print("Exporting IMU")
    for timestamp, oxts in zip(kitti.timestamps, kitti.oxts):
        q = tf.transformations.quaternion_from_euler(oxts.packet.roll, oxts.packet.pitch, oxts.packet.yaw)
        imu = Imu()
        imu.header.frame_id = imu_frame_id
        imu.header.stamp = rospy.Time.from_sec(float(timestamp.strftime("%s.%f")))
        imu.orientation.x = q[0]
        imu.orientation.y = q[1]
        imu.orientation.z = q[2]
        imu.orientation.w = q[3]
        imu.linear_acceleration.x = oxts.packet.af
        imu.linear_acceleration.y = oxts.packet.al
        imu.linear_acceleration.z = oxts.packet.au
        imu.angular_velocity.x = oxts.packet.wf
        imu.angular_velocity.y = oxts.packet.wl
        imu.angular_velocity.z = oxts.packet.wu
        bag.write(topic, imu, t=imu.header.stamp)

def save_imu_data_raw(bag, kitti, imu_frame_id, topic):
    print("Exporting IMU Raw")
    synced_path = kitti.data_path
    unsynced_path = synced_path.replace('sync', 'extract')
    imu_path = os.path.join(unsynced_path, 'oxts')

    # read time stamp (convert to ros seconds format)
    ambiguous_index = None
    with open(os.path.join(imu_path, 'timestamps.txt')) as f:
        lines = f.readlines()
        imu_datetimes = []
        imu_status = []
        for line in lines:
            if len(line) == 1:
                continue
            timestamp = datetime.strptime(line[:-4], '%Y-%m-%d %H:%M:%S.%f')
            imu_datetime = float(timestamp.strftime('%s.%f'))

            if len(imu_datetimes) == 0:
                pass
            elif imu_datetimes[-1] > imu_datetime:
                if ambiguous_index is None:
                    ambiguous_index = np.argwhere(np.array(imu_datetimes) < imu_datetime).flatten()[-1] + 1

                for i in range(ambiguous_index, len(imu_datetimes)):
                    imu_status[i] = False

            if len(imu_datetimes) > 0 and imu_datetime - imu_datetimes[-1] >= 0.02:
                ambiguous_index = len(imu_datetimes)

            imu_datetimes.append(imu_datetime)
            imu_status.append(True)
    
    # # fix imu time using a linear model (may not be ideal, ^_^)
    # imu_index = np.asarray(range(len(imu_datetimes)), dtype=np.float64)
    # z = np.polyfit(imu_index, imu_datetimes, 1)
    # imu_datetimes_new = z[0] * imu_index + z[1]
    # imu_datetimes = imu_datetimes_new.tolist()
    # assert((np.diff(imu_datetimes) > 0).all())

    # get all imu data
    imu_data_dir = os.path.join(imu_path, 'data')
    imu_filenames = sorted(os.listdir(imu_data_dir))
    imu_data = [None] * len(imu_filenames)
    for i, imu_file in enumerate(imu_filenames):
        imu_data_file = open(os.path.join(imu_data_dir, imu_file), "r")
        for line in imu_data_file:
            if len(line) == 1:
                continue
            stripped_line = line.strip()
            line_list = stripped_line.split()
            imu_data[i] = line_list

    if len(imu_data) > len(imu_datetimes):
        imu_data = imu_data[:len(imu_datetimes)]

    assert len(imu_datetimes) == len(imu_data)
    print("Max IMU time interval: " + str(np.diff(np.array(imu_datetimes)[np.argwhere(imu_status).flatten()]).max()))

    for timestamp, data, status in zip(imu_datetimes, imu_data, imu_status):
        if not status:
            continue
        roll, pitch, yaw = float(data[3]), float(data[4]), float(data[5]), 
        q = tf.transformations.quaternion_from_euler(roll, pitch, yaw)
        imu = Imu()
        imu.header.frame_id = imu_frame_id
        imu.header.stamp = rospy.Time.from_sec(timestamp)
        imu.orientation.x = q[0]
        imu.orientation.y = q[1]
        imu.orientation.z = q[2]
        imu.orientation.w = q[3]
        imu.linear_acceleration.x = float(data[11])
        imu.linear_acceleration.y = float(data[12])
        imu.linear_acceleration.z = float(data[13])
        imu.angular_velocity.x = float(data[17])
        imu.angular_velocity.y = float(data[18])
        imu.angular_velocity.z = float(data[19])
        bag.write(topic, imu, t=imu.header.stamp)

        imu.header.frame_id = 'imu_enu_link'
        bag.write('/imu_correct', imu, t=imu.header.stamp) # for LIO-SAM GPS

def save_dynamic_tf(bag, kitti, kitti_type, initial_time):
    print("Exporting time dependent transformations")
    if kitti_type.find("raw") != -1:
        for timestamp, oxts in zip(kitti.timestamps, kitti.oxts):
            tf_oxts_msg = TFMessage()
            tf_oxts_transform = TransformStamped()
            tf_oxts_transform.header.stamp = rospy.Time.from_sec(float(timestamp.strftime("%s.%f")))
            tf_oxts_transform.header.frame_id = 'world'
            tf_oxts_transform.child_frame_id = 'base_link'

            transform = (oxts.T_w_imu)
            t = transform[0:3, 3]
            q = tf.transformations.quaternion_from_matrix(transform)
            oxts_tf = Transform()

            oxts_tf.translation.x = t[0]
            oxts_tf.translation.y = t[1]
            oxts_tf.translation.z = t[2]

            oxts_tf.rotation.x = q[0]
            oxts_tf.rotation.y = q[1]
            oxts_tf.rotation.z = q[2]
            oxts_tf.rotation.w = q[3]

            tf_oxts_transform.transform = oxts_tf
            tf_oxts_msg.transforms.append(tf_oxts_transform)

            bag.write('/tf', tf_oxts_msg, tf_oxts_msg.transforms[0].header.stamp)

    elif kitti_type.find("odom") != -1:
        timestamps = map(lambda x: initial_time + x.total_seconds(), kitti.timestamps)
        for timestamp, tf_matrix in zip(timestamps, kitti.T_w_cam0):
            tf_msg = TFMessage()
            tf_stamped = TransformStamped()
            tf_stamped.header.stamp = rospy.Time.from_sec(timestamp)
            tf_stamped.header.frame_id = 'world'
            tf_stamped.child_frame_id = 'camera_left'
            
            t = tf_matrix[0:3, 3]
            q = tf.transformations.quaternion_from_matrix(tf_matrix)
            transform = Transform()

            transform.translation.x = t[0]
            transform.translation.y = t[1]
            transform.translation.z = t[2]

            transform.rotation.x = q[0]
            transform.rotation.y = q[1]
            transform.rotation.z = q[2]
            transform.rotation.w = q[3]

            tf_stamped.transform = transform
            tf_msg.transforms.append(tf_stamped)

            bag.write('/tf', tf_msg, tf_msg.transforms[0].header.stamp)

def save_dynamic_tf_as_tum_file(filename, kitti, kitti_type, initial_time):
    print("Exporting time dependent transformations as the tum format")
    with open(filename, 'w') as file:
        if kitti_type.find("raw") != -1:
            for timestamp, oxts in zip(kitti.timestamps, kitti.oxts):
                tf_oxts_msg = TFMessage()
                tf_oxts_transform = TransformStamped()
                tf_oxts_transform.header.stamp = rospy.Time.from_sec(float(timestamp.strftime("%s.%f")))
                tf_oxts_transform.header.frame_id = 'world'
                tf_oxts_transform.child_frame_id = 'base_link'

                transform = (oxts.T_w_imu)
                t = transform[0:3, 3]
                q = tf.transformations.quaternion_from_matrix(transform)
                oxts_tf = Transform()

                oxts_tf.translation.x = t[0]
                oxts_tf.translation.y = t[1]
                oxts_tf.translation.z = t[2]

                oxts_tf.rotation.x = q[0]
                oxts_tf.rotation.y = q[1]
                oxts_tf.rotation.z = q[2]
                oxts_tf.rotation.w = q[3]

                tum_stamp = '%s %f %f %f %f %f %f %f\n'
                tum_stamp = tum_stamp % (timestamp.strftime('%s.%f'), t[0], t[1], t[2], q[0], q[1], q[2], q[3])
                file.write(tum_stamp)

        elif kitti_type.find("odom") != -1:
            timestamps = map(lambda x: initial_time + x.total_seconds(), kitti.timestamps)
            for timestamp, tf_matrix in zip(timestamps, kitti.T_w_cam0):
                tf_msg = TFMessage()
                tf_stamped = TransformStamped()
                tf_stamped.header.stamp = rospy.Time.from_sec(timestamp)
                tf_stamped.header.frame_id = 'world'
                tf_stamped.child_frame_id = 'camera_left'
                
                t = tf_matrix[0:3, 3]
                q = tf.transformations.quaternion_from_matrix(tf_matrix)
                transform = Transform()

                transform.translation.x = t[0]
                transform.translation.y = t[1]
                transform.translation.z = t[2]

                transform.rotation.x = q[0]
                transform.rotation.y = q[1]
                transform.rotation.z = q[2]
                transform.rotation.w = q[3]
                
                tum_stamp = '%s %f %f %f %f %f %f %f\n'
                tum_stamp = tum_stamp % (timestamp.strftime('%s.%f'), t[0], t[1], t[2], q[0], q[1], q[2], q[3])
                file.write(tum_stamp)

def save_camera_data(bag, kitti_type, kitti, util, bridge, camera, camera_frame_id, topic, initial_time):
    print("Exporting camera {}".format(camera))
    if kitti_type.find("raw") != -1:
        camera_pad = '{0:02d}'.format(camera)
        image_dir = os.path.join(kitti.data_path, 'image_{}'.format(camera_pad))
        image_path = os.path.join(image_dir, 'data')
        image_filenames = sorted(os.listdir(image_path))
        with open(os.path.join(image_dir, 'timestamps.txt')) as f:
            image_datetimes = map(lambda x: datetime.strptime(x[:-4], '%Y-%m-%d %H:%M:%S.%f'), f.readlines())
        
        calib = CameraInfo()
        calib.header.frame_id = camera_frame_id
        calib.width, calib.height = tuple(util['S_rect_{}'.format(camera_pad)].tolist())
        calib.distortion_model = 'plumb_bob'
        calib.K = util['K_{}'.format(camera_pad)]
        calib.R = util['R_rect_{}'.format(camera_pad)]
        calib.D = util['D_{}'.format(camera_pad)]
        calib.P = util['P_rect_{}'.format(camera_pad)]
            
    elif kitti_type.find("odom") != -1:
        camera_pad = '{0:01d}'.format(camera)
        image_path = os.path.join(kitti.sequence_path, 'image_{}'.format(camera_pad))
        image_filenames = sorted(os.listdir(image_path))
        image_datetimes = map(lambda x: initial_time + x.total_seconds(), kitti.timestamps)
        
        calib = CameraInfo()
        calib.header.frame_id = camera_frame_id
        calib.P = util['P{}'.format(camera_pad)]
    
    iterable = zip(image_datetimes, image_filenames)
    for dt, filename in tqdm(iterable, total=len(image_filenames)):
        image_filename = os.path.join(image_path, filename)
        cv_image = cv2.imread(image_filename)
        calib.height, calib.width = cv_image.shape[:2]
        if camera in (0, 1):
            cv_image = cv2.cvtColor(cv_image, cv2.COLOR_BGR2GRAY)
        encoding = "mono8" if camera in (0, 1) else "bgr8"
        image_message = bridge.cv2_to_imgmsg(cv_image, encoding=encoding)
        image_message.header.frame_id = camera_frame_id
        if kitti_type.find("raw") != -1:
            image_message.header.stamp = rospy.Time.from_sec(float(datetime.strftime(dt, "%s.%f")))
            topic_ext = "/image_raw"
        elif kitti_type.find("odom") != -1:
            image_message.header.stamp = rospy.Time.from_sec(dt)
            topic_ext = "/image_rect"
        calib.header.stamp = image_message.header.stamp
        bag.write(topic + topic_ext, image_message, t = image_message.header.stamp)
        bag.write(topic + '/camera_info', calib, t = calib.header.stamp) 
        
def save_velo_data(bag, kitti, velo_frame_id, topic):
    print("Exporting velodyne data")
    velo_path = os.path.join(kitti.data_path, 'velodyne_points')
    velo_data_dir = os.path.join(velo_path, 'data')
    velo_filenames = sorted(os.listdir(velo_data_dir))
    with open(os.path.join(velo_path, 'timestamps.txt')) as f:
        lines = f.readlines()
        velo_datetimes = []
        for line in lines:
            if len(line) == 1:
                continue
            dt = datetime.strptime(line[:-4], '%Y-%m-%d %H:%M:%S.%f')
            velo_datetimes.append(dt)

    iterable = zip(velo_datetimes, velo_filenames)

    count = 0

    for dt, filename in tqdm(iterable, total=len(velo_filenames)):
        if dt is None:
            continue

        velo_filename = os.path.join(velo_data_dir, filename)

        # read binary data
        scan = (np.fromfile(velo_filename, dtype=np.float32)).reshape(-1, 4)

        # get ring channel
        depth = np.linalg.norm(scan, 2, axis=1)
        pitch = np.arcsin(scan[:, 2] / depth) # arcsin(z, depth)
        fov_down = -24.8 / 180.0 * np.pi
        fov = (abs(-24.8) + abs(2.0)) / 180.0 * np.pi
        proj_y = (pitch + abs(fov_down)) / fov  # in [0.0, 1.0]
        proj_y *= 64  # in [0.0, H]
        proj_y = np.floor(proj_y)
        proj_y = np.minimum(64 - 1, proj_y)
        proj_y = np.maximum(0, proj_y).astype(np.int32)  # in [0,H-1]
        proj_y = proj_y.reshape(-1, 1)
        scan = np.concatenate((scan,proj_y), axis=1)
        scan = scan.tolist()
        for i in range(len(scan)):
            scan[i][-1] = int(scan[i][-1])

        # create header
        header = Header()
        header.frame_id = velo_frame_id
        header.stamp = rospy.Time.from_sec(float(datetime.strftime(dt, "%s.%f")))

        # fill pcl msg
        fields = [PointField('x', 0, PointField.FLOAT32, 1),
                  PointField('y', 4, PointField.FLOAT32, 1),
                  PointField('z', 8, PointField.FLOAT32, 1),
                  PointField('intensity', 12, PointField.FLOAT32, 1),
                  PointField('ring', 16, PointField.UINT16, 1)]
        pcl_msg = pcl2.create_cloud(header, fields, scan)
        pcl_msg.is_dense = True
        # print(pcl_msg)

        bag.write(topic, pcl_msg, t=pcl_msg.header.stamp)

        # count += 1
        # if count > 200:
        #     break

def get_static_transform(from_frame_id, to_frame_id, transform):
    t = transform[0:3, 3]
    q = tf.transformations.quaternion_from_matrix(transform)
    tf_msg = TransformStamped()
    tf_msg.header.frame_id = from_frame_id
    tf_msg.child_frame_id = to_frame_id
    tf_msg.transform.translation.x = float(t[0])
    tf_msg.transform.translation.y = float(t[1])
    tf_msg.transform.translation.z = float(t[2])
    tf_msg.transform.rotation.x = float(q[0])
    tf_msg.transform.rotation.y = float(q[1])
    tf_msg.transform.rotation.z = float(q[2])
    tf_msg.transform.rotation.w = float(q[3])
    return tf_msg


def inv(transform):
    "Invert rigid body transformation matrix"
    R = transform[0:3, 0:3]
    t = transform[0:3, 3]
    t_inv = -1 * R.T.dot(t)
    transform_inv = np.eye(4)
    transform_inv[0:3, 0:3] = R.T
    transform_inv[0:3, 3] = t_inv
    return transform_inv


def save_static_transforms(bag, transforms, timestamps):
    print("Exporting static transformations")
    tfm = TFMessage()
    for transform in transforms:
        t = get_static_transform(from_frame_id=transform[0], to_frame_id=transform[1], transform=transform[2])
        tfm.transforms.append(t)
    for timestamp in timestamps:
        time = rospy.Time.from_sec(float(timestamp.strftime("%s.%f")))
        for i in range(len(tfm.transforms)):
            tfm.transforms[i].header.stamp = time
        bag.write('/tf_static', tfm, t=time)


def save_gps_fix_data(bag, kitti, gps_frame_id, topic):
    for timestamp, oxts in zip(kitti.timestamps, kitti.oxts):
        navsatfix_msg = NavSatFix()
        navsatfix_msg.header.frame_id = gps_frame_id
        navsatfix_msg.header.stamp = rospy.Time.from_sec(float(timestamp.strftime("%s.%f")))
        navsatfix_msg.latitude = oxts.packet.lat
        navsatfix_msg.longitude = oxts.packet.lon
        navsatfix_msg.altitude = oxts.packet.alt
        navsatfix_msg.status.service = 1
        bag.write(topic, navsatfix_msg, t=navsatfix_msg.header.stamp)


def save_gps_vel_data(bag, kitti, gps_frame_id, topic):
    for timestamp, oxts in zip(kitti.timestamps, kitti.oxts):
        twist_msg = TwistStamped()
        twist_msg.header.frame_id = gps_frame_id
        twist_msg.header.stamp = rospy.Time.from_sec(float(timestamp.strftime("%s.%f")))
        twist_msg.twist.linear.x = oxts.packet.vf
        twist_msg.twist.linear.y = oxts.packet.vl
        twist_msg.twist.linear.z = oxts.packet.vu
        twist_msg.twist.angular.x = oxts.packet.wf
        twist_msg.twist.angular.y = oxts.packet.wl
        twist_msg.twist.angular.z = oxts.packet.wu
        bag.write(topic, twist_msg, t=twist_msg.header.stamp)


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description = "Convert KITTI dataset to ROS bag file the easy way!")
    # Accepted argument values
    kitti_types = ["raw_synced", "odom_color", "odom_gray"]
    odometry_sequences = []
    for s in range(22):
        odometry_sequences.append(str(s).zfill(2))
    
    parser.add_argument("kitti_type", choices = kitti_types, help = "KITTI dataset type")
    parser.add_argument("dir", nargs = "?", default = os.getcwd(), help = "base directory of the dataset, if no directory passed the deafult is current working directory")
    parser.add_argument("-t", "--date", help = "date of the raw dataset (i.e. 2011_09_26), option is only for RAW datasets.")
    parser.add_argument("-r", "--drive", help = "drive number of the raw dataset (i.e. 0001), option is only for RAW datasets.")
    parser.add_argument("-s", "--sequence", choices = odometry_sequences,help = "sequence of the odometry dataset (between 00 - 21), option is only for ODOMETRY datasets.")
    args = parser.parse_args()

    bridge = CvBridge()
    compression = rosbag.Compression.NONE
    # compression = rosbag.Compression.BZ2
    # compression = rosbag.Compression.LZ4
    
    # CAMERAS
    cameras = [
        (0, 'camera_gray_left', '/kitti/camera_gray_left'),
        (1, 'camera_gray_right', '/kitti/camera_gray_right'),
        (2, 'camera_color_left', '/kitti/camera_color_left'),
        (3, 'camera_color_right', '/kitti/camera_color_right')
    ]

    if args.kitti_type.find("raw") != -1:
    
        if args.date == None:
            print("Date option is not given. It is mandatory for raw dataset.")
            print("Usage for raw dataset: kitti2bag raw_synced [dir] -t <date> -r <drive>")
            sys.exit(1)
        elif args.drive == None:
            print("Drive option is not given. It is mandatory for raw dataset.")
            print("Usage for raw dataset: kitti2bag raw_synced [dir] -t <date> -r <drive>")
            sys.exit(1)
        
        bag = rosbag.Bag("kitti_{}_drive_{}_{}.bag".format(args.date, args.drive, args.kitti_type[4:]), 'w', compression=compression)
        tum_filename = "kitti_{}_drive_{}_{}.tum".format(args.date, args.drive, args.kitti_type[4:])
        kitti = pykitti.raw(args.dir, args.date, args.drive)
        if not os.path.exists(kitti.data_path):
            print('Path {} does not exists. Exiting.'.format(kitti.data_path))
            sys.exit(1)

        if len(kitti.timestamps) == 0:
            print('Dataset is empty? Exiting.')
            sys.exit(1)

        try:
            # IMU
            imu_frame_id = 'imu_link'
            imu_topic = '/kitti/oxts/imu'
            imu_raw_topic = '/imu_raw'
            gps_fix_topic = '/gps/fix'
            gps_vel_topic = '/gps/vel'
            velo_frame_id = 'velodyne'
            velo_topic = '/points_raw'

            T_base_link_to_imu = np.eye(4, 4)
            T_base_link_to_imu[0:3, 3] = [-2.71/2.0-0.05, 0.32, 0.93]

            # tf_static
            transforms = [
                ('base_link', imu_frame_id, T_base_link_to_imu),
                (imu_frame_id, velo_frame_id, inv(kitti.calib.T_velo_imu)),
                (imu_frame_id, cameras[0][1], inv(kitti.calib.T_cam0_imu)),
                (imu_frame_id, cameras[1][1], inv(kitti.calib.T_cam1_imu)),
                (imu_frame_id, cameras[2][1], inv(kitti.calib.T_cam2_imu)),
                (imu_frame_id, cameras[3][1], inv(kitti.calib.T_cam3_imu))
            ]

            util = pykitti.utils.read_calib_file(os.path.join(kitti.calib_path, 'calib_cam_to_cam.txt'))

            # Export
            save_static_transforms(bag, transforms, kitti.timestamps)
            save_dynamic_tf(bag, kitti, args.kitti_type, initial_time=None)
            save_dynamic_tf_as_tum_file(tum_filename, kitti, args.kitti_type, initial_time=None)
            # save_imu_data(bag, kitti, imu_frame_id, imu_topic)
            save_imu_data_raw(bag, kitti, imu_frame_id, imu_raw_topic)
            save_gps_fix_data(bag, kitti, imu_frame_id, gps_fix_topic)
            save_gps_vel_data(bag, kitti, imu_frame_id, gps_vel_topic)
            for camera in cameras:
                save_camera_data(bag, args.kitti_type, kitti, util, bridge, camera=camera[0], camera_frame_id=camera[1], topic=camera[2], initial_time=None)
                break
            save_velo_data(bag, kitti, velo_frame_id, velo_topic)

        finally:
            print("## OVERVIEW ##")
            print(bag)
            bag.close()
            
    elif args.kitti_type.find("odom") != -1:
        
        if args.sequence == None:
            print("Sequence option is not given. It is mandatory for odometry dataset.")
            print("Usage for odometry dataset: kitti2bag {odom_color, odom_gray} [dir] -s <sequence>")
            sys.exit(1)
            
        bag = rosbag.Bag("kitti_data_odometry_{}_sequence_{}.bag".format(args.kitti_type[5:],args.sequence), 'w', compression=compression)
        
        kitti = pykitti.odometry(args.dir, args.sequence)
        if not os.path.exists(kitti.sequence_path):
            print('Path {} does not exists. Exiting.'.format(kitti.sequence_path))
            sys.exit(1)

        kitti.load_calib()         
        kitti.load_timestamps() 
             
        if len(kitti.timestamps) == 0:
            print('Dataset is empty? Exiting.')
            sys.exit(1)
            
        if args.sequence in odometry_sequences[:11]:
            print("Odometry dataset sequence {} has ground truth information (poses).".format(args.sequence))
            kitti.load_poses()

        try:
            util = pykitti.utils.read_calib_file(os.path.join(args.dir,'sequences',args.sequence, 'calib.txt'))
            current_epoch = (datetime.utcnow() - datetime(1970, 1, 1)).total_seconds()
            # Export
            if args.kitti_type.find("gray") != -1:
                used_cameras = cameras[:2]
            elif args.kitti_type.find("color") != -1:
                used_cameras = cameras[-2:]

            save_dynamic_tf(bag, kitti, args.kitti_type, initial_time=current_epoch)
            for camera in used_cameras:
                save_camera_data(bag, args.kitti_type, kitti, util, bridge, camera=camera[0], camera_frame_id=camera[1], topic=camera[2], initial_time=current_epoch)

        finally:
            print("## OVERVIEW ##")
            print(bag)
            bag.close()



================================================
FILE: config/params.yaml
================================================
lio_segmot:

  # Topics
  pointCloudTopic: "points_raw"               # Point cloud data
  imuTopic: "imu_raw"                         # IMU data
  odomTopic: "odometry/imu"                   # IMU pre-preintegration odometry, same frequency as IMU
  gpsTopic: "odometry/gpsz"                   # GPS odometry topic from navsat, see module_navsat.launch file

  # Frames
  lidarFrame: "base_link"
  baselinkFrame: "base_link"
  odometryFrame: "odom"
  mapFrame: "map"

  # GPS Settings
  useImuHeadingInitialization: false          # if using GPS data, set to "true"
  useGpsElevation: false                      # if GPS elevation is bad, set to "false"
  gpsCovThreshold: 2.0                        # m^2, threshold for using GPS data
  poseCovThreshold: 25.0                      # m^2, threshold for using GPS data
  
  # Export settings
  savePCD: false                              # https://github.com/TixiaoShan/LIO-SAM/issues/3
  savePCDDirectory: "/Downloads/LOAM/"        # in your home folder, starts and ends with "/". Warning: the code deletes "LOAM" folder then recreates it. See "mapOptimization" for implementation

  # Sensor Settings
  sensor: velodyne                            # lidar sensor type, either 'velodyne' or 'ouster'
  N_SCAN: 64                                  # number of lidar channel (i.e., 16, 32, 64, 128)
  Horizon_SCAN: 1800                          # lidar horizontal resolution (Velodyne:1800, Ouster:512,1024,2048)
  downsampleRate: 1                           # default: 1. Downsample your data if too many points. i.e., 16 = 64 / 4, 16 = 16 / 1
  lidarMinRange: 1.0                          # default: 1.0, minimum lidar range to be used
  lidarMaxRange: 1000.0                       # default: 1000.0, maximum lidar range to be used

  # IMU Settings
  imuAccNoise: 3.9939570888238808e-03
  imuGyrNoise: 1.5636343949698187e-03
  imuAccBiasN: 6.4356659353532566e-05
  imuGyrBiasN: 3.5640318696367613e-05
  imuGravity: 9.80511
  imuRPYWeight: 0.01

  # Extrinsics (lidar -> IMU, KITTI)
  extrinsicTrans: [-8.086759e-01, 3.195559e-01, -7.997231e-01]
  extrinsicRot: [ 9.999976e-01, 7.553071e-04, -2.035826e-03,
                 -7.854027e-04, 9.998898e-01, -1.482298e-02,
                  2.024406e-03, 1.482454e-02,  9.998881e-01]
  extrinsicRPY: [ 9.999976e-01, 7.553071e-04, -2.035826e-03,
                 -7.854027e-04, 9.998898e-01, -1.482298e-02,
                  2.024406e-03, 1.482454e-02,  9.998881e-01]
  # Extrinsics (lidar -> IMU, ITRI)
  # extrinsicTrans: [0, 0, 0]
  # extrinsicRot: [1, 0, 0,
  #                 0, 1, 0,
  #                 0, 0, 1]
  # extrinsicRPY: [1, 0, 0,
  #                 0, 1, 0,
  #                 0, 0, 1]
  # Extrinsics (lidar -> IMU, UrbanLoco)
  # extrinsicTrans: [0, 0, -0.0762]
  # extrinsicRot: [ 0, 1, 0,
  #                -1, 0, 0,
  #                 0, 0, 1]
  # extrinsicRPY: [ 0, 1, 0,
  #                -1, 0, 0,
  #                 0, 0, 1]

  # LOAM feature threshold
  edgeThreshold: 1.0
  surfThreshold: 0.1
  edgeFeatureMinValidNum: 10
  surfFeatureMinValidNum: 100

  # voxel filter paprams
  odometrySurfLeafSize: 0.4                     # default: 0.4 - outdoor, 0.2 - indoor
  mappingCornerLeafSize: 0.2                    # default: 0.2 - outdoor, 0.1 - indoor
  mappingSurfLeafSize: 0.4                      # default: 0.4 - outdoor, 0.2 - indoor

  # robot motion constraint (in case you are using a 2D robot)
  z_tollerance: 1000                            # meters
  rotation_tollerance: 1000                     # radians

  # CPU Params
  numberOfCores: 8                              # number of cores for mapping optimization
  mappingProcessInterval: 0.09                  # seconds, regulate mapping frequency

  # Surrounding map
  surroundingkeyframeAddingDistThreshold: 1.0   # meters, regulate keyframe adding threshold
  surroundingkeyframeAddingAngleThreshold: 0.2  # radians, regulate keyframe adding threshold
  surroundingKeyframeDensity: 2.0               # meters, downsample surrounding keyframe poses   
  surroundingKeyframeSearchRadius: 50.0         # meters, within n meters scan-to-map optimization (when loop closure disabled)

  # Loop closure
  loopClosureEnableFlag: true
  loopClosureFrequency: 1.0                     # Hz, regulate loop closure constraint add frequency
  surroundingKeyframeSize: 50                   # submap size (when loop closure enabled)
  historyKeyframeSearchRadius: 15.0             # meters, key frame that is within n meters from current pose will be considerd for loop closure
  historyKeyframeSearchTimeDiff: 30.0           # seconds, key frame that is n seconds older will be considered for loop closure
  historyKeyframeSearchNum: 25                  # number of hostory key frames will be fused into a submap for loop closure
  historyKeyframeFitnessScore: 0.3              # icp threshold, the smaller the better alignment

  # Visualization
  globalMapVisualizationSearchRadius: 1000.0    # meters, global map visualization radius
  globalMapVisualizationPoseDensity: 10.0       # meters, global map visualization keyframe density
  globalMapVisualizationLeafSize: 1.0           # meters, global map visualization cloud density

  # Dynamic object data association
  detectionMatchThreshold: 19.5
  dataAssociationVarianceVector: [3.0e-4, 3.0e-4, 3.0e-4, 5.0e-2, 3.0e-2, 3.0e-2]
  earlyLooselyCoupledMatchingVarianceVector: [3.0e-4, 3.0e-4, 3.0e-4, 5.0e-2, 5.0e-3, 5.0e-3]
  looselyCoupledMatchingVarianceVector: [1.0e-4, 1.0e-4, 1.0e-4, 2.0e-3, 2.0e-3, 2.0e-3]
  tightlyCoupledMatchingVarianceVector: [8.0e-6, 8.0e-6, 8.0e-6, 1.0e-4, 1.0e-4, 1.0e-4]
  # tightlyCoupledMatchingVarianceVector: [2.0e-7, 2.0e-7, 2.0e-7, 1.5e-5, 1.5e-5, 1.5e-5]

  # Factor covariance matrices (as diagonal vectors)
  # priorOdometryDiagonalVarianceVector: [1.0e-2, 1.0e-2, M_PI*M_PI, 1.0e+8, 1.0e+8, 1.0e+8]
  # odometryDiagonalVarianceVector: [1.0e-6, 1.0e-6, 1.0e-6, 1.0e-4, 1.0e-4, 1.0e-4]
  earlyConstantVelocityDiagonalVarianceVector: [2.0e-4, 2.0e-4, 1.0e-3, 2.0e-1, 1.0e-1, 1.0e-1]
  constantVelocityDiagonalVarianceVector: [2.0e-4, 2.0e-4, 1.0e-3, 2.0e-1, 1.0e-1, 1.0e-1]
  motionDiagonalVarianceVector: [2.0e-4, 2.0e-4, 1.0e-3, 1.0e-1, 1.0e-2, 1.0e-2]
  looselyCoupledDetectionVarianceVector: [2.0e-4, 2.0e-4, 2.0e-4, 1.5e-3, 1.5e-3, 1.5e-3]
  tightlyCoupledDetectionVarianceVector: [2.0e-4, 2.0e-4, 2.0e-4, 1.5e-3, 1.5e-3, 1.5e-3]

  numberOfEarlySteps: 2
  numberOfPreLooseCouplingSteps: 6
  numberOfVelocityConsistencySteps: 4
  numberOfInterLooseCouplingSteps: 0
  tightCouplingDetectionErrorThreshold: 26.0

  objectAngularVelocityConsistencyVarianceThreshold: 1.0e-5  # rad^2
  objectLinearVelocityConsistencyVarianceThreshold: 1.0e-2  # m^2

  # Tracking
  trackingStepsForLostObject: 3


# Navsat (convert GPS coordinates to Cartesian)
navsat:
  frequency: 50
  wait_for_datum: false
  delay: 0.0
  magnetic_declination_radians: 0
  yaw_offset: 0
  zero_altitude: true
  broadcast_utm_transform: false
  broadcast_utm_transform_as_parent_frame: false
  publish_filtered_gps: false

# EKF for Navsat
ekf_gps:
  publish_tf: false
  map_frame: map
  odom_frame: odom
  base_link_frame: base_link
  world_frame: odom

  frequency: 50
  two_d_mode: false
  sensor_timeout: 0.01
  # -------------------------------------
  # External IMU:
  # -------------------------------------
  imu0: imu_correct
  # make sure the input is aligned with ROS REP105. "imu_correct" is manually transformed by myself. EKF can also transform the data using tf between your imu and base_link
  imu0_config: [false, false, false,
                true,  true,  true,
                false, false, false,
                false, false, true,
                true,  true,  true]
  imu0_differential: false
  imu0_queue_size: 50 
  imu0_remove_gravitational_acceleration: true
  # -------------------------------------
  # Odometry (From Navsat):
  # -------------------------------------
  odom0: odometry/gps
  odom0_config: [true,  true,  true,
                 false, false, false,
                 false, false, false,
                 false, false, false,
                 false, false, false]
  odom0_differential: false
  odom0_queue_size: 10

  #                            x     y     z     r     p     y   x_dot  y_dot  z_dot  r_dot p_dot y_dot x_ddot y_ddot z_ddot
  process_noise_covariance: [  1.0,  0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    1.0,  0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    10.0, 0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0.03, 0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0.03, 0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0.1,  0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0.25,  0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0.25,  0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0.04,  0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0.01, 0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0.01, 0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0.5,  0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0.01, 0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0.01,   0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0.015]


================================================
FILE: config/params_hsinchu.yaml
================================================
lio_segmot:

  # Topics
  pointCloudTopic: "points_raw"               # Point cloud data
  imuTopic: "imu_raw"                         # IMU data
  odomTopic: "odometry/imu"                   # IMU pre-preintegration odometry, same frequency as IMU
  gpsTopic: "odometry/gpsz"                   # GPS odometry topic from navsat, see module_navsat.launch file

  # Frames
  lidarFrame: "base_link"
  baselinkFrame: "base_link"
  odometryFrame: "odom"
  mapFrame: "map"

  # GPS Settings
  useImuHeadingInitialization: false          # if using GPS data, set to "true"
  useGpsElevation: false                      # if GPS elevation is bad, set to "false"
  gpsCovThreshold: 2.0                        # m^2, threshold for using GPS data
  poseCovThreshold: 25.0                      # m^2, threshold for using GPS data
  
  # Export settings
  savePCD: false                              # https://github.com/TixiaoShan/LIO-SAM/issues/3
  savePCDDirectory: "/Downloads/LOAM/"        # in your home folder, starts and ends with "/". Warning: the code deletes "LOAM" folder then recreates it. See "mapOptimization" for implementation

  # Sensor Settings
  sensor: velodyne                            # lidar sensor type, either 'velodyne' or 'ouster'
  N_SCAN: 64                                  # number of lidar channel (i.e., 16, 32, 64, 128)
  Horizon_SCAN: 1800                          # lidar horizontal resolution (Velodyne:1800, Ouster:512,1024,2048)
  downsampleRate: 1                           # default: 1. Downsample your data if too many points. i.e., 16 = 64 / 4, 16 = 16 / 1
  lidarMinRange: 1.0                          # default: 1.0, minimum lidar range to be used
  lidarMaxRange: 1000.0                       # default: 1000.0, maximum lidar range to be used

  # IMU Settings
  imuAccNoise: 3.9939570888238808e-03
  imuGyrNoise: 1.5636343949698187e-03
  imuAccBiasN: 6.4356659353532566e-05
  imuGyrBiasN: 3.5640318696367613e-05
  imuGravity: 9.80511
  imuRPYWeight: 0.01

  # Extrinsics (lidar -> IMU, KITTI)
  # extrinsicTrans: [-8.086759e-01, 3.195559e-01, -7.997231e-01]
  # extrinsicRot: [ 9.999976e-01, 7.553071e-04, -2.035826e-03,
  #                -7.854027e-04, 9.998898e-01, -1.482298e-02,
  #                 2.024406e-03, 1.482454e-02,  9.998881e-01]
  # extrinsicRPY: [ 9.999976e-01, 7.553071e-04, -2.035826e-03,
  #                -7.854027e-04, 9.998898e-01, -1.482298e-02,
  #                 2.024406e-03, 1.482454e-02,  9.998881e-01]
  # Extrinsics (lidar -> IMU, ITRI)
  extrinsicTrans: [0, 0, 0]
  extrinsicRot: [1, 0, 0,
                  0, 1, 0,
                  0, 0, 1]
  extrinsicRPY: [1, 0, 0,
                  0, 1, 0,
                  0, 0, 1]
  # Extrinsics (lidar -> IMU, UrbanLoco)
  # extrinsicTrans: [0, 0, -0.0762]
  # extrinsicRot: [ 0, 1, 0,
  #                -1, 0, 0,
  #                 0, 0, 1]
  # extrinsicRPY: [ 0, 1, 0,
  #                -1, 0, 0,
  #                 0, 0, 1]

  # LOAM feature threshold
  edgeThreshold: 1.0
  surfThreshold: 0.1
  edgeFeatureMinValidNum: 10
  surfFeatureMinValidNum: 100

  # voxel filter paprams
  odometrySurfLeafSize: 0.4                     # default: 0.4 - outdoor, 0.2 - indoor
  mappingCornerLeafSize: 0.2                    # default: 0.2 - outdoor, 0.1 - indoor
  mappingSurfLeafSize: 0.4                      # default: 0.4 - outdoor, 0.2 - indoor

  # robot motion constraint (in case you are using a 2D robot)
  z_tollerance: 1000                            # meters
  rotation_tollerance: 1000                     # radians

  # CPU Params
  numberOfCores: 8                              # number of cores for mapping optimization
  mappingProcessInterval: 0.09                  # seconds, regulate mapping frequency

  # Surrounding map
  surroundingkeyframeAddingDistThreshold: 1.0   # meters, regulate keyframe adding threshold
  surroundingkeyframeAddingAngleThreshold: 0.2  # radians, regulate keyframe adding threshold
  surroundingKeyframeDensity: 2.0               # meters, downsample surrounding keyframe poses   
  surroundingKeyframeSearchRadius: 50.0         # meters, within n meters scan-to-map optimization (when loop closure disabled)

  # Loop closure
  loopClosureEnableFlag: true
  loopClosureFrequency: 1.0                     # Hz, regulate loop closure constraint add frequency
  surroundingKeyframeSize: 50                   # submap size (when loop closure enabled)
  historyKeyframeSearchRadius: 15.0             # meters, key frame that is within n meters from current pose will be considerd for loop closure
  historyKeyframeSearchTimeDiff: 30.0           # seconds, key frame that is n seconds older will be considered for loop closure
  historyKeyframeSearchNum: 25                  # number of hostory key frames will be fused into a submap for loop closure
  historyKeyframeFitnessScore: 0.3              # icp threshold, the smaller the better alignment

  # Visualization
  globalMapVisualizationSearchRadius: 1000.0    # meters, global map visualization radius
  globalMapVisualizationPoseDensity: 10.0       # meters, global map visualization keyframe density
  globalMapVisualizationLeafSize: 1.0           # meters, global map visualization cloud density

  # Dynamic object data association
  detectionMatchThreshold: 19.5
  dataAssociationVarianceVector: [3.0e-4, 3.0e-4, 3.0e-4, 5.0e-2, 3.0e-2, 3.0e-2]
  earlyLooselyCoupledMatchingVarianceVector: [3.0e-4, 3.0e-4, 3.0e-4, 5.0e-2, 5.0e-3, 5.0e-3]
  looselyCoupledMatchingVarianceVector: [1.0e-4, 1.0e-4, 1.0e-4, 2.0e-3, 2.0e-3, 2.0e-3]
  tightlyCoupledMatchingVarianceVector: [8.0e-6, 8.0e-6, 8.0e-6, 1.0e-4, 1.0e-4, 1.0e-4]
  # tightlyCoupledMatchingVarianceVector: [2.0e-7, 2.0e-7, 2.0e-7, 1.5e-5, 1.5e-5, 1.5e-5]

  # Factor covariance matrices (as diagonal vectors)
  # priorOdometryDiagonalVarianceVector: [1.0e-2, 1.0e-2, M_PI*M_PI, 1.0e+8, 1.0e+8, 1.0e+8]
  # odometryDiagonalVarianceVector: [1.0e-6, 1.0e-6, 1.0e-6, 1.0e-4, 1.0e-4, 1.0e-4]
  earlyConstantVelocityDiagonalVarianceVector: [2.0e-4, 2.0e-4, 1.0e-3, 2.0e-1, 1.0e-1, 1.0e-1]
  constantVelocityDiagonalVarianceVector: [2.0e-4, 2.0e-4, 1.0e-3, 2.0e-1, 1.0e-1, 1.0e-1]
  motionDiagonalVarianceVector: [2.0e-4, 2.0e-4, 1.0e-3, 1.0e-1, 1.0e-2, 1.0e-2]
  looselyCoupledDetectionVarianceVector: [2.0e-4, 2.0e-4, 2.0e-4, 1.5e-3, 1.5e-3, 1.5e-3]
  tightlyCoupledDetectionVarianceVector: [2.0e-4, 2.0e-4, 2.0e-4, 1.5e-3, 1.5e-3, 1.5e-3]

  numberOfEarlySteps: 2
  numberOfPreLooseCouplingSteps: 6
  numberOfVelocityConsistencySteps: 4
  numberOfInterLooseCouplingSteps: 0
  tightCouplingDetectionErrorThreshold: 19.5

  objectAngularVelocityConsistencyVarianceThreshold: 1.0e-5  # rad^2
  objectLinearVelocityConsistencyVarianceThreshold: 1.0e-2  # m^2

  # Tracking
  trackingStepsForLostObject: 3


# Navsat (convert GPS coordinates to Cartesian)
navsat:
  frequency: 50
  wait_for_datum: false
  delay: 0.0
  magnetic_declination_radians: 0
  yaw_offset: 0
  zero_altitude: true
  broadcast_utm_transform: false
  broadcast_utm_transform_as_parent_frame: false
  publish_filtered_gps: false

# EKF for Navsat
ekf_gps:
  publish_tf: false
  map_frame: map
  odom_frame: odom
  base_link_frame: base_link
  world_frame: odom

  frequency: 50
  two_d_mode: false
  sensor_timeout: 0.01
  # -------------------------------------
  # External IMU:
  # -------------------------------------
  imu0: imu_correct
  # make sure the input is aligned with ROS REP105. "imu_correct" is manually transformed by myself. EKF can also transform the data using tf between your imu and base_link
  imu0_config: [false, false, false,
                true,  true,  true,
                false, false, false,
                false, false, true,
                true,  true,  true]
  imu0_differential: false
  imu0_queue_size: 50 
  imu0_remove_gravitational_acceleration: true
  # -------------------------------------
  # Odometry (From Navsat):
  # -------------------------------------
  odom0: odometry/gps
  odom0_config: [true,  true,  true,
                 false, false, false,
                 false, false, false,
                 false, false, false,
                 false, false, false]
  odom0_differential: false
  odom0_queue_size: 10

  #                            x     y     z     r     p     y   x_dot  y_dot  z_dot  r_dot p_dot y_dot x_ddot y_ddot z_ddot
  process_noise_covariance: [  1.0,  0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    1.0,  0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    10.0, 0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0.03, 0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0.03, 0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0.1,  0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0.25,  0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0.25,  0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0.04,  0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0.01, 0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0.01, 0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0.5,  0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0.01, 0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0.01,   0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0.015]


================================================
FILE: config/params_kitti.yaml
================================================
lio_segmot:

  # Topics
  pointCloudTopic: "points_raw"               # Point cloud data
  imuTopic: "imu_raw"                         # IMU data
  odomTopic: "odometry/imu"                   # IMU pre-preintegration odometry, same frequency as IMU
  gpsTopic: "odometry/gpsz"                   # GPS odometry topic from navsat, see module_navsat.launch file

  # Frames
  lidarFrame: "base_link"
  baselinkFrame: "base_link"
  odometryFrame: "odom"
  mapFrame: "map"

  # GPS Settings
  useImuHeadingInitialization: false          # if using GPS data, set to "true"
  useGpsElevation: false                      # if GPS elevation is bad, set to "false"
  gpsCovThreshold: 2.0                        # m^2, threshold for using GPS data
  poseCovThreshold: 25.0                      # m^2, threshold for using GPS data
  
  # Export settings
  savePCD: false                              # https://github.com/TixiaoShan/LIO-SAM/issues/3
  savePCDDirectory: "/Downloads/LOAM/"        # in your home folder, starts and ends with "/". Warning: the code deletes "LOAM" folder then recreates it. See "mapOptimization" for implementation

  # Sensor Settings
  sensor: velodyne                            # lidar sensor type, either 'velodyne' or 'ouster'
  N_SCAN: 64                                  # number of lidar channel (i.e., 16, 32, 64, 128)
  Horizon_SCAN: 1800                          # lidar horizontal resolution (Velodyne:1800, Ouster:512,1024,2048)
  downsampleRate: 1                           # default: 1. Downsample your data if too many points. i.e., 16 = 64 / 4, 16 = 16 / 1
  lidarMinRange: 1.0                          # default: 1.0, minimum lidar range to be used
  lidarMaxRange: 1000.0                       # default: 1000.0, maximum lidar range to be used

  # IMU Settings
  imuAccNoise: 3.9939570888238808e-03
  imuGyrNoise: 1.5636343949698187e-03
  imuAccBiasN: 6.4356659353532566e-05
  imuGyrBiasN: 3.5640318696367613e-05
  imuGravity: 9.80511
  imuRPYWeight: 0.01

  # Extrinsics (lidar -> IMU, KITTI)
  extrinsicTrans: [-8.086759e-01, 3.195559e-01, -7.997231e-01]
  extrinsicRot: [ 9.999976e-01, 7.553071e-04, -2.035826e-03,
                 -7.854027e-04, 9.998898e-01, -1.482298e-02,
                  2.024406e-03, 1.482454e-02,  9.998881e-01]
  extrinsicRPY: [ 9.999976e-01, 7.553071e-04, -2.035826e-03,
                 -7.854027e-04, 9.998898e-01, -1.482298e-02,
                  2.024406e-03, 1.482454e-02,  9.998881e-01]
  # Extrinsics (lidar -> IMU, ITRI)
  # extrinsicTrans: [0, 0, 0]
  # extrinsicRot: [1, 0, 0,
  #                 0, 1, 0,
  #                 0, 0, 1]
  # extrinsicRPY: [1, 0, 0,
  #                 0, 1, 0,
  #                 0, 0, 1]
  # Extrinsics (lidar -> IMU, UrbanLoco)
  # extrinsicTrans: [0, 0, -0.0762]
  # extrinsicRot: [ 0, 1, 0,
  #                -1, 0, 0,
  #                 0, 0, 1]
  # extrinsicRPY: [ 0, 1, 0,
  #                -1, 0, 0,
  #                 0, 0, 1]

  # LOAM feature threshold
  edgeThreshold: 1.0
  surfThreshold: 0.1
  edgeFeatureMinValidNum: 10
  surfFeatureMinValidNum: 100

  # voxel filter paprams
  odometrySurfLeafSize: 0.4                     # default: 0.4 - outdoor, 0.2 - indoor
  mappingCornerLeafSize: 0.2                    # default: 0.2 - outdoor, 0.1 - indoor
  mappingSurfLeafSize: 0.4                      # default: 0.4 - outdoor, 0.2 - indoor

  # robot motion constraint (in case you are using a 2D robot)
  z_tollerance: 1000                            # meters
  rotation_tollerance: 1000                     # radians

  # CPU Params
  numberOfCores: 8                              # number of cores for mapping optimization
  mappingProcessInterval: 0.09                  # seconds, regulate mapping frequency

  # Surrounding map
  surroundingkeyframeAddingDistThreshold: 1.0   # meters, regulate keyframe adding threshold
  surroundingkeyframeAddingAngleThreshold: 0.2  # radians, regulate keyframe adding threshold
  surroundingKeyframeDensity: 2.0               # meters, downsample surrounding keyframe poses   
  surroundingKeyframeSearchRadius: 50.0         # meters, within n meters scan-to-map optimization (when loop closure disabled)

  # Loop closure
  loopClosureEnableFlag: true
  loopClosureFrequency: 1.0                     # Hz, regulate loop closure constraint add frequency
  surroundingKeyframeSize: 50                   # submap size (when loop closure enabled)
  historyKeyframeSearchRadius: 15.0             # meters, key frame that is within n meters from current pose will be considerd for loop closure
  historyKeyframeSearchTimeDiff: 30.0           # seconds, key frame that is n seconds older will be considered for loop closure
  historyKeyframeSearchNum: 25                  # number of hostory key frames will be fused into a submap for loop closure
  historyKeyframeFitnessScore: 0.3              # icp threshold, the smaller the better alignment

  # Visualization
  globalMapVisualizationSearchRadius: 1000.0    # meters, global map visualization radius
  globalMapVisualizationPoseDensity: 10.0       # meters, global map visualization keyframe density
  globalMapVisualizationLeafSize: 1.0           # meters, global map visualization cloud density

  # Dynamic object data association
  detectionMatchThreshold: 19.5
  dataAssociationVarianceVector: [3.0e-4, 3.0e-4, 3.0e-4, 5.0e-2, 3.0e-2, 3.0e-2]
  earlyLooselyCoupledMatchingVarianceVector: [3.0e-4, 3.0e-4, 3.0e-4, 5.0e-2, 5.0e-3, 5.0e-3]
  looselyCoupledMatchingVarianceVector: [1.0e-4, 1.0e-4, 1.0e-4, 2.0e-3, 2.0e-3, 2.0e-3]
  tightlyCoupledMatchingVarianceVector: [8.0e-6, 8.0e-6, 8.0e-6, 1.0e-4, 1.0e-4, 1.0e-4]
  # tightlyCoupledMatchingVarianceVector: [2.0e-7, 2.0e-7, 2.0e-7, 1.5e-5, 1.5e-5, 1.5e-5]

  # Factor covariance matrices (as diagonal vectors)
  # priorOdometryDiagonalVarianceVector: [1.0e-2, 1.0e-2, M_PI*M_PI, 1.0e+8, 1.0e+8, 1.0e+8]
  # odometryDiagonalVarianceVector: [1.0e-6, 1.0e-6, 1.0e-6, 1.0e-4, 1.0e-4, 1.0e-4]
  earlyConstantVelocityDiagonalVarianceVector: [2.0e-4, 2.0e-4, 1.0e-3, 2.0e-1, 1.0e-1, 1.0e-1]
  constantVelocityDiagonalVarianceVector: [2.0e-4, 2.0e-4, 1.0e-3, 2.0e-1, 1.0e-1, 1.0e-1]
  motionDiagonalVarianceVector: [2.0e-4, 2.0e-4, 1.0e-3, 1.0e-1, 1.0e-2, 1.0e-2]
  looselyCoupledDetectionVarianceVector: [2.0e-4, 2.0e-4, 2.0e-4, 1.5e-3, 1.5e-3, 1.5e-3]
  tightlyCoupledDetectionVarianceVector: [2.0e-4, 2.0e-4, 2.0e-4, 1.5e-3, 1.5e-3, 1.5e-3]

  numberOfEarlySteps: 2
  numberOfPreLooseCouplingSteps: 6
  numberOfVelocityConsistencySteps: 4
  numberOfInterLooseCouplingSteps: 0
  tightCouplingDetectionErrorThreshold: 26.0

  objectAngularVelocityConsistencyVarianceThreshold: 1.0e-5  # rad^2
  objectLinearVelocityConsistencyVarianceThreshold: 1.0e-2  # m^2

  # Tracking
  trackingStepsForLostObject: 3


# Navsat (convert GPS coordinates to Cartesian)
navsat:
  frequency: 50
  wait_for_datum: false
  delay: 0.0
  magnetic_declination_radians: 0
  yaw_offset: 0
  zero_altitude: true
  broadcast_utm_transform: false
  broadcast_utm_transform_as_parent_frame: false
  publish_filtered_gps: false

# EKF for Navsat
ekf_gps:
  publish_tf: false
  map_frame: map
  odom_frame: odom
  base_link_frame: base_link
  world_frame: odom

  frequency: 50
  two_d_mode: false
  sensor_timeout: 0.01
  # -------------------------------------
  # External IMU:
  # -------------------------------------
  imu0: imu_correct
  # make sure the input is aligned with ROS REP105. "imu_correct" is manually transformed by myself. EKF can also transform the data using tf between your imu and base_link
  imu0_config: [false, false, false,
                true,  true,  true,
                false, false, false,
                false, false, true,
                true,  true,  true]
  imu0_differential: false
  imu0_queue_size: 50 
  imu0_remove_gravitational_acceleration: true
  # -------------------------------------
  # Odometry (From Navsat):
  # -------------------------------------
  odom0: odometry/gps
  odom0_config: [true,  true,  true,
                 false, false, false,
                 false, false, false,
                 false, false, false,
                 false, false, false]
  odom0_differential: false
  odom0_queue_size: 10

  #                            x     y     z     r     p     y   x_dot  y_dot  z_dot  r_dot p_dot y_dot x_ddot y_ddot z_ddot
  process_noise_covariance: [  1.0,  0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    1.0,  0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    10.0, 0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0.03, 0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0.03, 0,    0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0.1,  0,     0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0.25,  0,     0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0.25,  0,     0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0.04,  0,    0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0.01, 0,    0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0.01, 0,    0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0.5,  0,    0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0.01, 0,      0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0.01,   0,
                               0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,    0,    0,      0.015]


================================================
FILE: include/factor.h
================================================
// Copyright 2021 Yu-Kai Lin. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.
#pragma once

#include <stdexcept>
#include <tuple>
#include <vector>

#include <gtsam/geometry/Point3.h>
#include <gtsam/geometry/Pose3.h>
#include <gtsam/linear/NoiseModel.h>
#include <gtsam/nonlinear/NonlinearFactor.h>
#include <gtsam/slam/BetweenFactor.h>
#include <jsk_recognition_msgs/BoundingBox.h>
#include <Eigen/Dense>

class MaxMixturePoint3 : public gtsam::Point3 {};

class Detection {
 public:
  using BoundingBox = jsk_recognition_msgs::BoundingBox;

 protected:
  BoundingBox box;
  gtsam::Pose3 pose;
  gtsam::Vector6 variances;

 public:
  Detection(BoundingBox box, gtsam::Vector6 variances);

  const BoundingBox getBoundingBox() const { return this->box; }
  const gtsam::Pose3 getPose() const { return this->pose; };
  gtsam::noiseModel::Diagonal::shared_ptr getDiagonal() const { return gtsam::noiseModel::Diagonal::Variances(this->variances); }

  const double error(const gtsam::Pose3 x) const;
};

std::tuple<size_t, double>
getDetectionIndexAndError(const gtsam::Pose3 &d,
                          std::vector<Detection> detections);

class TightlyCoupledDetectionFactor : public gtsam::NoiseModelFactor2<gtsam::Pose3,
                                                                      gtsam::Pose3> {
 private:
  using This = TightlyCoupledDetectionFactor;
  using Base = gtsam::NoiseModelFactor2<gtsam::Pose3, gtsam::Pose3>;

 protected:
  std::vector<Detection> detections;
  std::vector<gtsam::noiseModel::Diagonal::shared_ptr> noiseModels;

  mutable int cachedDetectionIndex = -1;

 public:
  /** Constructor */
  TightlyCoupledDetectionFactor(gtsam::Key robotPoseKey,
                                gtsam::Key objectPoseKey,
                                std::vector<Detection> detections)
      : Base(boost::shared_ptr<gtsam::noiseModel::Diagonal>(),
             robotPoseKey,
             objectPoseKey),
        detections(detections) {
    for (const auto &detection : detections) {
      this->noiseModels.push_back(detection.getDiagonal());
    }
  }

  /** Contructor with cached detection index */
  TightlyCoupledDetectionFactor(gtsam::Key robotPoseKey,
                                gtsam::Key objectPoseKey,
                                std::vector<Detection> detections,
                                int cachedDetectionIndex)
      : Base(boost::shared_ptr<gtsam::noiseModel::Diagonal>(),
             robotPoseKey,
             objectPoseKey),
        detections(detections),
        cachedDetectionIndex(cachedDetectionIndex) {
    for (const auto &detection : detections) {
      this->noiseModels.push_back(detection.getDiagonal());
    }
  }

  /** print */
  virtual void
  print(const std::string &s,
        const gtsam::KeyFormatter &keyFormatter = gtsam::DefaultKeyFormatter) const {
    std::cout << s << "TightlyCoupledDetectionFactor("
              << keyFormatter(this->robotPoseKey()) << ", "
              << keyFormatter(this->objectPoseKey()) << ")\n";
  }

  /** equals */
  virtual bool equals(const NonlinearFactor &expected, double tol = 1e-9) const {
    const This *e = dynamic_cast<const This *>(&expected);
    return e != NULL && Base::equals(*e, tol);
  }

  gtsam::Key robotPoseKey() const { return key1(); }
  gtsam::Key objectPoseKey() const { return key2(); }
  const std::vector<Detection> &getDetections() const { return this->detections; }
  const int getCachedDetectionIndex() const { return this->cachedDetectionIndex; }

  virtual gtsam::Vector
  evaluateError(const gtsam::Pose3 &robotPose,
                const gtsam::Pose3 &objectPose,
                const gtsam::Pose3 &measured,
                boost::optional<gtsam::Matrix &> H1 = boost::none,
                boost::optional<gtsam::Matrix &> H2 = boost::none) const;

  virtual gtsam::Vector
  evaluateError(const gtsam::Pose3 &robotPose,
                const gtsam::Pose3 &objectPose,
                boost::optional<gtsam::Matrix &> H1 = boost::none,
                boost::optional<gtsam::Matrix &> H2 = boost::none) const override {
    throw std::runtime_error("This evaluateError function is revoked.");
  }

  virtual gtsam::Vector unwhitenedError(const gtsam::Pose3 &measured,
                                        const gtsam::Values &x,
                                        boost::optional<std::vector<gtsam::Matrix> &> H = boost::none) const;

  virtual gtsam::Vector unwhitenedError(const gtsam::Values &x,
                                        boost::optional<std::vector<gtsam::Matrix> &> H = boost::none) const override {
    throw std::runtime_error("This unwhitenedError function is revoked.");
  };

  virtual boost::shared_ptr<gtsam::GaussianFactor>
  linearize(const gtsam::Values &c) const override;

  virtual double error(const gtsam::Values &c) const override;

  std::size_t size() const {
    return 2;
  }

  virtual size_t dim() const override { return 6; }
};

class LooselyCoupledDetectionFactor : public gtsam::NoiseModelFactor1<gtsam::Pose3> {
 private:
  using This = LooselyCoupledDetectionFactor;
  using Base = gtsam::NoiseModelFactor1<gtsam::Pose3>;

 protected:
  std::vector<Detection> detections;
  std::vector<gtsam::noiseModel::Diagonal::shared_ptr> noiseModels;

  gtsam::Key robotPoseKey_;

  mutable int cachedDetectionIndex = -1;

 public:
  /** Constructor */
  LooselyCoupledDetectionFactor(gtsam::Key robotPoseKey,
                                gtsam::Key objectPoseKey,
                                std::vector<Detection> detections)
      : Base(boost::shared_ptr<gtsam::noiseModel::Diagonal>(), objectPoseKey),
        robotPoseKey_(robotPoseKey),
        detections(detections) {
    for (const auto &detection : detections) {
      this->noiseModels.push_back(detection.getDiagonal());
    }
  }

  /** Contructor with cached detection index */
  LooselyCoupledDetectionFactor(gtsam::Key robotPoseKey,
                                gtsam::Key objectPoseKey,
                                std::vector<Detection> detections,
                                int cachedDetectionIndex)
      : Base(boost::shared_ptr<gtsam::noiseModel::Diagonal>(), objectPoseKey),
        robotPoseKey_(robotPoseKey),
        detections(detections),
        cachedDetectionIndex(cachedDetectionIndex) {
    for (const auto &detection : detections) {
      this->noiseModels.push_back(detection.getDiagonal());
    }
  }

  /** print */
  virtual void
  print(const std::string &s,
        const gtsam::KeyFormatter &keyFormatter = gtsam::DefaultKeyFormatter) const {
    std::cout << s << "LooselyCoupledDetectionFactor("
              << keyFormatter(this->robotPoseKey()) << ","
              << keyFormatter(this->objectPoseKey()) << ")\n";
  }

  /** equals */
  virtual bool equals(const NonlinearFactor &expected, double tol = 1e-9) const {
    const This *e = dynamic_cast<const This *>(&expected);
    return e != NULL && Base::equals(*e, tol);
  }

  gtsam::Key robotPoseKey() const { return this->robotPoseKey_; }
  gtsam::Key objectPoseKey() const { return this->key(); }
  const std::vector<Detection> &getDetections() const { return this->detections; }
  const int getCachedDetectionIndex() const { return this->cachedDetectionIndex; }

  virtual gtsam::Vector
  evaluateError(const gtsam::Pose3 &robotPose,
                const gtsam::Pose3 &objectPose,
                const gtsam::Pose3 &measured,
                boost::optional<gtsam::Matrix &> H1 = boost::none) const;

  virtual gtsam::Vector
  evaluateError(const gtsam::Pose3 &x, boost::optional<gtsam::Matrix &> H = boost::none) const override {
    throw std::runtime_error("This evaluateError function is revoked.");
  }

  virtual gtsam::Vector unwhitenedError(const gtsam::Pose3 &measured,
                                        const gtsam::Values &x,
                                        boost::optional<std::vector<gtsam::Matrix> &> H = boost::none) const;

  virtual gtsam::Vector unwhitenedError(const gtsam::Values &x,
                                        boost::optional<std::vector<gtsam::Matrix> &> H = boost::none) const override {
    throw std::runtime_error("This unwhitenedError function is revoked.");
  };

  virtual boost::shared_ptr<gtsam::GaussianFactor>
  linearize(const gtsam::Values &c) const override;

  virtual double error(const gtsam::Values &c) const override;

  std::size_t size() const {
    return 1;
  }

  virtual size_t dim() const override { return 6; }
};

class ConstantVelocityFactor : public gtsam::BetweenFactor<gtsam::Pose3> {
 private:
  using This = ConstantVelocityFactor;
  using Base = gtsam::BetweenFactor<gtsam::Pose3>;

 public:
  ConstantVelocityFactor(gtsam::Key key1,
                         gtsam::Key key2,
                         const gtsam::SharedNoiseModel &model = nullptr)
      : Base(key1, key2, gtsam::Pose3::identity(), model) {
  }

  virtual ~ConstantVelocityFactor() {}

  ///@name Testable
  ///@{

  /** print */
  virtual void
  print(const std::string &s,
        const gtsam::KeyFormatter &keyFormatter = gtsam::DefaultKeyFormatter) const {
    std::cout << s << "ConstantVelocityFactor("
              << keyFormatter(this->key1()) << ","
              << keyFormatter(this->key2()) << ")\n";
    this->noiseModel_->print("  noise model: ");
  }

  /** equals */
  virtual bool equals(const NonlinearFactor &expected, double tol = 1e-9) const {
    const This *e = dynamic_cast<const This *>(&expected);
    return e != NULL && Base::equals(*e, tol);
  }
};

class StablePoseFactor : public gtsam::NoiseModelFactor3<gtsam::Pose3,
                                                         gtsam::Pose3,
                                                         gtsam::Pose3> {
 private:
  using This = StablePoseFactor;
  using Base = gtsam::NoiseModelFactor3<gtsam::Pose3,
                                        gtsam::Pose3,
                                        gtsam::Pose3>;

 protected:
  double deltaTime;

 public:
  /** Constructor */
  StablePoseFactor(gtsam::Key previousPoseKey,
                   gtsam::Key velocityKey,
                   gtsam::Key nextPoseKey,
                   double deltaTime,
                   const gtsam::SharedNoiseModel &model = nullptr)
      : Base(model, previousPoseKey, velocityKey, nextPoseKey), deltaTime(deltaTime) {
  }

  /** print */
  virtual void
  print(const std::string &s,
        const gtsam::KeyFormatter &keyFormatter = gtsam::DefaultKeyFormatter) const {
    std::cout << s << "StablePoseFactor("
              << keyFormatter(this->key1()) << ","
              << keyFormatter(this->key2()) << ","
              << keyFormatter(this->key3()) << ")"
              << "  dt = " << this->deltaTime << '\n';
    this->noiseModel_->print("  noise model: ");
  }

  /** equals */
  virtual bool equals(const NonlinearFactor &expected, double tol = 1e-9) const {
    const This *e = dynamic_cast<const This *>(&expected);
    return e != NULL && Base::equals(*e, tol);
  }

  inline gtsam::Key previousPoseKey() const { return key1(); }
  inline gtsam::Key velocityKey() const { return key2(); }
  inline gtsam::Key nextPoseKey() const { return key3(); }

  virtual gtsam::Vector
  evaluateError(const gtsam::Pose3 &previousPose,
                const gtsam::Pose3 &velocity,
                const gtsam::Pose3 &nextPose,
                boost::optional<gtsam::Matrix &> H1 = boost::none,
                boost::optional<gtsam::Matrix &> H2 = boost::none,
                boost::optional<gtsam::Matrix &> H3 = boost::none) const override;
};

================================================
FILE: include/solver.h
================================================
// Copyright 2022 Yu-Kai Lin. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.
#pragma once

#include <gtsam/nonlinear/ISAM2.h>

#include "factor.h"

class MaxMixtureISAM2 : public gtsam::ISAM2 {
 public:
  explicit MaxMixtureISAM2(const gtsam::ISAM2Params& params) : gtsam::ISAM2(params){};

  MaxMixtureISAM2() : gtsam::ISAM2(){};

  virtual gtsam::ISAM2Result update(
      const gtsam::NonlinearFactorGraph& newFactors                            = gtsam::NonlinearFactorGraph(),
      const gtsam::Values& newTheta                                            = gtsam::Values(),
      const gtsam::FactorIndices& removeFactorIndices                          = gtsam::FactorIndices(),
      const boost::optional<gtsam::FastMap<gtsam::Key, int> >& constrainedKeys = boost::none,
      const boost::optional<gtsam::FastList<gtsam::Key> >& noRelinKeys         = boost::none,
      const boost::optional<gtsam::FastList<gtsam::Key> >& extraReelimKeys     = boost::none,
      bool force_relinearize                                                   = false) override;

  /**
   * Add variable, update, and relinearize if need; almost everything is same as
   * the original ISAM2 solver, except for the relinearization condition for
   * max-mixture models.
   */
  virtual gtsam::ISAM2Result
  update(const gtsam::NonlinearFactorGraph& newFactors,
         const gtsam::Values& newTheta,
         const gtsam::ISAM2UpdateParams& updateParams) override;
};

================================================
FILE: include/utility.h
================================================
#pragma once
#ifndef _UTILITY_LIDAR_ODOMETRY_H_
#define _UTILITY_LIDAR_ODOMETRY_H_

#include <ros/ros.h>

#include <nav_msgs/Odometry.h>
#include <nav_msgs/Path.h>
#include <sensor_msgs/Imu.h>
#include <sensor_msgs/NavSatFix.h>
#include <sensor_msgs/PointCloud2.h>
#include <std_msgs/Float64MultiArray.h>
#include <std_msgs/Header.h>
#include <visualization_msgs/Marker.h>
#include <visualization_msgs/MarkerArray.h>

#include <opencv/cv.h>

#include <pcl/common/common.h>
#include <pcl/common/transforms.h>
#include <pcl/filters/crop_box.h>
#include <pcl/filters/filter.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/io/pcd_io.h>
#include <pcl/kdtree/kdtree_flann.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/range_image/range_image.h>
#include <pcl/registration/icp.h>
#include <pcl_conversions/pcl_conversions.h>
#include <pcl/search/impl/search.hpp>

#include <tf/LinearMath/Quaternion.h>
#include <tf/transform_broadcaster.h>
#include <tf/transform_datatypes.h>
#include <tf/transform_listener.h>

#include <algorithm>
#include <array>
#include <cfloat>
#include <cmath>
#include <ctime>
#include <deque>
#include <fstream>
#include <iomanip>
#include <iostream>
#include <iterator>
#include <limits>
#include <mutex>
#include <queue>
#include <sstream>
#include <string>
#include <thread>
#include <vector>

using namespace std;

typedef pcl::PointXYZI PointType;

enum class SensorType { VELODYNE,
                        OUSTER };

class ParamServer {
 public:
  ros::NodeHandle nh;

  std::string robot_id;

  // Topics
  string pointCloudTopic;
  string imuTopic;
  string odomTopic;
  string gpsTopic;

  // Frames
  string lidarFrame;
  string baselinkFrame;
  string odometryFrame;
  string mapFrame;

  // GPS Settings
  bool useImuHeadingInitialization;
  bool useGpsElevation;
  float gpsCovThreshold;
  float poseCovThreshold;

  // Save pcd
  bool savePCD;
  string savePCDDirectory;

  // Lidar Sensor Configuration
  SensorType sensor;
  int N_SCAN;
  int Horizon_SCAN;
  int downsampleRate;
  float lidarMinRange;
  float lidarMaxRange;

  // IMU
  float imuAccNoise;
  float imuGyrNoise;
  float imuAccBiasN;
  float imuGyrBiasN;
  float imuGravity;
  float imuRPYWeight;
  vector<double> extRotV;
  vector<double> extRPYV;
  vector<double> extTransV;
  Eigen::Matrix3d extRot;
  Eigen::Matrix3d extRPY;
  Eigen::Vector3d extTrans;
  Eigen::Quaterniond extQRPY;

  // LOAM
  float edgeThreshold;
  float surfThreshold;
  int edgeFeatureMinValidNum;
  int surfFeatureMinValidNum;

  // voxel filter paprams
  float odometrySurfLeafSize;
  float mappingCornerLeafSize;
  float mappingSurfLeafSize;

  float z_tollerance;
  float rotation_tollerance;

  // CPU Params
  int numberOfCores;
  double mappingProcessInterval;

  // Surrounding map
  float surroundingkeyframeAddingDistThreshold;
  float surroundingkeyframeAddingAngleThreshold;
  float surroundingKeyframeDensity;
  float surroundingKeyframeSearchRadius;

  // Loop closure
  bool loopClosureEnableFlag;
  float loopClosureFrequency;
  int surroundingKeyframeSize;
  float historyKeyframeSearchRadius;
  float historyKeyframeSearchTimeDiff;
  int historyKeyframeSearchNum;
  float historyKeyframeFitnessScore;

  // global map visualization radius
  float globalMapVisualizationSearchRadius;
  float globalMapVisualizationPoseDensity;
  float globalMapVisualizationLeafSize;

  // Dynamic object data association
  float detectionMatchThreshold;
  vector<double> dataAssociationVarianceVector;
  vector<double> earlyLooselyCoupledMatchingVarianceVector;
  vector<double> looselyCoupledMatchingVarianceVector;
  vector<double> tightlyCoupledMatchingVarianceVector;

  Eigen::Matrix<double, 6, 1> dataAssociationVarianceEigenVector;
  Eigen::Matrix<double, 6, 1> earlyLooselyCoupledMatchingVarianceEigenVector;
  Eigen::Matrix<double, 6, 1> looselyCoupledMatchingVarianceEigenVector;
  Eigen::Matrix<double, 6, 1> tightlyCoupledMatchingVarianceEigenVector;

  // Factor covariance matrices (presented as diagonal vectors)
  vector<double> priorOdometryDiagonalVarianceVector;
  vector<double> odometryDiagonalVarianceVector;
  vector<double> earlyConstantVelocityDiagonalVarianceVector;
  vector<double> constantVelocityDiagonalVarianceVector;
  vector<double> motionDiagonalVarianceVector;
  vector<double> looselyCoupledDetectionVarianceVector;
  vector<double> tightlyCoupledDetectionVarianceVector;

  Eigen::Matrix<double, 6, 1> priorOdometryDiagonalVarianceEigenVector;
  Eigen::Matrix<double, 6, 1> odometryDiagonalVarianceEigenVector;
  Eigen::Matrix<double, 6, 1> earlyConstantVelocityDiagonalVarianceEigenVector;
  Eigen::Matrix<double, 6, 1> constantVelocityDiagonalVarianceEigenVector;
  Eigen::Matrix<double, 6, 1> motionDiagonalVarianceEigenVector;
  Eigen::Matrix<double, 6, 1> looselyCoupledDetectionVarianceEigenVector;
  Eigen::Matrix<double, 6, 1> tightlyCoupledDetectionVarianceEigenVector;

  // Gentle coupling options
  int numberOfEarlySteps;
  int numberOfPreLooseCouplingSteps;
  int numberOfVelocityConsistencySteps;
  int numberOfInterLooseCouplingSteps;
  float tightCouplingDetectionErrorThreshold;

  float objectAngularVelocityConsistencyVarianceThreshold;
  float objectLinearVelocityConsistencyVarianceThreshold;

  // Tracking
  int trackingStepsForLostObject;

  ParamServer() {
    nh.param<std::string>("/robot_id", robot_id, "roboat");

    nh.param<std::string>("lio_segmot/pointCloudTopic", pointCloudTopic, "points_raw");
    nh.param<std::string>("lio_segmot/imuTopic", imuTopic, "imu_correct");
    nh.param<std::string>("lio_segmot/odomTopic", odomTopic, "odometry/imu");
    nh.param<std::string>("lio_segmot/gpsTopic", gpsTopic, "odometry/gps");

    nh.param<std::string>("lio_segmot/lidarFrame", lidarFrame, "base_link");
    nh.param<std::string>("lio_segmot/baselinkFrame", baselinkFrame, "base_link");
    nh.param<std::string>("lio_segmot/odometryFrame", odometryFrame, "odom");
    nh.param<std::string>("lio_segmot/mapFrame", mapFrame, "map");

    nh.param<bool>("lio_segmot/useImuHeadingInitialization", useImuHeadingInitialization, false);
    nh.param<bool>("lio_segmot/useGpsElevation", useGpsElevation, false);
    nh.param<float>("lio_segmot/gpsCovThreshold", gpsCovThreshold, 2.0);
    nh.param<float>("lio_segmot/poseCovThreshold", poseCovThreshold, 25.0);

    nh.param<bool>("lio_segmot/savePCD", savePCD, false);
    nh.param<std::string>("lio_segmot/savePCDDirectory", savePCDDirectory, "/Downloads/LOAM/");

    std::string sensorStr;
    nh.param<std::string>("lio_segmot/sensor", sensorStr, "");
    if (sensorStr == "velodyne") {
      sensor = SensorType::VELODYNE;
    } else if (sensorStr == "ouster") {
      sensor = SensorType::OUSTER;
    } else {
      ROS_ERROR_STREAM(
          "Invalid sensor type (must be either 'velodyne' or 'ouster'): " << sensorStr);
      ros::shutdown();
    }

    nh.param<int>("lio_segmot/N_SCAN", N_SCAN, 16);
    nh.param<int>("lio_segmot/Horizon_SCAN", Horizon_SCAN, 1800);
    nh.param<int>("lio_segmot/downsampleRate", downsampleRate, 1);
    nh.param<float>("lio_segmot/lidarMinRange", lidarMinRange, 1.0);
    nh.param<float>("lio_segmot/lidarMaxRange", lidarMaxRange, 1000.0);

    nh.param<float>("lio_segmot/imuAccNoise", imuAccNoise, 0.01);
    nh.param<float>("lio_segmot/imuGyrNoise", imuGyrNoise, 0.001);
    nh.param<float>("lio_segmot/imuAccBiasN", imuAccBiasN, 0.0002);
    nh.param<float>("lio_segmot/imuGyrBiasN", imuGyrBiasN, 0.00003);
    nh.param<float>("lio_segmot/imuGravity", imuGravity, 9.80511);
    nh.param<float>("lio_segmot/imuRPYWeight", imuRPYWeight, 0.01);
    nh.param<vector<double>>("lio_segmot/extrinsicRot", extRotV, vector<double>());
    nh.param<vector<double>>("lio_segmot/extrinsicRPY", extRPYV, vector<double>());
    nh.param<vector<double>>("lio_segmot/extrinsicTrans", extTransV, vector<double>());
    extRot   = Eigen::Map<const Eigen::Matrix<double, -1, -1, Eigen::RowMajor>>(extRotV.data(), 3, 3);
    extRPY   = Eigen::Map<const Eigen::Matrix<double, -1, -1, Eigen::RowMajor>>(extRPYV.data(), 3, 3);
    extTrans = Eigen::Map<const Eigen::Matrix<double, -1, -1, Eigen::RowMajor>>(extTransV.data(), 3, 1);
    extQRPY  = Eigen::Quaterniond(extRPY);

    nh.param<float>("lio_segmot/edgeThreshold", edgeThreshold, 0.1);
    nh.param<float>("lio_segmot/surfThreshold", surfThreshold, 0.1);
    nh.param<int>("lio_segmot/edgeFeatureMinValidNum", edgeFeatureMinValidNum, 10);
    nh.param<int>("lio_segmot/surfFeatureMinValidNum", surfFeatureMinValidNum, 100);

    nh.param<float>("lio_segmot/odometrySurfLeafSize", odometrySurfLeafSize, 0.2);
    nh.param<float>("lio_segmot/mappingCornerLeafSize", mappingCornerLeafSize, 0.2);
    nh.param<float>("lio_segmot/mappingSurfLeafSize", mappingSurfLeafSize, 0.2);

    nh.param<float>("lio_segmot/z_tollerance", z_tollerance, FLT_MAX);
    nh.param<float>("lio_segmot/rotation_tollerance", rotation_tollerance, FLT_MAX);

    nh.param<int>("lio_segmot/numberOfCores", numberOfCores, 2);
    nh.param<double>("lio_segmot/mappingProcessInterval", mappingProcessInterval, 0.15);

    nh.param<float>("lio_segmot/surroundingkeyframeAddingDistThreshold", surroundingkeyframeAddingDistThreshold, 1.0);
    nh.param<float>("lio_segmot/surroundingkeyframeAddingAngleThreshold", surroundingkeyframeAddingAngleThreshold, 0.2);
    nh.param<float>("lio_segmot/surroundingKeyframeDensity", surroundingKeyframeDensity, 1.0);
    nh.param<float>("lio_segmot/surroundingKeyframeSearchRadius", surroundingKeyframeSearchRadius, 50.0);

    nh.param<bool>("lio_segmot/loopClosureEnableFlag", loopClosureEnableFlag, false);
    nh.param<float>("lio_segmot/loopClosureFrequency", loopClosureFrequency, 1.0);
    nh.param<int>("lio_segmot/surroundingKeyframeSize", surroundingKeyframeSize, 50);
    nh.param<float>("lio_segmot/historyKeyframeSearchRadius", historyKeyframeSearchRadius, 10.0);
    nh.param<float>("lio_segmot/historyKeyframeSearchTimeDiff", historyKeyframeSearchTimeDiff, 30.0);
    nh.param<int>("lio_segmot/historyKeyframeSearchNum", historyKeyframeSearchNum, 25);
    nh.param<float>("lio_segmot/historyKeyframeFitnessScore", historyKeyframeFitnessScore, 0.3);

    nh.param<float>("lio_segmot/globalMapVisualizationSearchRadius", globalMapVisualizationSearchRadius, 1e3);
    nh.param<float>("lio_segmot/globalMapVisualizationPoseDensity", globalMapVisualizationPoseDensity, 10.0);
    nh.param<float>("lio_segmot/globalMapVisualizationLeafSize", globalMapVisualizationLeafSize, 1.0);

    nh.param<float>("lio_segmot/detectionMatchThreshold", detectionMatchThreshold, 19.5);
    nh.param<vector<double>>("lio_segmot/dataAssociationVarianceVector", dataAssociationVarianceVector, {1e-4, 1e-4, 1e-4, 1e-2, 2e-3, 2e-3});
    nh.param<vector<double>>("lio_segmot/earlyLooselyCoupledMatchingVarianceVector", earlyLooselyCoupledMatchingVarianceVector, {1e-4, 1e-4, 1e-4, 1e-2, 2e-3, 2e-3});
    nh.param<vector<double>>("lio_segmot/looselyCoupledMatchingVarianceVector", looselyCoupledMatchingVarianceVector, {1e-4, 1e-4, 1e-4, 1e-2, 2e-3, 2e-3});
    nh.param<vector<double>>("lio_segmot/tightlyCoupledMatchingVarianceVector", tightlyCoupledMatchingVarianceVector, {1e-4, 1e-4, 1e-4, 1e-2, 2e-3, 2e-3});

    nh.param<vector<double>>("lio_segmot/priorOdometryDiagonalVarianceVector", priorOdometryDiagonalVarianceVector, {1e-2, 1e-2, M_PI * M_PI, 1e8, 1e8, 1e8});
    nh.param<vector<double>>("lio_segmot/odometryDiagonalVarianceVector", odometryDiagonalVarianceVector, {1e-6, 1e-6, 1e-6, 1e-4, 1e-4, 1e-4});
    nh.param<vector<double>>("lio_segmot/earlyConstantVelocityDiagonalVarianceVector", earlyConstantVelocityDiagonalVarianceVector, {1e-3, 1e-3, 1e-3, 2e-1, 1e-1, 1e-1});
    nh.param<vector<double>>("lio_segmot/constantVelocityDiagonalVarianceVector", constantVelocityDiagonalVarianceVector, {1e-3, 1e-3, 1e-3, 2e-1, 1e-1, 1e-1});
    nh.param<vector<double>>("lio_segmot/motionDiagonalVarianceVector", motionDiagonalVarianceVector, {1e-4, 1e-4, 1e-2, 1e-1, 1e-2, 1e-2});
    nh.param<vector<double>>("lio_segmot/looselyCoupledDetectionVarianceVector", looselyCoupledDetectionVarianceVector, {1e-4, 1e-4, 1e-4, 1e-2, 2e-3, 2e-3});
    nh.param<vector<double>>("lio_segmot/tightlyCoupledDetectionVarianceVector", tightlyCoupledDetectionVarianceVector, {1e-4, 1e-4, 1e-4, 1e-2, 2e-3, 2e-3});

    dataAssociationVarianceEigenVector             = Eigen::Map<const Eigen::Matrix<double, 6, 1>>(dataAssociationVarianceVector.data());
    earlyLooselyCoupledMatchingVarianceEigenVector = Eigen::Map<const Eigen::Matrix<double, 6, 1>>(earlyLooselyCoupledMatchingVarianceVector.data());
    looselyCoupledMatchingVarianceEigenVector      = Eigen::Map<const Eigen::Matrix<double, 6, 1>>(looselyCoupledMatchingVarianceVector.data());
    tightlyCoupledMatchingVarianceEigenVector      = Eigen::Map<const Eigen::Matrix<double, 6, 1>>(tightlyCoupledMatchingVarianceVector.data());

    priorOdometryDiagonalVarianceEigenVector         = Eigen::Map<const Eigen::Matrix<double, 6, 1>>(priorOdometryDiagonalVarianceVector.data());
    odometryDiagonalVarianceEigenVector              = Eigen::Map<const Eigen::Matrix<double, 6, 1>>(odometryDiagonalVarianceVector.data());
    earlyConstantVelocityDiagonalVarianceEigenVector = Eigen::Map<const Eigen::Matrix<double, 6, 1>>(earlyConstantVelocityDiagonalVarianceVector.data());
    constantVelocityDiagonalVarianceEigenVector      = Eigen::Map<const Eigen::Matrix<double, 6, 1>>(constantVelocityDiagonalVarianceVector.data());
    motionDiagonalVarianceEigenVector                = Eigen::Map<const Eigen::Matrix<double, 6, 1>>(motionDiagonalVarianceVector.data());
    looselyCoupledDetectionVarianceEigenVector       = Eigen::Map<const Eigen::Matrix<double, 6, 1>>(looselyCoupledDetectionVarianceVector.data());
    tightlyCoupledDetectionVarianceEigenVector       = Eigen::Map<const Eigen::Matrix<double, 6, 1>>(tightlyCoupledDetectionVarianceVector.data());

    nh.param<int>("lio_segmot/numberOfEarlySteps", numberOfEarlySteps, 1);
    nh.param<int>("lio_segmot/numberOfPreLooseCouplingSteps", numberOfPreLooseCouplingSteps, 10);
    nh.param<int>("lio_segmot/numberOfVelocityConsistencySteps", numberOfVelocityConsistencySteps, 4);
    nh.param<int>("lio_segmot/numberOfInterLooseCouplingSteps", numberOfInterLooseCouplingSteps, 0);
    nh.param<float>("lio_segmot/tightCouplingDetectionErrorThreshold", tightCouplingDetectionErrorThreshold, 500.0);

    nh.param<float>("lio_segmot/objectAngularVelocityConsistencyVarianceThreshold", objectAngularVelocityConsistencyVarianceThreshold, 1e-5);
    nh.param<float>("lio_segmot/objectLinearVelocityConsistencyVarianceThreshold", objectLinearVelocityConsistencyVarianceThreshold, 1e-2);

    nh.param<int>("lio_segmot/trackingStepsForLostObject", trackingStepsForLostObject, 3);

    usleep(100);
  }

  sensor_msgs::Imu imuConverter(const sensor_msgs::Imu &imu_in) {
    sensor_msgs::Imu imu_out = imu_in;
    // rotate acceleration
    Eigen::Vector3d acc(imu_in.linear_acceleration.x, imu_in.linear_acceleration.y, imu_in.linear_acceleration.z);
    acc                           = extRot * acc;
    imu_out.linear_acceleration.x = acc.x();
    imu_out.linear_acceleration.y = acc.y();
    imu_out.linear_acceleration.z = acc.z();
    // rotate gyroscope
    Eigen::Vector3d gyr(imu_in.angular_velocity.x, imu_in.angular_velocity.y, imu_in.angular_velocity.z);
    gyr                        = extRot * gyr;
    imu_out.angular_velocity.x = gyr.x();
    imu_out.angular_velocity.y = gyr.y();
    imu_out.angular_velocity.z = gyr.z();
    // rotate roll pitch yaw
    Eigen::Quaterniond q_from(imu_in.orientation.w, imu_in.orientation.x, imu_in.orientation.y, imu_in.orientation.z);
    Eigen::Quaterniond q_final = q_from * extQRPY;
    imu_out.orientation.x      = q_final.x();
    imu_out.orientation.y      = q_final.y();
    imu_out.orientation.z      = q_final.z();
    imu_out.orientation.w      = q_final.w();

    if (sqrt(q_final.x() * q_final.x() + q_final.y() * q_final.y() + q_final.z() * q_final.z() + q_final.w() * q_final.w()) < 0.1) {
      ROS_ERROR("Invalid quaternion, please use a 9-axis IMU!");
      ros::shutdown();
    }

    return imu_out;
  }
};

sensor_msgs::PointCloud2 publishCloud(ros::Publisher *thisPub, pcl::PointCloud<PointType>::Ptr thisCloud, ros::Time thisStamp, std::string thisFrame) {
  sensor_msgs::PointCloud2 tempCloud;
  pcl::toROSMsg(*thisCloud, tempCloud);
  tempCloud.header.stamp    = thisStamp;
  tempCloud.header.frame_id = thisFrame;
  if (thisPub->getNumSubscribers() != 0)
    thisPub->publish(tempCloud);
  return tempCloud;
}

template <typename T>
double ROS_TIME(T msg) {
  return msg->header.stamp.toSec();
}

template <typename T>
void imuAngular2rosAngular(sensor_msgs::Imu *thisImuMsg, T *angular_x, T *angular_y, T *angular_z) {
  *angular_x = thisImuMsg->angular_velocity.x;
  *angular_y = thisImuMsg->angular_velocity.y;
  *angular_z = thisImuMsg->angular_velocity.z;
}

template <typename T>
void imuAccel2rosAccel(sensor_msgs::Imu *thisImuMsg, T *acc_x, T *acc_y, T *acc_z) {
  *acc_x = thisImuMsg->linear_acceleration.x;
  *acc_y = thisImuMsg->linear_acceleration.y;
  *acc_z = thisImuMsg->linear_acceleration.z;
}

template <typename T>
void imuRPY2rosRPY(sensor_msgs::Imu *thisImuMsg, T *rosRoll, T *rosPitch, T *rosYaw) {
  double imuRoll, imuPitch, imuYaw;
  tf::Quaternion orientation;
  tf::quaternionMsgToTF(thisImuMsg->orientation, orientation);
  tf::Matrix3x3(orientation).getRPY(imuRoll, imuPitch, imuYaw);

  *rosRoll  = imuRoll;
  *rosPitch = imuPitch;
  *rosYaw   = imuYaw;
}

float pointDistance(PointType p) {
  return sqrt(p.x * p.x + p.y * p.y + p.z * p.z);
}

float pointDistance(PointType p1, PointType p2) {
  return sqrt((p1.x - p2.x) * (p1.x - p2.x) + (p1.y - p2.y) * (p1.y - p2.y) + (p1.z - p2.z) * (p1.z - p2.z));
}

#endif


================================================
FILE: launch/debug.launch
================================================
<launch>

    <arg name="project" default="lio_segmot"/>
    
    <!-- Parameters -->
    <rosparam file="$(find lio_segmot)/config/params.yaml" command="load" />

    <!--- LOAM -->
    <include file="$(find lio_segmot)/launch/include/module_loam_debug.launch" />

    <!--- Robot State TF -->
    <include file="$(find lio_segmot)/launch/include/module_robot_state_publisher.launch" />

    <!--- Run Navsat -->
    <include file="$(find lio_segmot)/launch/include/module_navsat.launch" />

    <!--- Run Rviz-->
    <include file="$(find lio_segmot)/launch/include/module_rviz.launch" />

</launch>


================================================
FILE: launch/gdbserver_debug.launch
================================================
<launch>

    <arg name="project" default="lio_segmot"/>
    
    <!-- Parameters -->
    <rosparam file="$(find lio_segmot)/config/params.yaml" command="load" />

    <!--- LOAM -->
    <include file="$(find lio_segmot)/launch/include/module_loam_gdbserver_debug.launch" />

    <!--- Robot State TF -->
    <include file="$(find lio_segmot)/launch/include/module_robot_state_publisher.launch" />

    <!--- Run Navsat -->
    <include file="$(find lio_segmot)/launch/include/module_navsat.launch" />

    <!--- Run Rviz-->
    <include file="$(find lio_segmot)/launch/include/module_rviz.launch" />

</launch>


================================================
FILE: launch/include/config/robot.urdf.xacro
================================================
<?xml version="1.0"?>
<robot name="lio" xmlns:xacro="http://tixiaoshan.github.io/">
  <xacro:property name="PI" value="3.1415926535897931" />

  <link name="chassis_link"></link>

  <link name="base_link"></link>
  <joint name="base_link_joint" type="fixed">
    <parent link="base_link"/>
    <child link="chassis_link" />
    <origin xyz="0 0 0" rpy="0 0 0" />
  </joint>

  <link name="imu_link"> </link>
  <joint name="imu_joint" type="fixed">
    <parent link="chassis_link" />
    <child link="imu_link" />
    <origin xyz="0 0 0" rpy="0 0 0" />
  </joint>

  <link name="velodyne"> </link>
  <joint name="velodyne_joint" type="fixed">
    <parent link="chassis_link" />
    <child link="velodyne" />
    <origin xyz="0 0 0" rpy="0 0 0" />
  </joint>

  <link name="navsat_link"> </link>
  <joint name="navsat_joint" type="fixed">
    <parent link="chassis_link" />
    <child link="navsat_link" />
    <origin xyz="0 0 0" rpy="0 0 0" />
  </joint>

</robot>


================================================
FILE: launch/include/config/rviz.rviz
================================================
Panels:
  - Class: rviz/Displays
    Help Height: 191
    Name: Displays
    Property Tree Widget:
      Expanded:
        - /Global Options1
        - /Status1
        - /links1
        - /Odometry1
        - /Point Cloud1
        - /Feature Cloud1
        - /Mapping1
        - /Mapping1/Map (global)1
        - /Tracking1
        - /Tracking1/Dynamic Objects1
      Splitter Ratio: 0.5940054655075073
    Tree Height: 686
  - Class: rviz/Selection
    Name: Selection
  - Class: rviz/Views
    Expanded:
      - /Current View1
    Name: Views
    Splitter Ratio: 0.5
  - Class: rviz/Tool Properties
    Expanded: ~
    Name: Tool Properties
    Splitter Ratio: 0.5
Preferences:
  PromptSaveOnExit: true
Toolbars:
  toolButtonStyle: 2
Visualization Manager:
  Class: ""
  Displays:
    - Class: rviz/Group
      Displays:
        - Class: rviz/TF
          Enabled: false
          Frame Timeout: 999
          Frames:
            All Enabled: false
          Marker Scale: 3
          Name: TF
          Show Arrows: true
          Show Axes: true
          Show Names: true
          Tree:
            {}
          Update Interval: 0
          Value: false
        - Class: rviz/Axes
          Enabled: true
          Length: 2
          Name: map
          Radius: 0.5
          Reference Frame: map
          Value: true
        - Class: rviz/Axes
          Enabled: true
          Length: 1
          Name: base_link
          Radius: 0.30000001192092896
          Reference Frame: base_link
          Value: true
        - Class: rviz/Axes
          Enabled: true
          Length: 1
          Name: lidar_link
          Radius: 0.10000000149011612
          Reference Frame: lidar_link
          Value: true
      Enabled: true
      Name: links
    - Class: rviz/Group
      Displays:
        - Angle Tolerance: 0.10000000149011612
          Class: rviz/Odometry
          Covariance:
            Orientation:
              Alpha: 0.5
              Color: 255; 255; 127
              Color Style: Unique
              Frame: Local
              Offset: 1
              Scale: 1
              Value: true
            Position:
              Alpha: 0.30000001192092896
              Color: 204; 51; 204
              Scale: 1
              Value: true
            Value: false
          Enabled: true
          Keep: 300
          Name: Odom IMU
          Position Tolerance: 0.10000000149011612
          Shape:
            Alpha: 1
            Axes Length: 0.5
            Axes Radius: 0.10000000149011612
            Color: 255; 25; 0
            Head Length: 0.30000001192092896
            Head Radius: 0.10000000149011612
            Shaft Length: 1
            Shaft Radius: 0.05000000074505806
            Value: Axes
          Topic: /odometry/imu
          Unreliable: false
          Value: true
        - Angle Tolerance: 0.10000000149011612
          Class: rviz/Odometry
          Covariance:
            Orientation:
              Alpha: 0.5
              Color: 255; 255; 127
              Color Style: Unique
              Frame: Local
              Offset: 1
              Scale: 1
              Value: true
            Position:
              Alpha: 0.30000001192092896
              Color: 204; 51; 204
              Scale: 1
              Value: true
            Value: false
          Enabled: true
          Keep: 300
          Name: Odom GPS
          Position Tolerance: 0.10000000149011612
          Shape:
            Alpha: 1
            Axes Length: 1
            Axes Radius: 0.30000001192092896
            Color: 255; 25; 0
            Head Length: 0.30000001192092896
            Head Radius: 0.10000000149011612
            Shaft Length: 1
            Shaft Radius: 0.05000000074505806
            Value: Axes
          Topic: /odometry/gps
          Unreliable: false
          Value: true
        - Angle Tolerance: 0.10000000149011612
          Class: rviz/Odometry
          Covariance:
            Orientation:
              Alpha: 0.5
              Color: 255; 255; 127
              Color Style: Unique
              Frame: Local
              Offset: 1
              Scale: 1
              Value: true
            Position:
              Alpha: 0.30000001192092896
              Color: 204; 51; 204
              Scale: 1
              Value: true
            Value: false
          Enabled: false
          Keep: 300
          Name: Odom lidar
          Position Tolerance: 0.10000000149011612
          Shape:
            Alpha: 1
            Axes Length: 0.25
            Axes Radius: 0.10000000149011612
            Color: 255; 25; 0
            Head Length: 0.30000001192092896
            Head Radius: 0.10000000149011612
            Shaft Length: 1
            Shaft Radius: 0.05000000074505806
            Value: Axes
          Topic: /lio_segmot/mapping/odometry
          Unreliable: false
          Value: false
      Enabled: false
      Name: Odometry
    - Class: rviz/Group
      Displays:
        - Alpha: 1
          Autocompute Intensity Bounds: false
          Autocompute Value Bounds:
            Max Value: 10
            Min Value: -10
            Value: true
          Axis: Z
          Channel Name: intensity
          Class: rviz/PointCloud2
          Color: 255; 255; 255
          Color Transformer: Intensity
          Decay Time: 0
          Enabled: true
          Invert Rainbow: false
          Max Color: 191; 64; 64
          Max Intensity: 0.9900000095367432
          Min Color: 191; 64; 64
          Min Intensity: 0
          Name: Velodyne
          Position Transformer: XYZ
          Queue Size: 10
          Selectable: false
          Size (Pixels): 2
          Size (m): 0.009999999776482582
          Style: Points
          Topic: /lio_segmot/mapping/cloud_deskewed
          Unreliable: false
          Use Fixed Frame: true
          Use rainbow: false
          Value: true
        - Alpha: 1
          Autocompute Intensity Bounds: true
          Autocompute Value Bounds:
            Max Value: 10
            Min Value: -10
            Value: true
          Axis: Z
          Channel Name: intensity
          Class: rviz/PointCloud2
          Color: 255; 255; 255
          Color Transformer: Intensity
          Decay Time: 0
          Enabled: false
          Invert Rainbow: false
          Max Color: 255; 255; 255
          Min Color: 0; 0; 0
          Name: Reg Cloud
          Position Transformer: XYZ
          Queue Size: 10
          Selectable: true
          Size (Pixels): 3
          Size (m): 0.009999999776482582
          Style: Points
          Topic: /lio_segmot/mapping/cloud_registered
          Unreliable: false
          Use Fixed Frame: true
          Use rainbow: true
          Value: false
        - Class: jsk_rviz_plugin/BoundingBoxArray
          Enabled: true
          Name: Detections
          Topic: /lio_segmot/mapping/detections
          Unreliable: false
          Value: true
          alpha: 0.6000000238418579
          color: 171; 255; 152
          coloring: Flat color
          line width: 0.15000000596046448
          only edge: false
          show coords: false
      Enabled: true
      Name: Point Cloud
    - Class: rviz/Group
      Displays:
        - Alpha: 1
          Autocompute Intensity Bounds: true
          Autocompute Value Bounds:
            Max Value: 10
            Min Value: -10
            Value: true
          Axis: Z
          Channel Name: intensity
          Class: rviz/PointCloud2
          Color: 0; 255; 0
          Color Transformer: FlatColor
          Decay Time: 0
          Enabled: true
          Invert Rainbow: false
          Max Color: 255; 255; 255
          Min Color: 0; 0; 0
          Name: Edge Feature
          Position Transformer: XYZ
          Queue Size: 10
          Selectable: true
          Size (Pixels): 5
          Size (m): 0.009999999776482582
          Style: Points
          Topic: /lio_segmot/feature/cloud_corner
          Unreliable: false
          Use Fixed Frame: true
          Use rainbow: true
          Value: true
        - Alpha: 1
          Autocompute Intensity Bounds: true
          Autocompute Value Bounds:
            Max Value: 10
            Min Value: -10
            Value: true
          Axis: Z
          Channel Name: intensity
          Class: rviz/PointCloud2
          Color: 255; 85; 255
          Color Transformer: FlatColor
          Decay Time: 0
          Enabled: true
          Invert Rainbow: false
          Max Color: 255; 255; 255
          Min Color: 0; 0; 0
          Name: Plannar Feature
          Position Transformer: XYZ
          Queue Size: 10
          Selectable: true
          Size (Pixels): 3
          Size (m): 0.009999999776482582
          Style: Points
          Topic: /lio_segmot/feature/cloud_surface
          Unreliable: false
          Use Fixed Frame: true
          Use rainbow: true
          Value: true
      Enabled: false
      Name: Feature Cloud
    - Class: rviz/Group
      Displays:
        - Alpha: 0.30000001192092896
          Autocompute Intensity Bounds: false
          Autocompute Value Bounds:
            Max Value: 7.4858574867248535
            Min Value: -1.2309398651123047
            Value: true
          Axis: Z
          Channel Name: intensity
          Class: rviz/PointCloud2
          Color: 255; 255; 255
          Color Transformer: Intensity
          Decay Time: 300
          Enabled: false
          Invert Rainbow: false
          Max Color: 46; 52; 54
          Max Intensity: 0.9900000095367432
          Min Color: 46; 52; 54
          Min Intensity: 0
          Name: Map (cloud)
          Position Transformer: XYZ
          Queue Size: 10
          Selectable: false
          Size (Pixels): 2
          Size (m): 0.009999999776482582
          Style: Points
          Topic: /lio_segmot/mapping/cloud_registered
          Unreliable: false
          Use Fixed Frame: true
          Use rainbow: false
          Value: false
        - Alpha: 1
          Autocompute Intensity Bounds: true
          Autocompute Value Bounds:
            Max Value: 36.61034393310547
            Min Value: -2.3476977348327637
            Value: true
          Axis: Z
          Channel Name: intensity
          Class: rviz/PointCloud2
          Color: 255; 255; 255
          Color Transformer: AxisColor
          Decay Time: 0
          Enabled: false
          Invert Rainbow: false
          Max Color: 255; 255; 255
          Min Color: 0; 0; 0
          Name: Map (local)
          Position Transformer: XYZ
          Queue Size: 10
          Selectable: true
          Size (Pixels): 2
          Size (m): 0.10000000149011612
          Style: Flat Squares
          Topic: /lio_segmot/mapping/map_local
          Unreliable: false
          Use Fixed Frame: true
          Use rainbow: true
          Value: false
        - Alpha: 1
          Autocompute Intensity Bounds: true
          Autocompute Value Bounds:
            Max Value: 10
            Min Value: -10
            Value: true
          Axis: Z
          Channel Name: intensity
          Class: rviz/PointCloud2
          Color: 255; 255; 255
          Color Transformer: Intensity
          Decay Time: 0
          Enabled: true
          Invert Rainbow: false
          Max Color: 75; 75; 75
          Min Color: 75; 75; 75
          Name: Map (global)
          Position Transformer: XYZ
          Queue Size: 10
          Selectable: true
          Size (Pixels): 2
          Size (m): 0.009999999776482582
          Style: Points
          Topic: /lio_segmot/mapping/map_global
          Unreliable: false
          Use Fixed Frame: true
          Use rainbow: false
          Value: true
        - Alpha: 1
          Buffer Length: 1
          Class: rviz/Path
          Color: 85; 255; 255
          Enabled: true
          Head Diameter: 0.30000001192092896
          Head Length: 0.20000000298023224
          Length: 0.30000001192092896
          Line Style: Billboards
          Line Width: 0.10000000149011612
          Name: Path (global)
          Offset:
            X: 0
            Y: 0
            Z: 0
          Pose Color: 255; 85; 255
          Pose Style: None
          Radius: 0.029999999329447746
          Shaft Diameter: 0.10000000149011612
          Shaft Length: 0.10000000149011612
          Topic: /lio_segmot/mapping/path
          Unreliable: false
          Value: true
        - Alpha: 1
          Buffer Length: 1
          Class: rviz/Path
          Color: 255; 85; 255
          Enabled: true
          Head Diameter: 0.30000001192092896
          Head Length: 0.20000000298023224
          Length: 0.30000001192092896
          Line Style: Billboards
          Line Width: 0.10000000149011612
          Name: Path (local)
          Offset:
            X: 0
            Y: 0
            Z: 0
          Pose Color: 255; 85; 255
          Pose Style: None
          Radius: 0.029999999329447746
          Shaft Diameter: 0.10000000149011612
          Shaft Length: 0.10000000149011612
          Topic: /lio_segmot/imu/path
          Unreliable: false
          Value: true
        - Alpha: 1
          Autocompute Intensity Bounds: true
          Autocompute Value Bounds:
            Max Value: 20.802837371826172
            Min Value: -0.03934507071971893
            Value: true
          Axis: Z
          Channel Name: intensity
          Class: rviz/PointCloud2
          Color: 255; 255; 255
          Color Transformer: AxisColor
          Decay Time: 0
          Enabled: false
          Invert Rainbow: false
          Max Color: 255; 255; 255
          Min Color: 0; 0; 0
          Name: Loop closure
          Position Transformer: XYZ
          Queue Size: 10
          Selectable: true
          Size (Pixels): 2
          Size (m): 0.30000001192092896
          Style: Flat Squares
          Topic: /lio_segmot/mapping/icp_loop_closure_corrected_cloud
          Unreliable: false
          Use Fixed Frame: true
          Use rainbow: true
          Value: false
        - Class: rviz/MarkerArray
          Enabled: true
          Marker Topic: /lio_segmot/mapping/loop_closure_constraints
          Name: Loop constraint
          Namespaces:
            {}
          Queue Size: 100
          Value: true
        - Class: jsk_rviz_plugin/BoundingBoxArray
          Enabled: false
          Name: Dynamic Objects
          Topic: /lio_segmot/mapping/objects
          Unreliable: false
          Value: false
          alpha: 0.800000011920929
          color: 25; 255; 0
          coloring: Label
          line width: 0.15000000596046448
          only edge: true
          show coords: true
        - Class: rviz/MarkerArray
          Enabled: false
          Marker Topic: /lio_segmot/mapping/object_paths
          Name: Dynamic Object Paths
          Namespaces:
            {}
          Queue Size: 100
          Value: false
        - Class: rviz/MarkerArray
          Enabled: false
          Marker Topic: /lio_segmot/mapping/object_labels
          Name: Dynamic Object Labels
          Namespaces:
            {}
          Queue Size: 100
          Value: false
        - Class: rviz/MarkerArray
          Enabled: false
          Marker Topic: /lio_segmot/mapping/object_velocities
          Name: Dynamic Object Predicted Path
          Namespaces:
            {}
          Queue Size: 100
          Value: false
        - Class: rviz/Marker
          Enabled: true
          Marker Topic: /lio_segmot/mapping/tightly_coupled_object_points
          Name: Tightly-coupled nodes
          Namespaces:
            {}
          Queue Size: 100
          Value: true
      Enabled: true
      Name: Mapping
    - Class: rviz/Group
      Displays:
        - Class: jsk_rviz_plugin/BoundingBoxArray
          Enabled: true
          Name: Dynamic Objects
          Topic: /lio_segmot/tracking/objects
          Unreliable: false
          Value: true
          alpha: 0.800000011920929
          color: 25; 255; 0
          coloring: Label
          line width: 0.15000000596046448
          only edge: true
          show coords: false
        - Class: rviz/MarkerArray
          Enabled: true
          Marker Topic: /lio_segmot/tracking/object_paths
          Name: Dynamic Object Paths
          Namespaces:
            {}
          Queue Size: 100
          Value: true
        - Class: rviz/MarkerArray
          Enabled: false
          Marker Topic: /lio_segmot/tracking/object_labels
          Name: Dynamic Object Labels
          Namespaces:
            {}
          Queue Size: 100
          Value: false
        - Class: rviz/MarkerArray
          Enabled: true
          Marker Topic: /lio_segmot/tracking/object_velocities
          Name: Dynamic Object Predicted Path
          Namespaces:
            {}
          Queue Size: 100
          Value: true
        - Class: rviz/MarkerArray
          Enabled: true
          Marker Topic: /lio_segmot/tracking/object_velocity_arrows
          Name: Dynamic Object Predicted Path Arrow
          Namespaces:
            {}
          Queue Size: 100
          Value: true
      Enabled: true
      Name: Tracking
  Enabled: true
  Global Options:
    Background Color: 0; 0; 0
    Default Light: true
    Fixed Frame: map
    Frame Rate: 30
  Name: root
  Tools:
    - Class: rviz/Interact
      Hide Inactive Objects: true
    - Class: rviz/FocusCamera
    - Class: rviz/Measure
  Value: true
  Views:
    Current:
      Class: rviz/Orbit
      Distance: 182.35630798339844
      Enable Stereo Rendering:
        Stereo Eye Separation: 0.05999999865889549
        Stereo Focal Distance: 1
        Swap Stereo Eyes: false
        Value: false
      Focal Point:
        X: 68.5766372680664
        Y: 25.248287200927734
        Z: -78.56790161132812
      Focal Shape Fixed Size: false
      Focal Shape Size: 0.05000000074505806
      Invert Z Axis: false
      Name: Current View
      Near Clip Distance: 0.009999999776482582
      Pitch: 0.8203978538513184
      Target Frame: base_link
      Value: Orbit (rviz)
      Yaw: 3.4916627407073975
    Saved: ~
Window Geometry:
  Displays:
    collapsed: false
  Height: 1022
  Hide Left Dock: false
  Hide Right Dock: true
  QMainWindow State: 000000ff00000000fd000000040000000000000171000003a8fc0200000009fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000000a0049006d006100670065000000002e000001340000000000000000fb000000100044006900730070006c006100790073010000003b000003a8000000c700fffffffb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730000000286000000a70000005c00ffffff000000010000011100000435fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fb0000000a00560069006500770073000000002e00000435000000a000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000004b000000040fc0100000002fb0000000800540069006d00650000000000000004b00000000000000000fb0000000800540069006d00650100000000000004500000000000000000000005c2000003a800000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000
  Selection:
    collapsed: false
  Tool Properties:
    collapsed: false
  Views:
    collapsed: true
  Width: 1849
  X: 71
  Y: 28


================================================
FILE: launch/include/module_loam.launch
================================================
<launch>

    <arg name="project" default="lio_segmot"/>
    
    <node pkg="$(arg project)" type="$(arg project)_imuPreintegration"   name="$(arg project)_imuPreintegration"    output="screen" 	respawn="true"/>
    <node pkg="$(arg project)" type="$(arg project)_imageProjection"     name="$(arg project)_imageProjection"      output="screen"     respawn="true"/>
    <node pkg="$(arg project)" type="$(arg project)_featureExtraction"   name="$(arg project)_featureExtraction"    output="screen"     respawn="true"/>
    <node pkg="$(arg project)" type="$(arg project)_mapOptimization"      name="$(arg project)_mapOptimization"       output="screen"     respawn="false"/>
    
</launch>

================================================
FILE: launch/include/module_loam_debug.launch
================================================
<launch>

    <arg name="project" default="lio_segmot"/>
    
    <node pkg="$(arg project)" type="$(arg project)_imuPreintegration"   name="$(arg project)_imuPreintegration"    output="screen" 	respawn="true"/>
    <node pkg="$(arg project)" type="$(arg project)_imageProjection"     name="$(arg project)_imageProjection"      output="screen"     respawn="true"/>
    <node pkg="$(arg project)" type="$(arg project)_featureExtraction"   name="$(arg project)_featureExtraction"    output="screen"     respawn="true"/>
    <node pkg="$(arg project)" type="$(arg project)_mapOptimization"
    name="$(arg project)_mapOptimization"       output="screen"
    respawn="false" launch-prefix="gdb -ex run --args"/>
    
</launch>

================================================
FILE: launch/include/module_loam_gdbserver_debug.launch
================================================
<launch>

    <arg name="project" default="lio_segmot"/>
    
    <node pkg="$(arg project)" type="$(arg project)_imuPreintegration"   name="$(arg project)_imuPreintegration"    output="screen" 	respawn="true"/>
    <node pkg="$(arg project)" type="$(arg project)_imageProjection"     name="$(arg project)_imageProjection"      output="screen"     respawn="true"/>
    <node pkg="$(arg project)" type="$(arg project)_featureExtraction"   name="$(arg project)_featureExtraction"    output="screen"     respawn="true"/>
    <node pkg="$(arg project)" type="$(arg project)_mapOptimization"
    name="$(arg project)_mapOptimization"       output="screen"
    respawn="false" launch-prefix="gdbserver 127.0.0.1:12222"/>
    
</launch>

================================================
FILE: launch/include/module_navsat.launch
================================================
<launch>

    <arg name="project" default="lio_segmot"/>

    <env name="ROSCONSOLE_CONFIG_FILE" value="$(find lio_segmot)/launch/include/rosconsole/rosconsole_error.conf"/>
    
    <!-- EKF GPS-->
    <node pkg="robot_localization" type="ekf_localization_node" name="ekf_gps" respawn="true">
        <remap from="odometry/filtered" to="odometry/navsat" />
    </node>

    <!-- Navsat -->
    <node pkg="robot_localization" type="navsat_transform_node" name="navsat" respawn="true">
        <!-- <rosparam param="datum">[42.35893211, -71.09345588, 0.0, world, base_link]</rosparam> -->
        <remap from="imu/data" to="imu_correct" />
        <remap from="gps/fix" to="gps/fix" />
        <remap from="odometry/filtered" to="odometry/navsat" />
    </node>

</launch>

================================================
FILE: launch/include/module_robot_state_publisher.launch
================================================
<launch>

	<arg name="project" default="lio_segmot"/>

    <param name="robot_description" command="$(find xacro)/xacro $(find lio_segmot)/launch/include/config/robot.urdf.xacro --inorder" />

    <node name="robot_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher" respawn="true">
        <!-- <param name="tf_prefix" value="$(env ROS_HOSTNAME)"/> -->
    </node>
  
</launch>

================================================
FILE: launch/include/module_rviz.launch
================================================
<launch>

    <arg name="project" default="lio_segmot"/>

    <!--- Run Rviz-->
    <node pkg="rviz" type="rviz" name="$(arg project)_rviz" args="-d $(find lio_segmot)/launch/include/config/rviz.rviz" />

</launch>


================================================
FILE: launch/include/rosconsole/rosconsole_error.conf
================================================
# Set the default ros output to warning and higher
log4j.logger.ros = ERROR

================================================
FILE: launch/include/rosconsole/rosconsole_info.conf
================================================
# Set the default ros output to warning and higher
log4j.logger.ros = INFO

================================================
FILE: launch/include/rosconsole/rosconsole_warn.conf
================================================
# Set the default ros output to warning and higher
log4j.logger.ros = WARN

================================================
FILE: launch/run.launch
================================================
<launch>

    <arg name="project" default="lio_segmot"/>
    
    <!-- Parameters -->
    <rosparam file="$(find lio_segmot)/config/params.yaml" command="load" />

    <!--- LOAM -->
    <include file="$(find lio_segmot)/launch/include/module_loam.launch" />

    <!--- Robot State TF -->
    <include file="$(find lio_segmot)/launch/include/module_robot_state_publisher.launch" />

    <!--- Run Navsat -->
    <include file="$(find lio_segmot)/launch/include/module_navsat.launch" />

    <!--- Run Rviz-->
    <include file="$(find lio_segmot)/launch/include/module_rviz.launch" />

</launch>


================================================
FILE: launch/run_hsinchu.launch
================================================
<launch>

    <arg name="project" default="lio_segmot"/>
    
    <!-- Parameters -->
    <rosparam file="$(find lio_segmot)/config/params_hsinchu.yaml" command="load" />

    <!--- LOAM -->
    <include file="$(find lio_segmot)/launch/include/module_loam.launch" />

    <!--- Robot State TF -->
    <include file="$(find lio_segmot)/launch/include/module_robot_state_publisher.launch" />

    <!--- Run Navsat -->
    <include file="$(find lio_segmot)/launch/include/module_navsat.launch" />

    <!--- Run Rviz-->
    <include file="$(find lio_segmot)/launch/include/module_rviz.launch" />

</launch>


================================================
FILE: launch/run_kitti.launch
================================================
<launch>

    <arg name="project" default="lio_segmot"/>
    
    <!-- Parameters -->
    <rosparam file="$(find lio_segmot)/config/params_kitti.yaml" command="load" />

    <!--- LOAM -->
    <include file="$(find lio_segmot)/launch/include/module_loam.launch" />

    <!--- Robot State TF -->
    <include file="$(find lio_segmot)/launch/include/module_robot_state_publisher.launch" />

    <!--- Run Navsat -->
    <include file="$(find lio_segmot)/launch/include/module_navsat.launch" />

    <!--- Run Rviz-->
    <include file="$(find lio_segmot)/launch/include/module_rviz.launch" />

</launch>


================================================
FILE: msg/Diagnosis.msg
================================================
Header header

float64 computationalTime
int32 numberOfDetections
int32 numberOfTightlyCoupledObjects

================================================
FILE: msg/ObjectState.msg
================================================
Header header

jsk_recognition_msgs/BoundingBox detection

geometry_msgs/Pose pose
geometry_msgs/Pose velocity

bool hasTightlyCoupledDetectionError
float64 tightlyCoupledDetectionError
float64 initialTightlyCoupledDetectionError

bool hasLooselyCoupledDetectionError
float64 looselyCoupledDetectionError
float64 initialLooselyCoupledDetectionError

bool hasMotionError
float64 motionError
float64 initialMotionError

int32 index
int32 lostCount
float64 confidence
bool isTightlyCoupled
bool isFirst

================================================
FILE: msg/ObjectStateArray.msg
================================================
Header header

lio_segmot/ObjectState[] objects

================================================
FILE: msg/cloud_info.msg
================================================
# Cloud Info
Header header 

int32[] startRingIndex
int32[] endRingIndex

int32[]  pointColInd # point column index in range image
float32[] pointRange # point range 

int64 imuAvailable
int64 odomAvailable

# Attitude for LOAM initialization
float32 imuRollInit
float32 imuPitchInit
float32 imuYawInit

# Initial guess from imu pre-integration
float32 initialGuessX
float32 initialGuessY
float32 initialGuessZ
float32 initialGuessRoll
float32 initialGuessPitch
float32 initialGuessYaw

# Point cloud messages
sensor_msgs/PointCloud2 cloud_raw       # original cloud without any processing
sensor_msgs/PointCloud2 cloud_deskewed  # original cloud deskewed
sensor_msgs/PointCloud2 cloud_corner    # extracted corner feature
sensor_msgs/PointCloud2 cloud_surface   # extracted surface feature

================================================
FILE: msg/flags.msg
================================================
int32[] flags

================================================
FILE: package.xml
================================================
<?xml version="1.0"?>
<package>
  <name>lio_segmot</name>
  <version>1.0.0</version>
  <description>Lidar Odometry</description>

  <maintainer email="shant@mit.edu">Tixiao Shan</maintainer>

  <license>TODO</license>

  <author>Tixiao Shan</author>

  <buildtool_depend>catkin</buildtool_depend>

  <build_depend>roscpp</build_depend>
  <run_depend>roscpp</run_depend>
  <build_depend>rospy</build_depend>
  <run_depend>rospy</run_depend>
  <build_depend>rosbag</build_depend>
  <run_depend>rosbag</run_depend>

  <build_depend>tf</build_depend>
  <run_depend>tf</run_depend>

  <build_depend>cv_bridge</build_depend>
  <run_depend>cv_bridge</run_depend>

  <build_depend>pcl_conversions</build_depend>
  <run_depend>pcl_conversions</run_depend>

  <build_depend>std_msgs</build_depend>
  <run_depend>std_msgs</run_depend>
  <build_depend>sensor_msgs</build_depend>
  <run_depend>sensor_msgs</run_depend>
  <build_depend>geometry_msgs</build_depend>
  <run_depend>geometry_msgs</run_depend>
  <build_depend>nav_msgs</build_depend>
  <run_depend>nav_msgs</run_depend>
  <build_depend>visualization_msgs</build_depend>
  <run_depend>visualization_msgs</run_depend>

  <build_depend>message_generation</build_depend>
  <run_depend>message_generation</run_depend>
  <build_depend>message_runtime</build_depend>
  <run_depend>message_runtime</run_depend>

  <build_depend>GTSAM</build_depend>
  <run_depend>GTSAM</run_depend>

  <build_depend>jsk_recognition_msgs</build_depend>
  <run_depend>jsk_recognition_msgs</run_depend>

  <build_depend>jsk_topic_tools</build_depend>
  <run_depend>jsk_topic_tools</run_depend>

</package>


================================================
FILE: scripts/diagnose.py
================================================
import copy
import multiprocessing as mp
from typing import Dict, List, Tuple

import dash
import dash_bootstrap_components as dbc
import numpy as np
import plotly.graph_objects as go
import rospy
from dash import dcc, html
from dash.dependencies import Input, Output
from lio_segmot.msg import ObjectState, ObjectStateArray
from std_msgs.msg import Header

# ------------------------------------------------------------------------------------------------ #
#                                      Environmental Variables                                     #
# ------------------------------------------------------------------------------------------------ #

RECENT_HISTORY_LENGTH = 15


# ------------------------------------------------------------------------------------------------ #
#                                               Utils                                              #
# ------------------------------------------------------------------------------------------------ #


def get_color(index):
    colormap = [
        (0.121569, 0.466667, 0.705882),
        (0.682353, 0.780392, 0.909804),
        (1.000000, 0.498039, 0.054902),
        (1.000000, 0.733333, 0.470588),
        (0.172549, 0.627451, 0.172549),
        (0.596078, 0.874510, 0.541176),
        (0.839216, 0.152941, 0.156863),
        (1.000000, 0.596078, 0.588235),
        (0.580392, 0.403922, 0.741176),
        (0.772549, 0.690196, 0.835294),
        (0.549020, 0.337255, 0.294118),
        (0.768627, 0.611765, 0.580392),
        (0.890196, 0.466667, 0.760784),
        (0.968627, 0.713725, 0.823529),
        (0.498039, 0.498039, 0.498039),
        (0.780392, 0.780392, 0.780392),
        (0.737255, 0.741176, 0.133333),
        (0.858824, 0.858824, 0.552941),
        (0.090196, 0.745098, 0.811765),
        (0.619608, 0.854902, 0.898039),
    ]
    colormap = [(int(i * 255), int(j * 255), int(k * 255)) for (i, j, k) in colormap]
    color = colormap[index % 20]
    return "rgba({}, {}, {}, 1)".format(color[0], color[1], color[2])


# ------------------------------------------------------------------------------------------------ #
#                                               Data                                               #
# ------------------------------------------------------------------------------------------------ #


class DynamicObject:
    def __init__(self) -> None:
        self.states: Dict[int, ObjectState] = {}  # {stamp index}: {object state}
        self.index: int = -1
        self.name: str = ""
        self.color: str = "rgba(0, 0, 0, 1)"

    def get_confidence_score(self, index: int) -> float:
        if index not in self.states.keys():
            raise ValueError("Invalid index")
        return self.states[index].detection.value

    def get_initial_tightly_coupled_detection_error(self, index: int) -> float:
        if index not in self.states.keys():
            raise ValueError("Invalid index")
        return self.states[index].initialTightlyCoupledDetectionError

    def get_initial_loosely_coupled_detection_error(self, index: int) -> float:
        if index not in self.states.keys():
            raise ValueError("Invalid index")
        return self.states[index].initialLooselyCoupledDetectionError

    def get_tightly_coupled_detection_error(self, index: int) -> float:
        if index not in self.states.keys():
            raise ValueError("Invalid index")
        return self.states[index].tightlyCoupledDetectionError

    def get_loosely_coupled_detection_error(self, index: int) -> float:
        if index not in self.states.keys():
            raise ValueError("Invalid index")
        return self.states[index].looselyCoupledDetectionError

    def get_initial_motion_error(self, index: int) -> float:
        if index not in self.states.keys():
            raise ValueError("Invalid index")
        return self.states[index].initialMotionError

    def get_motion_error(self, index: int) -> float:
        if index not in self.states.keys():
            raise ValueError("Invalid index")
        return self.states[index].motionError

    def recent_indices(self, start_index: int, value_type: str = "") -> np.ndarray:
        indices = np.array(list(self.states.keys()))
        indices = indices[np.where(indices >= start_index)[0]]
        if value_type in ["", "confidence_score"]:
            return indices
        elif value_type in [
            "initial_tightly_coupled_detection_error",
            "tightly_coupled_detection_error",
        ]:
            return indices[
                np.where(
                    np.array(
                        [
                            1 if self.states[idx].hasTightlyCoupledDetectionError else 0
                            for idx in indices
                        ]
                    )
                )[0]
            ]
        elif value_type in [
            "initial_loosely_coupled_detection_error",
            "loosely_coupled_detection_error",
        ]:
            return indices[
                np.where(
                    np.array(
                        [
                            1 if self.states[idx].hasLooselyCoupledDetectionError else 0
                            for idx in indices
                        ]
                    )
                )[0]
            ]
        elif value_type in ["initial_motion_error", "motion_error"]:
            return indices[
                np.where(
                    np.array(
                        [1 if self.states[idx].hasMotionError else 0 for idx in indices]
                    )
                )[0]
            ]
        else:
            raise ValueError("Invalid value type")

    def has_recent_data(self, start_index: int) -> bool:
        return len(self.recent_indices(start_index)) > 0

    def get_recent_values_plot_data(
        self, value_type: str, start_index: int, stamps: np.ndarray
    ) -> go.Scatter:
        indices = self.recent_indices(start_index, value_type)
        if len(indices) == 0:
            return go.Scatter(x=[], y=[], name=self.name)

        x = stamps[indices]
        y = None
        if value_type == "confidence_score":
            y = [self.states[i].detection.value for i in indices]
        elif value_type == "initial_loosely_coupled_detection_error":
            y = [self.states[i].initialLooselyCoupledDetectionError for i in indices]
        elif value_type == "initial_tightly_coupled_detection_error":
            y = [self.states[i].initialTightlyCoupledDetectionError for i in indices]
        elif value_type == "loosely_coupled_detection_error":
            y = [self.states[i].looselyCoupledDetectionError for i in indices]
        elif value_type == "tightly_coupled_detection_error":
            y = [self.states[i].tightlyCoupledDetectionError for i in indices]
        elif value_type == "initial_motion_error":
            y = [self.states[i].initialMotionError for i in indices]
        elif value_type == "motion_error":
            y = [self.states[i].motionError for i in indices]
        else:
            raise ValueError("Invalid value type")

        return go.Scatter(x=x, y=y, name=self.name, marker=dict(color=self.color))


class Data:
    def __init__(self) -> None:
        self.objects: Dict[int, DynamicObject] = {}
        self.tightly_coupled_objects: Dict[int, DynamicObject] = {}
        self.start_time = -1
        self.stamps: List[float] = []
        self.stamps_hash: List[str] = []

    def add_new_object_state(self, object_state: ObjectState) -> None:
        stamp_index = self.update_time_stamp(object_state.header)

        if object_state.index not in self.objects.keys():
            self.objects[object_state.index] = DynamicObject()
            self.objects[object_state.index].index = object_state.index
            self.objects[object_state.index].name = "Object {}".format(
                object_state.index
            )
            self.objects[object_state.index].color = get_color(object_state.index)

        self.objects[object_state.index].states[stamp_index] = copy.deepcopy(
            object_state
        )

        if object_state.isTightlyCoupled:
            if object_state.index not in self.tightly_coupled_objects.keys():
                self.tightly_coupled_objects[object_state.index] = DynamicObject()
                self.tightly_coupled_objects[
                    object_state.index
                ].index = object_state.index
                self.tightly_coupled_objects[
                    object_state.index
                ].name = "Object {}".format(object_state.index)
                self.tightly_coupled_objects[object_state.index].color = get_color(
                    object_state.index
                )

            self.tightly_coupled_objects[object_state.index].states[
                stamp_index
            ] = copy.deepcopy(object_state)

    def update_time_stamp(self, header: Header) -> int:
        stamp = header.stamp.to_sec()

        if self.start_time < 0:
            self.start_time = stamp

        stamp = stamp - self.start_time
        stamp_hash = "{}.{}".format(header.stamp.secs, header.stamp.nsecs)

        # TODO: Check ordering of time stamps
        if stamp_hash not in self.stamps_hash:
            self.stamps.append(stamp)
            self.stamps_hash.append(stamp_hash)
            return len(self.stamps) - 1
        else:
            length = len(self.stamps)
            return next(
                (
                    length - i - 1
                    for i, e in enumerate(self.stamps_hash[::-1])
                    if e == stamp_hash
                )
            )

    @property
    def number_of_stamps(self):
        return len(self.stamps)

    def get_recent_values_plot_data(
        self, value_type: str, length: int, only_tightly_coupled: bool = False
    ) -> List[go.Scatter]:
        start_index = max(self.number_of_stamps - length, 0)
        np_stamps = np.array(self.stamps)
        available_object_states = (
            self.tightly_coupled_objects.values()
            if only_tightly_coupled
            else self.objects.values()
        )
        return [
            d.get_recent_values_plot_data(value_type, start_index, np_stamps)
            for d in available_object_states
            if d.has_recent_data(start_index)
        ]

    def get_recent_plot_xaxis_range(self, length: int) -> Tuple[float, float]:
        start_index = max(self.number_of_stamps - length, 0)
        end_stamp = max(self.stamps[-1], self.stamps[start_index] + length * 0.1) + 0.2
        return self.stamps[start_index], end_stamp


data = Data()
queue = mp.Queue(maxsize=5)

confidence_score_min_y = 0.0
confidence_score_max_y = 0.55

initial_detection_error_min_y = 0.0
initial_detection_error_max_y = 200
detection_error_min_y = 0.0
detection_error_max_y = 20

initial_motion_error_min_y = 0.0
initial_motion_error_max_y = 5.0
motion_error_min_y = 0.0
motion_error_max_y = 20

# ------------------------------------------------------------------------------------------------ #
#                                                ROS                                               #
# ------------------------------------------------------------------------------------------------ #


class Callback:
    def __init__(self, queue: mp.Queue) -> None:
        self.queue: mp.Queue = queue

    def __call__(self, msg: ObjectStateArray) -> None:
        queue.put(msg)


def ros_main():
    callback = Callback(queue)
    rospy.init_node("lio_segmotmot_diagnose", anonymous=True)
    rospy.Subscriber("lio_segmot/mapping/object_states", ObjectStateArray, callback)
    rospy.spin()


# ------------------------------------------------------------------------------------------------ #
#                                               Dash                                               #
# ------------------------------------------------------------------------------------------------ #

app = dash.Dash(__name__, external_stylesheets=[dbc.themes.FLATLY])
app.layout = html.Div(
    html.Div(
        [
            dbc.NavbarSimple(
                children=[],
                brand="Coupling LIO-SAM and MOT",
                brand_href="#",
                color="primary",
                dark=True,
            ),
            dbc.Toast(
                "Data is reset!",
                id="data-reset-toast",
                header="Notification",
                is_open=False,
                dismissable=True,
                icon="info",
                # top: 66 positions the toast below the navbar
                style={
                    "position": "fixed",
                    "top": 66,
                    "right": 10,
                    "width": 350,
                    "z-index": "100",
                },
            ),
            html.Div(
                [
                    dbc.Row(
                        [
                            dbc.Col(
                                [html.H3("SE-SSD's Confidence Score")],
                                class_name="col-6",
                            ),
                            dbc.Col(
                                [
                                    dbc.InputGroup(
                                        [
                                            dbc.InputGroupText("min(y)"),
                                            dbc.Input(
                                                type="number",
                                                id="confidence-score-min-y",
                                                placeholder="min(y)",
                                                value=confidence_score_min_y,
                                            ),
                                        ]
                                    ),
                                ],
                                class_name="col-3",
                            ),
                            dbc.Col(
                                [
                                    dbc.InputGroup(
                                        [
                                            dbc.InputGroupText("max(y)"),
                                            dbc.Input(
                                                type="number",
                                                id="confidence-score-max-y",
                                                placeholder="max(y)",
                                                value=confidence_score_max_y,
                                            ),
                                        ]
                                    ),
                                ],
                                class_name="col-3",
                            ),
                        ],
                        style={"margin-top": "30px"},
                    ),
                    dbc.Row(
                        [
                            dbc.Col(
                                dcc.Graph(id="detection-score-all"),
                                class_name="col-lg-6 col-12",
                            ),
                            dbc.Col(
                                dcc.Graph(id="detection-score-tightly-coupled"),
                                class_name="col-lg-6 col-12",
                            ),
                        ]
                    ),
                    html.Hr(),
                    dbc.Row(
                        [
                            dbc.Col(
                                [html.H3("Initial Detection Error")], class_name="col-6"
                            ),
                            dbc.Col(
                                [
                                    dbc.InputGroup(
                                        [
                                            dbc.InputGroupText("min(y)"),
                                            dbc.Input(
                                                type="number",
                                                id="initial-detection-error-min-y",
                                                placeholder="min(y)",
                                                value=initial_detection_error_min_y,
                                            ),
                                        ]
                                    ),
                                ],
                                class_name="col-3",
                            ),
                            dbc.Col(
                                [
                                    dbc.InputGroup(
                                        [
                                            dbc.InputGroupText("max(y)"),
                                            dbc.Input(
                                                type="number",
                                                id="initial-detection-error-max-y",
                                                placeholder="max(y)",
                                                value=initial_detection_error_max_y,
                                            ),
                                        ]
                                    ),
                                ],
                                class_name="col-3",
                            ),
                        ],
                        style={"margin-top": "30px"},
                    ),
                    dbc.Row(
                        [
                            dbc.Col(
                                dcc.Graph(id="initial-loosely-coupled-detection-error"),
                                class_name="col-lg-6 col-12",
                            ),
                            dbc.Col(
                                dcc.Graph(id="initial-tightly-coupled-detection-error"),
                                class_name="col-lg-6 col-12",
                            ),
                        ]
                    ),
                    html.Hr(),
                    dbc.Row(
                        [
                            dbc.Col([html.H3("Detection Error")], class_name="col-6"),
                            dbc.Col(
                                [
                                    dbc.InputGroup(
                                        [
                                            dbc.InputGroupText("min(y)"),
                                            dbc.Input(
                                                type="number",
                                                id="detection-error-min-y",
                                                placeholder="min(y)",
                                                value=detection_error_min_y,
                                            ),
                                        ]
                                    ),
                                ],
                                class_name="col-3",
                            ),
                            dbc.Col(
                                [
                                    dbc.InputGroup(
                                        [
                                            dbc.InputGroupText("max(y)"),
                                            dbc.Input(
                                                type="number",
                                                id="detection-error-max-y",
                                                placeholder="max(y)",
                                                value=detection_error_max_y,
                                            ),
                                        ]
                                    ),
                                ],
                                class_name="col-3",
                            ),
                        ],
                        style={"margin-top": "30px"},
                    ),
                    dbc.Row(
                        [
                            dbc.Col(
                                dcc.Graph(id="loosely-coupled-detection-error"),
                                class_name="col-lg-6 col-12",
                            ),
                            dbc.Col(
                                dcc.Graph(id="tightly-coupled-detection-error"),
                                class_name="col-lg-6 col-12",
                            ),
                        ]
                    ),
                    # html.Hr(),
                    # dbc.Row(
                    #     [
                    #         dbc.Col(
                    #             [html.H3("Initial Motion Error")], class_name="col-6"
                    #         ),
                    #         dbc.Col(
                    #             [
                    #                 dbc.InputGroup(
                    #                     [
                    #                         dbc.InputGroupText("min(y)"),
                    #                         dbc.Input(
                    #                             type="number",
                    #                             id="initial-motion-error-min-y",
                    #                             placeholder="min(y)",
                    #                             value=initial_motion_error_min_y,
                    #                         ),
                    #                     ]
                    #                 ),
                    #             ],
                    #             class_name="col-3",
                    #         ),
                    #         dbc.Col(
                    #             [
                    #                 dbc.InputGroup(
                    #                     [
                    #                         dbc.InputGroupText("max(y)"),
                    #                         dbc.Input(
                    #                             type="number",
                    #                             id="initial-motion-error-max-y",
                    #                             placeholder="max(y)",
                    #                             value=initial_motion_error_max_y,
                    #                         ),
                    #                     ]
                    #                 ),
                    #             ],
                    #             class_name="col-3",
                    #         ),
                    #     ],
                    #     style={"margin-top": "30px"},
                    # ),
                    # dbc.Row(
                    #     [
                    #         dbc.Col(
                    #             dcc.Graph(id="initial-motion-error-all"),
                    #             class_name="col-lg-6 col-12",
                    #         ),
                    #         dbc.Col(
                    #             dcc.Graph(id="initial-motion-error-tightly-coupled"),
                    #             class_name="col-lg-6 col-12",
                    #         ),
                    #     ]
                    # ),
                    html.Hr(),
                    dbc.Row(
                        [
                            dbc.Col([html.H3("Motion Error")], class_name="col-6"),
                            dbc.Col(
                                [
                                    dbc.InputGroup(
                                        [
                                            dbc.InputGroupText("min(y)"),
                                            dbc.Input(
                                                type="number",
                                                id="motion-error-min-y",
                                                placeholder="min(y)",
                                                value=motion_error_min_y,
                                            ),
                                        ]
                                    ),
                                ],
                                class_name="col-3",
                            ),
                            dbc.Col(
                                [
                                    dbc.InputGroup(
                                        [
                                            dbc.InputGroupText("max(y)"),
                                            dbc.Input(
                                                type="number",
                                                id="motion-error-max-y",
                                                placeholder="max(y)",
                                                value=motion_error_max_y,
                                            ),
                                        ]
                                    ),
                                ],
                                class_name="col-3",
                            ),
                        ],
                        style={"margin-top": "30px"},
                    ),
                    dbc.Row(
                        [
                            dbc.Col(
                                dcc.Graph(id="motion-error-all"),
                                class_name="col-lg-6 col-12",
                            ),
                            dbc.Col(
                                dcc.Graph(id="motion-error-tightly-coupled"),
                                class_name="col-lg-6 col-12",
                            ),
                        ]
                    ),
                    html.Hr(),
                    html.Div(
                        [
                            dbc.Button(
                                "Reset",
                                id="reset-button",
                                color="danger",
                                n_clicks=0,
                                className="me-1",
                            ),
                        ]
                    ),
                ],
                className="container",
            ),
            dcc.Interval(
                id="interval-component",
                interval=1 * 200,  # in milliseconds
                n_intervals=0,
            ),
        ]
    )
)


@app.callback(
    Output("data-reset-toast", "is_open"),
    Input("reset-button", "n_clicks"),
)
def reset(n):
    global data
    data = Data()
    return True


@app.callback(
    Output("detection-score-all", "figure"),
    Output("detection-score-tightly-coupled", "figure"),
    Output("initial-loosely-coupled-detection-error", "figure"),
    Output("initial-tightly-coupled-detection-error", "figure"),
    Output("loosely-coupled-detection-error", "figure"),
    Output("tightly-coupled-detection-error", "figure"),
    # Output("initial-motion-error-all", "figure"),
    # Output("initial-motion-error-tightly-coupled", "figure"),
    Output("motion-error-all", "figure"),
    Output("motion-error-tightly-coupled", "figure"),
    Input("interval-component", "n_intervals"),
    Input("confidence-score-min-y", "value"),
    Input("confidence-score-max-y", "value"),
    Input("initial-detection-error-min-y", "value"),
    Input("initial-detection-error-max-y", "value"),
    Input("detection-error-min-y", "value"),
    Input("detection-error-max-y", "value"),
    # Input("initial-motion-error-min-y", "value"),
    # Input("initial-motion-error-max-y", "value"),
    Input("motion-error-min-y", "value"),
    Input("motion-error-max-y", "value"),
)
def update(
    n,
    c_min_y,
    c_max_y,
    i_d_min_y,
    i_d_max_y,
    d_min_y,
    d_max_y,
    # i_m_min_y,
    # i_m_max_y,
    m_min_y,
    m_max_y,
):
    global confidence_score_min_y
    global confidence_score_max_y
    global initial_detection_error_min_y
    global initial_detection_error_max_y
    global detection_error_min_y
    global detection_error_max_y
    global initial_motion_error_min_y
    global initial_motion_error_max_y
    global motion_error_min_y
    global motion_error_max_y

    changed = False

    if confidence_score_min_y != c_min_y:
        changed = True
        confidence_score_min_y = c_min_y
    if confidence_score_max_y != c_max_y:
        changed = True
        confidence_score_max_y = c_max_y
    if initial_detection_error_min_y != i_d_min_y:
        changed = True
        initial_detection_error_min_y = i_d_min_y
    if initial_detection_error_max_y != i_d_max_y:
        changed = True
        initial_detection_error_max_y = i_d_max_y
    if detection_error_min_y != d_min_y:
        changed = True
        detection_error_min_y = d_min_y
    if detection_error_max_y != d_max_y:
        changed = True
        detection_error_max_y = d_max_y
    # if initial_motion_error_min_y != i_m_min_y:
    #     changed = True
    #     initial_motion_error_min_y = i_m_min_y
    # if initial_motion_error_max_y != i_m_max_y:
    #     changed = True
    #     initial_motion_error_max_y = i_m_max_y
    if motion_error_min_y != m_min_y:
        changed = True
        motion_error_min_y = m_min_y
    if motion_error_max_y != m_max_y:
        changed = True
        motion_error_max_y = m_max_y

    if queue.empty() and not changed:
        return [dash.no_update] * 8
    elif changed and len(data.objects) == 0:
        return [dash.no_update] * 8

    while not queue.empty():
        msg: ObjectStateArray = queue.get()
        for object_state in msg.objects:
            data.add_new_object_state(object_state)

    confidence_score_all = go.Figure(
        data=[
            go.Scatter(
                x=[data.stamps[-1]] * 2,
                y=[confidence_score_min_y, confidence_score_max_y],
                marker={"color": "red"},
            )
        ]
        + data.get_recent_values_plot_data(
            "confidence_score", RECENT_HISTORY_LENGTH, False
        ),
        layout_xaxis_range=data.get_recent_plot_xaxis_range(RECENT_HISTORY_LENGTH),
        layout_yaxis_range=[0, 0.55],
    )
    confidence_score_all.update_layout(
        title="All Objects",
        xaxis_title="Time Stamp",
        yaxis_title="Confidence Score",
        showlegend=True,
    )
    confidence_score_tightly_coupled = go.Figure(
        data=[
            go.Scatter(
                x=[data.stamps[-1]] * 2,
                y=[confidence_score_min_y - 1, confidence_score_max_y + 1],
                marker={"color": "red"},
            )
        ]
        + data.get_recent_values_plot_data(
            "confidence_score", RECENT_HISTORY_LENGTH, True
        ),
        layout_xaxis_range=data.get_recent_plot_xaxis_range(RECENT_HISTORY_LENGTH),
        layout_yaxis_range=[confidence_score_min_y, confidence_score_max_y],
    )
    confidence_score_tightly_coupled.update_layout(
        title="Tightly-coupled Objects",
        xaxis_title="Time Stamp",
        yaxis_title="Confidence Score",
        showlegend=True,
    )

    initial_loosely_coupled_detection_error = go.Figure(
        data=[
            go.Scatter(
                x=[data.stamps[-1]] * 2,
                y=[
                    initial_detection_error_min_y - 1,
                    initial_detection_error_max_y + 1,
                ],
                marker={"color": "red"},
            )
        ]
        + data.get_recent_values_plot_data(
            "initial_loosely_coupled_detection_error", RECENT_HISTORY_LENGTH, False
        ),
        layout_xaxis_range=data.get_recent_plot_xaxis_range(RECENT_HISTORY_LENGTH),
        layout_yaxis_range=[
            initial_detection_error_min_y,
            initial_detection_error_max_y,
        ],
    )
    initial_loosely_coupled_detection_error.update_layout(
        title="Loosely-coupled Objects",
        xaxis_title="Time Stamp",
        yaxis_title="Loosely-coupled Detection Error",
        showlegend=True,
    )
    initial_tightly_coupled_detection_error = go.Figure(
        data=[
            go.Scatter(
                x=[data.stamps[-1]] * 2,
                y=[
                    initial_detection_error_min_y - 1,
                    initial_detection_error_max_y + 1,
                ],
                marker={"color": "red"},
            )
        ]
        + data.get_recent_values_plot_data(
            "initial_tightly_coupled_detection_error", RECENT_HISTORY_LENGTH, False
        ),
        layout_xaxis_range=data.get_recent_plot_xaxis_range(RECENT_HISTORY_LENGTH),
        layout_yaxis_range=[
            initial_detection_error_min_y,
            initial_detection_error_max_y,
        ],
    )
    initial_tightly_coupled_detection_error.update_layout(
        title="Tightly-coupled Objects",
        xaxis_title="Time Stamp",
        yaxis_title="Tightly-coupled Detection Error",
        showlegend=True,
    )

    loosely_coupled_detection_error = go.Figure(
        data=[
            go.Scatter(
                x=[data.stamps[-1]] * 2,
                y=[detection_error_min_y - 1, detection_error_max_y + 1],
                marker={"color": "red"},
            )
        ]
        + data.get_recent_values_plot_data(
            "loosely_coupled_detection_error", RECENT_HISTORY_LENGTH, False
        ),
        layout_xaxis_range=data.get_recent_plot_xaxis_range(RECENT_HISTORY_LENGTH),
        layout_yaxis_range=[detection_error_min_y, detection_error_max_y],
    )
    loosely_coupled_detection_error.update_layout(
        title="Loosely-coupled Objects",
        xaxis_title="Time Stamp",
        yaxis_title="Loosely-coupled Detection Error",
        showlegend=True,
    )
    tightly_coupled_detection_error = go.Figure(
        data=[
            go.Scatter(
                x=[data.stamps[-1]] * 2,
                y=[detection_error_min_y - 1, detection_error_max_y + 1],
                marker={"color": "red"},
            )
        ]
        + data.get_recent_values_plot_data(
            "tightly_coupled_detection_error", RECENT_HISTORY_LENGTH, False
        ),
        layout_xaxis_range=data.get_recent_plot_xaxis_range(RECENT_HISTORY_LENGTH),
        layout_yaxis_range=[detection_error_min_y, detection_error_max_y],
    )
    tightly_coupled_detection_error.update_layout(
        title="Tightly-coupled Objects",
        xaxis_title="Time Stamp",
        yaxis_title="Tightly-coupled Detection Error",
        showlegend=True,
    )

    # initial_motion_error_all = go.Figure(
    #     data=[
    #         go.Scatter(
    #             x=[data.stamps[-1]] * 2,
    #             y=[initial_motion_error_min_y - 1, initial_motion_error_max_y + 1],
    #             marker={"color": "red"},
    #         )
    #     ]
    #     + data.get_recent_values_plot_data(
    #         "initial_motion_error", RECENT_HISTORY_LENGTH, False
    #     ),
    #     layout_xaxis_range=data.get_recent_plot_xaxis_range(RECENT_HISTORY_LENGTH),
    #     layout_yaxis_range=[initial_motion_error_min_y, initial_motion_error_max_y],
    # )
    # initial_motion_error_all.update_layout(
    #     title="All Objects",
    #     xaxis_title="Time Stamp",
    #     yaxis_title="Motion Error",
    #     showlegend=True,
    # )
    # initial_motion_error_tightly_coupled = go.Figure(
    #     data=[
    #         go.Scatter(
    #             x=[data.stamps[-1]] * 2,
    #             y=[initial_motion_error_min_y - 1, initial_motion_error_max_y + 1],
    #             marker={"color": "red"},
    #         )
    #     ]
    #     + data.get_recent_values_plot_data(
    #         "initial_motion_error", RECENT_HISTORY_LENGTH, True
    #     ),
    #     layout_xaxis_range=data.get_recent_plot_xaxis_range(RECENT_HISTORY_LENGTH),
    #     layout_yaxis_range=[initial_motion_error_min_y, initial_motion_error_max_y],
    # )
    # initial_motion_error_tightly_coupled.update_layout(
    #     title="Tightly-coupled Objects",
    #     xaxis_title="Time Stamp",
    #     yaxis_title="Motion Error",
    #     showlegend=True,
    # )

    motion_error_all = go.Figure(
        data=[
            go.Scatter(
                x=[data.stamps[-1]] * 2,
                y=[motion_error_min_y - 1, motion_error_max_y + 1],
                marker={"color": "red"},
            )
        ]
        + data.get_recent_values_plot_data(
            "motion_error", RECENT_HISTORY_LENGTH, False
        ),
        layout_xaxis_range=data.get_recent_plot_xaxis_range(RECENT_HISTORY_LENGTH),
        layout_yaxis_range=[motion_error_min_y, motion_error_max_y],
    )
    motion_error_all.update_layout(
        title="All Objects",
        xaxis_title="Time Stamp",
        yaxis_title="Motion Error",
        showlegend=True,
    )
    motion_error_tightly_coupled = go.Figure(
        data=[
            go.Scatter(
                x=[data.stamps[-1]] * 2,
                y=[motion_error_min_y - 1, motion_error_max_y + 1],
                marker={"color": "red"},
            )
        ]
        + data.get_recent_values_plot_data("motion_error", RECENT_HISTORY_LENGTH, True),
        layout_xaxis_range=data.get_recent_plot_xaxis_range(RECENT_HISTORY_LENGTH),
        layout_yaxis_range=[motion_error_min_y, motion_error_max_y],
    )
    motion_error_tightly_coupled.update_layout(
        title="Tightly-coupled Objects",
        xaxis_title="Time Stamp",
        yaxis_title="Motion Error",
        showlegend=True,
    )

    return (
        confidence_score_all,
        confidence_score_tightly_coupled,
        initial_loosely_coupled_detection_error,
        initial_tightly_coupled_detection_error,
        loosely_coupled_detection_error,
        tightly_coupled_detection_error,
        # initial_motion_error_all,
        # initial_motion_error_tightly_coupled,
        motion_error_all,
        motion_error_tightly_coupled,
    )


if __name__ == "__main__":
    ros_main_process = mp.Process(target=ros_main)
    ros_main_process.start()
    app.run_server(debug=True)
    ros_main_process.join()


====================
Download .txt
gitextract_j229x4hx/

├── .clang-format
├── .github/
│   └── stale.yml
├── .gitignore
├── CITATION.bib
├── CMakeLists.txt
├── LICENSE
├── README.md
├── config/
│   ├── doc/
│   │   └── kitti2bag/
│   │       ├── README.md
│   │       └── kitti2bag.py
│   ├── params.yaml
│   ├── params_hsinchu.yaml
│   └── params_kitti.yaml
├── include/
│   ├── factor.h
│   ├── solver.h
│   └── utility.h
├── launch/
│   ├── debug.launch
│   ├── gdbserver_debug.launch
│   ├── include/
│   │   ├── config/
│   │   │   ├── robot.urdf.xacro
│   │   │   └── rviz.rviz
│   │   ├── module_loam.launch
│   │   ├── module_loam_debug.launch
│   │   ├── module_loam_gdbserver_debug.launch
│   │   ├── module_navsat.launch
│   │   ├── module_robot_state_publisher.launch
│   │   ├── module_rviz.launch
│   │   └── rosconsole/
│   │       ├── rosconsole_error.conf
│   │       ├── rosconsole_info.conf
│   │       └── rosconsole_warn.conf
│   ├── run.launch
│   ├── run_hsinchu.launch
│   └── run_kitti.launch
├── msg/
│   ├── Diagnosis.msg
│   ├── ObjectState.msg
│   ├── ObjectStateArray.msg
│   ├── cloud_info.msg
│   └── flags.msg
├── package.xml
├── scripts/
│   └── diagnose.py
├── src/
│   ├── factor.cpp
│   ├── featureExtraction.cpp
│   ├── imageProjection.cpp
│   ├── imuPreintegration.cpp
│   ├── mapOptimization.cpp
│   ├── offlineBagPlayer.cpp
│   └── solver.cpp
└── srv/
    ├── detection.srv
    ├── save_estimation_result.srv
    └── save_map.srv
Download .txt
SYMBOL INDEX (160 symbols across 12 files)

FILE: config/doc/kitti2bag/kitti2bag.py
  function save_imu_data (line 30) | def save_imu_data(bag, kitti, imu_frame_id, topic):
  function save_imu_data_raw (line 49) | def save_imu_data_raw(bag, kitti, imu_frame_id, topic):
  function save_dynamic_tf (line 131) | def save_dynamic_tf(bag, kitti, kitti_type, initial_time):
  function save_dynamic_tf_as_tum_file (line 187) | def save_dynamic_tf_as_tum_file(filename, kitti, kitti_type, initial_time):
  function save_camera_data (line 242) | def save_camera_data(bag, kitti_type, kitti, util, bridge, camera, camer...
  function save_velo_data (line 291) | def save_velo_data(bag, kitti, velo_frame_id, topic):
  function get_static_transform (line 355) | def get_static_transform(from_frame_id, to_frame_id, transform):
  function inv (line 371) | def inv(transform):
  function save_static_transforms (line 382) | def save_static_transforms(bag, transforms, timestamps):
  function save_gps_fix_data (line 395) | def save_gps_fix_data(bag, kitti, gps_frame_id, topic):
  function save_gps_vel_data (line 407) | def save_gps_vel_data(bag, kitti, gps_frame_id, topic):

FILE: include/factor.h
  function class (line 18) | class MaxMixturePoint3 : public gtsam::Point3 {}
  function class (line 20) | class Detection {
  function virtual (line 289) | virtual void

FILE: include/solver.h
  function class (line 10) | class MaxMixtureISAM2 : public gtsam::ISAM2 {

FILE: include/utility.h
  type pcl (line 58) | typedef pcl::PointXYZI PointType;
  function SensorType (line 60) | enum class SensorType { VELODYNE,
  function pointDistance (line 392) | float pointDistance(PointType p) {
  function pointDistance (line 396) | float pointDistance(PointType p1, PointType p2) {

FILE: scripts/diagnose.py
  function get_color (line 27) | def get_color(index):
  class DynamicObject (line 60) | class DynamicObject:
    method __init__ (line 61) | def __init__(self) -> None:
    method get_confidence_score (line 67) | def get_confidence_score(self, index: int) -> float:
    method get_initial_tightly_coupled_detection_error (line 72) | def get_initial_tightly_coupled_detection_error(self, index: int) -> f...
    method get_initial_loosely_coupled_detection_error (line 77) | def get_initial_loosely_coupled_detection_error(self, index: int) -> f...
    method get_tightly_coupled_detection_error (line 82) | def get_tightly_coupled_detection_error(self, index: int) -> float:
    method get_loosely_coupled_detection_error (line 87) | def get_loosely_coupled_detection_error(self, index: int) -> float:
    method get_initial_motion_error (line 92) | def get_initial_motion_error(self, index: int) -> float:
    method get_motion_error (line 97) | def get_motion_error(self, index: int) -> float:
    method recent_indices (line 102) | def recent_indices(self, start_index: int, value_type: str = "") -> np...
    method has_recent_data (line 146) | def has_recent_data(self, start_index: int) -> bool:
    method get_recent_values_plot_data (line 149) | def get_recent_values_plot_data(
  class Data (line 178) | class Data:
    method __init__ (line 179) | def __init__(self) -> None:
    method add_new_object_state (line 186) | def add_new_object_state(self, object_state: ObjectState) -> None:
    method update_time_stamp (line 218) | def update_time_stamp(self, header: Header) -> int:
    method number_of_stamps (line 243) | def number_of_stamps(self):
    method get_recent_values_plot_data (line 246) | def get_recent_values_plot_data(
    method get_recent_plot_xaxis_range (line 262) | def get_recent_plot_xaxis_range(self, length: int) -> Tuple[float, flo...
  class Callback (line 289) | class Callback:
    method __init__ (line 290) | def __init__(self, queue: mp.Queue) -> None:
    method __call__ (line 293) | def __call__(self, msg: ObjectStateArray) -> None:
  function ros_main (line 297) | def ros_main():
  function reset (line 627) | def reset(n):
  function update (line 656) | def update(

FILE: src/factor.cpp
  function getDetectionIndexAndError (line 34) | std::tuple<size_t, double>

FILE: src/featureExtraction.cpp
  type smoothness_t (line 4) | struct smoothness_t {
  type by_value (line 9) | struct by_value {
  class FeatureExtraction (line 15) | class FeatureExtraction : public ParamServer {
    method FeatureExtraction (line 37) | FeatureExtraction() {
    method initializationValue (line 47) | void initializationValue() {
    method laserCloudInfoHandler (line 61) | void laserCloudInfoHandler(const lio_segmot::cloud_infoConstPtr &msgIn) {
    method calculateSmoothness (line 75) | void calculateSmoothness() {
    method markOccludedPoints (line 90) | void markOccludedPoints() {
    method extractFeatures (line 126) | void extractFeatures() {
    method freeCloudInfoMemory (line 211) | void freeCloudInfoMemory() {
    method publishFeatureCloud (line 218) | void publishFeatureCloud() {
  function main (line 229) | int main(int argc, char **argv) {

FILE: src/imageProjection.cpp
  type VelodynePointXYZIRT (line 4) | struct VelodynePointXYZIRT {
  type OusterPointXYZIRT (line 14) | struct OusterPointXYZIRT {
  class ImageProjection (line 32) | class ImageProjection : public ParamServer {
    method ImageProjection (line 84) | ImageProjection() : deskewFlag(0) {
    method allocateMemory (line 99) | void allocateMemory() {
    method resetParameters (line 118) | void resetParameters() {
    method imuHandler (line 138) | void imuHandler(const sensor_msgs::Imu::ConstPtr &imuMsg) {
    method odometryHandler (line 162) | void odometryHandler(const nav_msgs::Odometry::ConstPtr &odometryMsg) {
    method cloudHandler (line 167) | void cloudHandler(const sensor_msgs::PointCloud2ConstPtr &laserCloudMs...
    method cachePointCloud (line 182) | bool cachePointCloud(const sensor_msgs::PointCloud2ConstPtr &laserClou...
    method deskewInfo (line 268) | bool deskewInfo() {
    method imuDeskewInfo (line 285) | void imuDeskewInfo() {
    method odomDeskewInfo (line 341) | void odomDeskewInfo() {
    method findRotation (line 419) | void findRotation(double pointTime, float *rotXCur, float *rotYCur, fl...
    method findPosition (line 445) | void findPosition(double relTime, float *posXCur, float *posYCur, floa...
    method PointType (line 462) | PointType deskewPoint(PointType *point, double relTime) {
    method projectPointCloud (line 492) | void projectPointCloud() {
    method cloudExtraction (line 535) | void cloudExtraction() {
    method publishClouds (line 557) | void publishClouds() {
  function main (line 565) | int main(int argc, char **argv) {

FILE: src/imuPreintegration.cpp
  class TransformFusion (line 23) | class TransformFusion : public ParamServer {
    method TransformFusion (line 43) | TransformFusion() {
    method odom2affine (line 60) | Eigen::Affine3f odom2affine(nav_msgs::Odometry odom) {
    method lidarOdometryHandler (line 71) | void lidarOdometryHandler(const nav_msgs::Odometry::ConstPtr& odomMsg) {
    method imuOdometryHandler (line 79) | void imuOdometryHandler(const nav_msgs::Odometry::ConstPtr& odomMsg) {
  class IMUPreintegration (line 144) | class IMUPreintegration : public ParamServer {
    method IMUPreintegration (line 190) | IMUPreintegration() {
    method resetOptimization (line 214) | void resetOptimization() {
    method resetParams (line 227) | void resetParams() {
    method odometryHandler (line 233) | void odometryHandler(const nav_msgs::Odometry::ConstPtr& odomMsg) {
    method failureDetection (line 402) | bool failureDetection(const gtsam::Vector3& velCur, const gtsam::imuBi...
    method imuHandler (line 419) | void imuHandler(const sensor_msgs::Imu::ConstPtr& imu_raw) {
  function main (line 469) | int main(int argc, char** argv) {

FILE: src/mapOptimization.cpp
  type PointXYZIRPYT (line 50) | struct PointXYZIRPYT {
  function gtsamPose2ROSPose (line 65) | geometry_msgs::Pose gtsamPose2ROSPose(const Pose3& pose) {
  class ObjectState (line 81) | class ObjectState {
    method ObjectState (line 109) | ObjectState(Pose3 pose                                        = Pose3:...
    method ObjectState (line 141) | ObjectState clone() const {
    method isTurning (line 159) | bool isTurning(float threshold) const {
    method isMovingFast (line 164) | bool isMovingFast(float threshold) const {
    method velocityIsConsistent (line 169) | bool velocityIsConsistent(int samplingSize,
  class Timer (line 210) | class Timer {
    method Timer (line 216) | Timer() {
    method reset (line 220) | void reset() {
    method stop (line 224) | void stop() {
    method elapsed (line 228) | double elapsed() const {
  class mapOptimization (line 233) | class mapOptimization : public ParamServer {
    method mapOptimization (line 391) | mapOptimization() {
    method allocateMemory (line 447) | void allocateMemory() {
    method laserCloudInfoHandler (line 502) | void laserCloudInfoHandler(const lio_segmot::cloud_infoConstPtr& msgIn) {
    method getDetections (line 552) | void getDetections() {
    method gpsHandler (line 561) | void gpsHandler(const nav_msgs::Odometry::ConstPtr& gpsMsg) {
    method pointAssociateToMap (line 565) | void pointAssociateToMap(PointType const* const pi, PointType* const p...
    method transformPointCloud (line 572) | pcl::PointCloud<PointType>::Ptr transformPointCloud(pcl::PointCloud<Po...
    method pclPointTogtsamPose3 (line 591) | gtsam::Pose3 pclPointTogtsamPose3(PointTypePose thisPoint) {
    method trans2gtsamPose (line 596) | gtsam::Pose3 trans2gtsamPose(float transformIn[]) {
    method pclPointToAffine3f (line 601) | Eigen::Affine3f pclPointToAffine3f(PointTypePose thisPoint) {
    method trans2Affine3f (line 605) | Eigen::Affine3f trans2Affine3f(float transformIn[]) {
    method PointTypePose (line 609) | PointTypePose trans2PointTypePose(float transformIn[]) {
    method saveMapService (line 620) | bool saveMapService(lio_segmot::save_mapRequest& req, lio_segmot::save...
    method saveEstimationResultService (line 685) | bool saveEstimationResultService(lio_segmot::save_estimation_resultReq...
    method visualizeGlobalMapThread (line 726) | void visualizeGlobalMapThread() {
    method publishGlobalMap (line 744) | void publishGlobalMap() {
    method loopClosureThread (line 795) | void loopClosureThread() {
    method loopInfoHandler (line 807) | void loopInfoHandler(const std_msgs::Float64MultiArray::ConstPtr& loop...
    method performLoopClosure (line 818) | void performLoopClosure() {
    method detectLoopClosureDistance (line 897) | bool detectLoopClosureDistance(int* latestID, int* closestID) {
    method detectLoopClosureExternal (line 929) | bool detectLoopClosureExternal(int* latestID, int* closestID) {
    method loopFindNearKeyframes (line 980) | void loopFindNearKeyframes(pcl::PointCloud<PointType>::Ptr& nearKeyfra...
    method visualizeLoopClosure (line 1002) | void visualizeLoopClosure() {
    method updateInitialGuess (line 1059) | void updateInitialGuess() {
    method extractForLoopClosure (line 1115) | void extractForLoopClosure() {
    method extractNearby (line 1128) | void extractNearby() {
    method extractCloud (line 1161) | void extractCloud(pcl::PointCloud<PointType>::Ptr cloudToExtract) {
    method extractSurroundingKeyFrames (line 1198) | void extractSurroundingKeyFrames() {
    method downsampleCurrentScan (line 1212) | void downsampleCurrentScan() {
    method updatePointAssociateToMap (line 1225) | void updatePointAssociateToMap() {
    method cornerOptimization (line 1229) | void cornerOptimization() {
    method surfOptimization (line 1336) | void surfOptimization() {
    method combineOptimizationCoeffs (line 1407) | void combineOptimizationCoeffs() {
    method LMOptimization (line 1427) | bool LMOptimization(int iterCount) {
    method scan2MapOptimization (line 1541) | void scan2MapOptimization() {
    method transformUpdate (line 1568) | void transformUpdate() {
    method constraintTransformation (line 1597) | float constraintTransformation(float value, float limit) {
    method saveFrame (line 1606) | bool saveFrame() {
    method addOdomFactor (line 1626) | void addOdomFactor() {
    method addGPSFactor (line 1652) | void addGPSFactor() {
    method addLoopFactor (line 1723) | void addLoopFactor() {
    method propagateObjectPoses (line 1741) | void propagateObjectPoses() {
    method addConstantVelocityFactor (line 1797) | void addConstantVelocityFactor() {
    method addStablePoseFactor (line 1830) | void addStablePoseFactor() {
    method addDetectionFactor (line 1880) | void addDetectionFactor(bool requiredMockDetection = false) {
    method saveKeyFramesAndFactor (line 2232) | void saveKeyFramesAndFactor() {
    method correctPoses (line 2439) | void correctPoses() {
    method updatePath (line 2470) | void updatePath(const PointTypePose& pose_in) {
    method publishOdometry (line 2486) | void publishOdometry() {
    method publishFrames (line 2553) | void publishFrames() {
  function main (line 2793) | int main(int argc, char** argv) {

FILE: src/offlineBagPlayer.cpp
  function playInstances (line 20) | void playInstances(const std::vector<rosbag::MessageInstance>& instances) {
  function odometryIsDoneCallback (line 40) | void odometryIsDoneCallback(const std_msgs::EmptyConstPtr& msg) {
  function main (line 46) | int main(int argc, char* argv[]) {

FILE: src/solver.cpp
  function gatherMaxMixtureRelinearizationKeys (line 4) | gtsam::KeySet gatherMaxMixtureRelinearizationKeys(const gtsam::Nonlinear...
Condensed preview — 48 files, each showing path, character count, and a content snippet. Download the .json file or copy for the full structured content (404K chars).
[
  {
    "path": ".clang-format",
    "chars": 3293,
    "preview": "---\nLanguage:        Cpp\n# BasedOnStyle:  Google\nAccessModifierOffset: -1\nAlignAfterOpenBracket: Align\nAlignConsecutiveA"
  },
  {
    "path": ".github/stale.yml",
    "chars": 682,
    "preview": "# Number of days of inactivity before an issue becomes stale\ndaysUntilStale: 14\n# Number of days of inactivity before a "
  },
  {
    "path": ".gitignore",
    "chars": 2971,
    "preview": "devel/\nlogs/\nbuild/\nbin/\nlib/\nmsg_gen/\nsrv_gen/\nmsg/*Action.msg\nmsg/*ActionFeedback.msg\nmsg/*ActionGoal.msg\nmsg/*ActionR"
  },
  {
    "path": "CITATION.bib",
    "chars": 405,
    "preview": "@article{lin2023lio-segmot,\n  title={Asynchronous State Estimation of Simultaneous Ego-motion Estimation and Multiple Ob"
  },
  {
    "path": "CMakeLists.txt",
    "chars": 3018,
    "preview": "cmake_minimum_required(VERSION 2.8.3)\nproject(lio_segmot)\n\nset(CMAKE_BUILD_TYPE \"Release\")\nset(CMAKE_CXX_FLAGS \"-std=c++"
  },
  {
    "path": "LICENSE",
    "chars": 1547,
    "preview": "BSD 3-Clause License\n\nCopyright (c) 2020, Tixiao Shan\nCopyright (c) 2022-2023, Yu-Kai Lin\nAll rights reserved.\n\nRedistri"
  },
  {
    "path": "README.md",
    "chars": 28645,
    "preview": "# LIO-SEGMOT\n\nThe official implementation of LIO-SEGMOT (**L**iDAR-**I**nertial **O**dometry\nvia **S**imultaneous **Eg**"
  },
  {
    "path": "config/doc/kitti2bag/README.md",
    "chars": 680,
    "preview": "# kitti2bag\n\n## How to run it?\n\n```bash\nwget https://s3.eu-central-1.amazonaws.com/avg-kitti/raw_data/2011_09_26_drive_0"
  },
  {
    "path": "config/doc/kitti2bag/kitti2bag.py",
    "chars": 23713,
    "preview": "#!env python\n# -*- coding: utf-8 -*-\n\nimport sys\n\ntry:\n    import pykitti\nexcept ImportError as e:\n    print('Could not "
  },
  {
    "path": "config/params.yaml",
    "chars": 10180,
    "preview": "lio_segmot:\n\n  # Topics\n  pointCloudTopic: \"points_raw\"               # Point cloud data\n  imuTopic: \"imu_raw\"          "
  },
  {
    "path": "config/params_hsinchu.yaml",
    "chars": 10180,
    "preview": "lio_segmot:\n\n  # Topics\n  pointCloudTopic: \"points_raw\"               # Point cloud data\n  imuTopic: \"imu_raw\"          "
  },
  {
    "path": "config/params_kitti.yaml",
    "chars": 10180,
    "preview": "lio_segmot:\n\n  # Topics\n  pointCloudTopic: \"points_raw\"               # Point cloud data\n  imuTopic: \"imu_raw\"          "
  },
  {
    "path": "include/factor.h",
    "chars": 11628,
    "preview": "// Copyright 2021 Yu-Kai Lin. All rights reserved.\n// Use of this source code is governed by a BSD-style\n// license that"
  },
  {
    "path": "include/solver.h",
    "chars": 1530,
    "preview": "// Copyright 2022 Yu-Kai Lin. All rights reserved.\n// Use of this source code is governed by a BSD-style\n// license that"
  },
  {
    "path": "include/utility.h",
    "chars": 17833,
    "preview": "#pragma once\n#ifndef _UTILITY_LIDAR_ODOMETRY_H_\n#define _UTILITY_LIDAR_ODOMETRY_H_\n\n#include <ros/ros.h>\n\n#include <nav_"
  },
  {
    "path": "launch/debug.launch",
    "chars": 602,
    "preview": "<launch>\n\n    <arg name=\"project\" default=\"lio_segmot\"/>\n    \n    <!-- Parameters -->\n    <rosparam file=\"$(find lio_seg"
  },
  {
    "path": "launch/gdbserver_debug.launch",
    "chars": 612,
    "preview": "<launch>\n\n    <arg name=\"project\" default=\"lio_segmot\"/>\n    \n    <!-- Parameters -->\n    <rosparam file=\"$(find lio_seg"
  },
  {
    "path": "launch/include/config/robot.urdf.xacro",
    "chars": 965,
    "preview": "<?xml version=\"1.0\"?>\n<robot name=\"lio\" xmlns:xacro=\"http://tixiaoshan.github.io/\">\n  <xacro:property name=\"PI\" value=\"3"
  },
  {
    "path": "launch/include/config/rviz.rviz",
    "chars": 19848,
    "preview": "Panels:\n  - Class: rviz/Displays\n    Help Height: 191\n    Name: Displays\n    Property Tree Widget:\n      Expanded:\n     "
  },
  {
    "path": "launch/include/module_loam.launch",
    "chars": 688,
    "preview": "<launch>\n\n    <arg name=\"project\" default=\"lio_segmot\"/>\n    \n    <node pkg=\"$(arg project)\" type=\"$(arg project)_imuPre"
  },
  {
    "path": "launch/include/module_loam_debug.launch",
    "chars": 722,
    "preview": "<launch>\n\n    <arg name=\"project\" default=\"lio_segmot\"/>\n    \n    <node pkg=\"$(arg project)\" type=\"$(arg project)_imuPre"
  },
  {
    "path": "launch/include/module_loam_gdbserver_debug.launch",
    "chars": 729,
    "preview": "<launch>\n\n    <arg name=\"project\" default=\"lio_segmot\"/>\n    \n    <node pkg=\"$(arg project)\" type=\"$(arg project)_imuPre"
  },
  {
    "path": "launch/include/module_navsat.launch",
    "chars": 771,
    "preview": "<launch>\n\n    <arg name=\"project\" default=\"lio_segmot\"/>\n\n    <env name=\"ROSCONSOLE_CONFIG_FILE\" value=\"$(find lio_segmo"
  },
  {
    "path": "launch/include/module_robot_state_publisher.launch",
    "chars": 400,
    "preview": "<launch>\n\n\t<arg name=\"project\" default=\"lio_segmot\"/>\n\n    <param name=\"robot_description\" command=\"$(find xacro)/xacro "
  },
  {
    "path": "launch/include/module_rviz.launch",
    "chars": 215,
    "preview": "<launch>\n\n    <arg name=\"project\" default=\"lio_segmot\"/>\n\n    <!--- Run Rviz-->\n    <node pkg=\"rviz\" type=\"rviz\" name=\"$"
  },
  {
    "path": "launch/include/rosconsole/rosconsole_error.conf",
    "chars": 75,
    "preview": "# Set the default ros output to warning and higher\nlog4j.logger.ros = ERROR"
  },
  {
    "path": "launch/include/rosconsole/rosconsole_info.conf",
    "chars": 74,
    "preview": "# Set the default ros output to warning and higher\nlog4j.logger.ros = INFO"
  },
  {
    "path": "launch/include/rosconsole/rosconsole_warn.conf",
    "chars": 74,
    "preview": "# Set the default ros output to warning and higher\nlog4j.logger.ros = WARN"
  },
  {
    "path": "launch/run.launch",
    "chars": 596,
    "preview": "<launch>\n\n    <arg name=\"project\" default=\"lio_segmot\"/>\n    \n    <!-- Parameters -->\n    <rosparam file=\"$(find lio_seg"
  },
  {
    "path": "launch/run_hsinchu.launch",
    "chars": 604,
    "preview": "<launch>\n\n    <arg name=\"project\" default=\"lio_segmot\"/>\n    \n    <!-- Parameters -->\n    <rosparam file=\"$(find lio_seg"
  },
  {
    "path": "launch/run_kitti.launch",
    "chars": 602,
    "preview": "<launch>\n\n    <arg name=\"project\" default=\"lio_segmot\"/>\n    \n    <!-- Parameters -->\n    <rosparam file=\"$(find lio_seg"
  },
  {
    "path": "msg/Diagnosis.msg",
    "chars": 101,
    "preview": "Header header\n\nfloat64 computationalTime\nint32 numberOfDetections\nint32 numberOfTightlyCoupledObjects"
  },
  {
    "path": "msg/ObjectState.msg",
    "chars": 499,
    "preview": "Header header\n\njsk_recognition_msgs/BoundingBox detection\n\ngeometry_msgs/Pose pose\ngeometry_msgs/Pose velocity\n\nbool has"
  },
  {
    "path": "msg/ObjectStateArray.msg",
    "chars": 47,
    "preview": "Header header\n\nlio_segmot/ObjectState[] objects"
  },
  {
    "path": "msg/cloud_info.msg",
    "chars": 790,
    "preview": "# Cloud Info\nHeader header \n\nint32[] startRingIndex\nint32[] endRingIndex\n\nint32[]  pointColInd # point column index in r"
  },
  {
    "path": "msg/flags.msg",
    "chars": 13,
    "preview": "int32[] flags"
  },
  {
    "path": "package.xml",
    "chars": 1626,
    "preview": "<?xml version=\"1.0\"?>\n<package>\n  <name>lio_segmot</name>\n  <version>1.0.0</version>\n  <description>Lidar Odometry</desc"
  },
  {
    "path": "scripts/diagnose.py",
    "chars": 37833,
    "preview": "import copy\nimport multiprocessing as mp\nfrom typing import Dict, List, Tuple\n\nimport dash\nimport dash_bootstrap_compone"
  },
  {
    "path": "src/factor.cpp",
    "chars": 12435,
    "preview": "#include \"factor.h\"\n\n#include <boost/format.hpp>\n\n#include <gtsam/linear/JacobianFactor.h>\n#include <gtsam/slam/PriorFac"
  },
  {
    "path": "src/featureExtraction.cpp",
    "chars": 8327,
    "preview": "#include \"lio_segmot/cloud_info.h\"\n#include \"utility.h\"\n\nstruct smoothness_t {\n  float value;\n  size_t ind;\n};\n\nstruct b"
  },
  {
    "path": "src/imageProjection.cpp",
    "chars": 18740,
    "preview": "#include \"lio_segmot/cloud_info.h\"\n#include \"utility.h\"\n\nstruct VelodynePointXYZIRT {\n  PCL_ADD_POINT4D\n  PCL_ADD_INTENS"
  },
  {
    "path": "src/imuPreintegration.cpp",
    "chars": 20723,
    "preview": "#include \"utility.h\"\n\n#include <gtsam/geometry/Pose3.h>\n#include <gtsam/geometry/Rot3.h>\n#include <gtsam/inference/Symbo"
  },
  {
    "path": "src/mapOptimization.cpp",
    "chars": 125124,
    "preview": "#include <jsk_topic_tools/color_utils.h>\n#include \"factor.h\"\n#include \"lio_segmot/Diagnosis.h\"\n#include \"lio_segmot/Obje"
  },
  {
    "path": "src/offlineBagPlayer.cpp",
    "chars": 3157,
    "preview": "#include \"utility.h\"\n\n#include <rosbag/bag.h>\n#include <rosbag/view.h>\n\nstd::queue<std::vector<rosbag::MessageInstance>>"
  },
  {
    "path": "src/solver.cpp",
    "chars": 6322,
    "preview": "#include \"solver.h\"\n#include <gtsam/nonlinear/ISAM2-impl.h>\n\ngtsam::KeySet gatherMaxMixtureRelinearizationKeys(const gts"
  },
  {
    "path": "srv/detection.srv",
    "chars": 83,
    "preview": "sensor_msgs/PointCloud2 cloud\n---\njsk_recognition_msgs/BoundingBoxArray detections\n"
  },
  {
    "path": "srv/save_estimation_result.srv",
    "chars": 306,
    "preview": "---\nnav_msgs/Path robotTrajectory\nnav_msgs/Path[] objectTrajectories\nnav_msgs/Path[] objectVelocities\nnav_msgs/Path[] tr"
  },
  {
    "path": "srv/save_map.srv",
    "chars": 55,
    "preview": "float32 resolution\nstring destination\n---\nbool success\n"
  }
]

About this extraction

This page contains the full source code of the StephLin/LIO-SEGMOT GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 48 files (381.1 KB), approximately 102.7k tokens, and a symbol index with 160 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.

Copied to clipboard!