Repository: intel/ros2_grasp_library Branch: master Commit: 980b7ddd4348 Files: 139 Total size: 13.7 MB Directory structure: gitextract_b6simm2f/ ├── .gitignore ├── CHANGELOG.rst ├── LICENSE ├── README.md ├── docker/ │ ├── Dockerfile │ ├── README.md │ ├── script/ │ │ ├── 00_ros2_install.sh │ │ ├── 10_eigen_install.sh │ │ ├── 11_libpcl_install.sh │ │ ├── 12_opencv_install.sh │ │ ├── 13_openvino_install.sh │ │ ├── 20_librealsense_install.sh │ │ ├── 30_gpg_install.sh │ │ ├── 31_gpd_install.sh │ │ ├── 32_ur_modern_driver_install.sh │ │ ├── 50_ros2_deps.sh │ │ ├── install_ros2_grasp_library.sh │ │ ├── ros_entrypoint.sh │ │ └── ros_env.sh │ └── setup_docker_display.sh ├── grasp_apps/ │ ├── draw_x/ │ │ ├── CMakeLists.txt │ │ ├── launch/ │ │ │ ├── draw_x.launch.py │ │ │ └── draw_x.yaml │ │ ├── package.xml │ │ └── src/ │ │ └── draw_x.cpp │ ├── fixed_position_pick/ │ │ ├── CMakeLists.txt │ │ ├── launch/ │ │ │ ├── fixed_position_pick.launch.py │ │ │ └── fixed_position_pick.yaml │ │ ├── package.xml │ │ └── src/ │ │ └── fixed_position_pick.cpp │ ├── random_pick/ │ │ ├── CMakeLists.txt │ │ ├── cfg/ │ │ │ └── random_pick.yaml │ │ ├── package.xml │ │ └── src/ │ │ └── random_pick.cpp │ └── recognize_pick/ │ ├── CMakeLists.txt │ ├── cfg/ │ │ └── recognize_pick.yaml │ ├── package.xml │ └── src/ │ ├── place_publisher.cpp │ └── recognize_pick.cpp ├── grasp_msgs/ │ ├── CMakeLists.txt │ ├── msg/ │ │ ├── CloudIndexed.msg │ │ ├── CloudSamples.msg │ │ ├── CloudSources.msg │ │ ├── GraspConfig.msg │ │ ├── GraspConfigList.msg │ │ └── SamplesMsg.msg │ └── package.xml ├── grasp_ros2/ │ ├── CMakeLists.txt │ ├── cfg/ │ │ ├── grasp_ros2_params.yaml │ │ ├── random_pick.yaml │ │ ├── recognize_pick.yaml │ │ └── test_grasp_ros2.yaml │ ├── include/ │ │ └── grasp_library/ │ │ └── ros2/ │ │ ├── consts.hpp │ │ ├── grasp_detector_base.hpp │ │ ├── grasp_detector_gpd.hpp │ │ ├── grasp_planner.hpp │ │ └── ros_params.hpp │ ├── package.xml │ ├── src/ │ │ ├── consts.cpp │ │ ├── grasp_composition.cpp │ │ ├── 
grasp_detector_gpd.cpp │ │ ├── grasp_planner.cpp │ │ └── ros_params.cpp │ └── tests/ │ ├── CMakeLists.txt │ ├── resource/ │ │ └── table_top.pcd │ ├── tgrasp_ros2.cpp │ └── tgrasp_ros2.h.in ├── grasp_tutorials/ │ ├── CMakeLists.txt │ ├── README.md │ ├── _static/ │ │ └── images/ │ │ └── workflow.vsdx │ ├── conf.py │ ├── doc/ │ │ ├── bringup_robot.rst │ │ ├── draw_x.rst │ │ ├── fixed_position_pick.rst │ │ ├── getting_start.rst │ │ ├── grasp_api.rst │ │ ├── grasp_planner.rst │ │ ├── grasp_ros2/ │ │ │ ├── install_gpd.md │ │ │ ├── install_openvino.md │ │ │ ├── tutorials_1_grasp_ros2_with_camera.md │ │ │ ├── tutorials_2_grasp_ros2_test.md │ │ │ └── tutorials_3_grasp_ros2_launch_options.md │ │ ├── handeye_calibration.rst │ │ ├── overview.rst │ │ ├── random_pick.rst │ │ ├── recognize_pick.rst │ │ ├── robot_interface.rst │ │ └── template.rst │ ├── index.rst │ └── package.xml ├── grasp_utils/ │ ├── handeye_dashboard/ │ │ ├── README.md │ │ ├── config/ │ │ │ └── Default.perspective │ │ ├── data/ │ │ │ ├── camera-robot.json │ │ │ └── dataset.json │ │ ├── launch/ │ │ │ └── handeye_dashboard.launch.py │ │ ├── package.xml │ │ ├── plugin.xml │ │ ├── resource/ │ │ │ └── handeye_dashboard │ │ ├── setup.py │ │ └── src/ │ │ └── handeye_dashboard/ │ │ ├── __init__.py │ │ ├── handeye_calibration.py │ │ ├── main.py │ │ └── plugin.py │ ├── handeye_target_detection/ │ │ ├── .clang-format │ │ ├── CMakeLists.txt │ │ ├── README.md │ │ ├── include/ │ │ │ └── PoseEstimator.h │ │ ├── launch/ │ │ │ ├── pose_estimation.launch.py │ │ │ └── pose_estimation.yaml │ │ ├── package.xml │ │ └── src/ │ │ ├── pose_estimation_node.cpp │ │ └── pose_estimator.cpp │ ├── handeye_tf_service/ │ │ ├── CMakeLists.txt │ │ ├── README.md │ │ ├── package.xml │ │ ├── src/ │ │ │ └── handeye_tf_server.cpp │ │ └── srv/ │ │ └── HandeyeTF.srv │ └── robot_interface/ │ ├── CMakeLists.txt │ ├── Doxyfile │ ├── README.md │ ├── include/ │ │ └── robot_interface/ │ │ ├── control_base.hpp │ │ └── control_ur.hpp │ ├── launch/ │ │ ├── 
ur_test.launch.py │ │ └── ur_test.yaml │ ├── package.xml │ ├── src/ │ │ ├── control_base.cpp │ │ └── control_ur.cpp │ └── test/ │ ├── ur_test_move_command.cpp │ └── ur_test_state_publish.cpp └── moveit_msgs_light/ ├── CMakeLists.txt ├── README.md ├── msg/ │ ├── CollisionObject.msg │ ├── Grasp.msg │ ├── GripperTranslation.msg │ ├── MoveItErrorCodes.msg │ ├── ObjectType.msg │ └── PlaceLocation.msg ├── package.xml └── srv/ └── GraspPlanning.srv ================================================ FILE CONTENTS ================================================ ================================================ FILE: .gitignore ================================================ # rviz file *.rviz # document buid file build _build # vscode file .vscode ================================================ FILE: CHANGELOG.rst ================================================ changelog for ros2_grasp_library ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 0.5.0 (2019-11-06) ------------------ * Added examples for advanced industrial robot applications * draw X * fixed position pick and place * random picking with OpenVINO grasp planning * recognition picking with OpenVINO grasp planning and OpenVINO mask-rcnn object segmentation * Support ROS2 hand-eye calibration * Support robot interface for manipulation * Added tutorials on how to * Build and launch example applications * Operate hand-eye calibration and publish the transformation * Quickly enable robot interface on a new industrial robot 0.4.0 (2019-03-13) ------------------ * Support "service-driven" grasp detection mechanism (via configure auto_mode) to optimize CPU load for real-time processing. * Support grasp transformation from camera frame to a specified target frame expected in the visual manipulation. * Support launch option "grasp_approach" to specify expected approach direction in the target frame specified by 'grasp_frame_id'. Grasp Planner will return grasp poses with approach direction approximate to this parameter. 
* Support launch option "device" to configure the device on which grasp pose inference executes: 0 for CPU, 1 for GPU, 2 for VPU, 3 for FPGA. In case OpenVINO plug-ins are installed (tutorial), this option deploys the CNN-based deep learning inference onto the target device. * Add tutorials introducing the Intel DLDT toolkit and Intel OpenVINO toolkit. * Add tutorials for launch options and customization notes. 0.3.0 (2018-12-28) ------------------ * Support grasp pose detection from RGBD point cloud. * Support MoveIt! grasp planning service. ================================================ FILE: LICENSE ================================================ Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. 
"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. 
Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. 
Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS APPENDIX: How to apply the Apache License to your work. To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives. 
Copyright 2018 Intel Corporation Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. ================================================ FILE: README.md ================================================ # DISCONTINUATION OF PROJECT # This project will no longer be maintained by Intel. Intel has ceased development and contributions including, but not limited to, maintenance, bug fixes, new releases, or updates, to this project. Intel no longer accepts patches to this project. If you have an ongoing need to use this project, are interested in independently developing it, or would like to maintain patches for the open source software community, please create your own fork of this project. Contact: webadmin@linux.intel.com # ROS2 Grasp Library A ROS2 intelligent visual grasp solution for advanced industrial usages, with OpenVINO™ grasp detection and MoveIt Grasp Planning. ## Overview ROS2 Grasp Library enables state-of-the-art CNN based deep learning grasp detection algorithms on ROS2 for intelligent visual grasp in industrial robot usage scenarios. This package provides ROS2 interfaces compliant with the open source [MoveIt](http://moveit.ros.org/) motion planning framework supported by most of the [robot models](https://moveit.ros.org/robots) in ROS industrial. 
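The MoveIt-compliant interface mentioned above is the grasp planning service; this repository carries a trimmed copy of its definition in `moveit_msgs_light/srv/GraspPlanning.srv`. The upstream `moveit_msgs` layout is roughly as follows — reproduced from memory as an illustration only, so consult the shipped `.srv` file for the authoritative definition:

```
# GraspPlanning.srv (upstream moveit_msgs layout, for illustration)
string group_name              # planning group of the arm
CollisionObject target         # object to be picked
string[] support_surfaces      # names of surfaces the object rests on
---
Grasp[] grasps                 # candidate grasps returned by the planner
MoveItErrorCodes error_code
```

Any node able to call this service can therefore consume the Grasp Planner's results without depending on the detection back-end.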
This package delivers * A ROS2 Grasp Planner providing a grasp planning service, as an extensible capability of MoveIt ([moveit_msgs::srv::GraspPlanning](http://docs.ros.org/api/moveit_msgs/html/srv/GraspPlanning.html)), translating grasp detection results into MoveIt interfaces ([moveit_msgs::msg::Grasp](http://docs.ros.org/api/moveit_msgs/html/msg/Grasp.html)) * A ROS2 Grasp Detector abstracting interfaces for grasp detection results * A ROS2 hand-eye calibration module generating the transformation from the camera frame to the robot frame * ROS2 example applications demonstrating how to use this ROS2 Grasp Library in advanced industrial usages for intelligent visual grasp ## Grasp Detection Algorithms Grasp detection back-end algorithms enabled by this ROS2 Grasp Library: - [Grasp Pose Detection](https://github.com/atenpas/gpd) detects 6-DOF grasp poses for a 2-finger grasp (e.g. a parallel-jaw gripper) in 3D point clouds from an RGBD sensor or a PCD file. Grasp detection is enabled with the Intel® [DLDT](https://github.com/opencv/dldt) toolkit and the Intel® [OpenVINO™](https://software.intel.com/en-us/openvino-toolkit) toolkit. 
ROS2 Grasp Library ## Tutorials Refer to the ROS2 Grasp Library [Tutorials](http://intel.github.io/ros2_grasp_library) for how to * Install, build, and launch the ROS2 Grasp Planner and Detector * Use launch options to customize in a new workspace * Bring up the intelligent visual grasp solution on a new robot * Do hand-eye calibration for a new camera setup * Launch the example applications ## Example Applications ### Random Picking (OpenVINO Grasp Detection) [Random Pick with OpenVINO Grasp Detection - Link to Youtube video demo](https://www.youtube.com/embed/b4EPvHdidOA) ### Recognition Picking (OpenVINO Grasp Detection + OpenVINO Mask-rcnn Object Segmentation) [Recognition Pick with OpenVINO Grasp Detection - Link to Youtube video demo](https://www.youtube.com/embed/trIt0uKRXBs) ## Known Issues * Cloud camera fails with "Invalid sizes when resizing a matrix or array" when dealing with an XYZRGBA point cloud from ROS2 Realsense, tracked as [#6](https://github.com/atenpas/gpg/issues/6) of gpg, [patch](https://github.com/atenpas/gpg/pull/7) under review. * 'colcon test' sometimes fails with test suite "tgrasp_ros2", due to a ROS2 service request failure (reported in ros2 examples issue [#228](https://github.com/ros2/examples/issues/228) and discussed in detail in ros2 demos issue [#304](https://github.com/ros2/demos/issues/304)) * Rviz2 fails to receive static TF from the camera due to transient_local QoS (a fix is expected in the upcoming ROS2 Eloquent, discussed in geometry2 issue [#183](https://github.com/ros2/geometry2/issues/183)); a workaround [patch](https://github.com/intel/ros2_intel_realsense/pull/88) is available until the adaptation to Eloquent ## Contribute to This Project Contributions to this project are welcome. 
Here are some recommended practices: * When adding a new feature, add tests covering the new functionality ```bash colcon test --packages-select <package_name> ``` * Before submitting a patch, run all existing tests to avoid regressions ```bash colcon test --packages-select <package_name> ``` ###### *Any security issue should be reported using the process at https://01.org/security* ================================================ FILE: docker/Dockerfile ================================================ ######################################################## # Based on Ubuntu 18.04 ######################################################## # Set the base image to ubuntu 18.04 FROM ubuntu:bionic MAINTAINER Liu Cong "congx.liu@intel.com" ARG DEPS_DIR=/root/deps WORKDIR $DEPS_DIR # install ros2 grasp library deps COPY ./script/ $DEPS_DIR/script/ RUN bash script/install_ros2_grasp_library.sh /root/deps WORKDIR /root ENTRYPOINT ["/root/script/ros_entrypoint.sh"] CMD ["bash"] ================================================ FILE: docker/README.md ================================================ # Precondition ## Add docker group ``` sudo groupadd docker sudo usermod -aG docker $USER ``` ## Build docker image ``` cd ros2_grasp_library/docker docker build -t intel/ros2:ros2_grasp_library_deps . ``` If you use a proxy: ``` docker build -t intel/ros2:ros2_grasp_library_deps --build-arg http_proxy=<proxy>:<port> --build-arg https_proxy=<proxy>:<port> . ``` ## Optional: verify the image was created successfully ``` docker images REPOSITORY TAG IMAGE ID CREATED SIZE intel/ros2 ros2_grasp_library_deps b6d619a01f33 1 hours ago 6.92GB ``` # Run OpenVINO Grasp Library with RGBD Camera ## Terminal 1: build ros2_grasp_library and launch Rviz2 to visualize detection results Rviz2 will pop up an X window, so set up the display environment first. 
``` ./setup_docker_display.sh docker run -it --rm --privileged -v /tmp/.X11-unix:/tmp/.X11-unix:rw -v /tmp/.docker.xauth:/tmp/.docker.xauth:rw -v /dev/bus/usb:/dev/bus/usb \ -v /dev:/dev:rw -e XAUTHORITY=/tmp/.docker.xauth -e DISPLAY --name ros2_grasp_library intel/ros2:ros2_grasp_library_deps bash # cd /root/ # mkdir -p ros2_ws/src # cd ros2_ws/src # git clone https://github.com/intel/ros2_grasp_library.git # git clone https://github.com/intel/ros2_intel_realsense.git -b refactor # git clone https://github.com/intel/ros2_openvino_toolkit.git # cd .. # colcon build --symlink-install --packages-select grasp_msgs moveit_msgs people_msgs grasp_ros2 realsense_msgs realsense_ros realsense_node # source ./install/local_setup.bash # ros2 run rviz2 rviz2 -d src/ros2_grasp_library/grasp_ros2/rviz2/grasp.rviz ``` ## Terminal 2: launch RGBD camera ``` docker exec -t -i ros2_grasp_library bash # source /root/ros2_ws/install/setup.bash # ros2 run realsense_node realsense_node ``` ## Terminal 3: launch Grasp Library ``` docker exec -t -i ros2_grasp_library bash # source /root/ros2_ws/install/setup.bash # ros2 run grasp_ros2 grasp_ros2 __params:=src/ros2_grasp_library/grasp_ros2/cfg/grasp_ros2_params.yaml ``` Note: if you have not installed Docker yet, or want more information on how to use it, see https://docs.docker.com/install/ ================================================ FILE: docker/script/00_ros2_install.sh ================================================ #!/bin/bash DEPS_DIR=${DEPS_PATH} ros2_version=dashing SUDO=$1 if [ "$SUDO" == "sudo" ];then SUDO="sudo" else SUDO="" fi # fix popup caused by libssl $SUDO apt-get install -y debconf-utils && echo 'libssl1.0.0:amd64 libraries/restart-without-asking boolean true' | $SUDO debconf-set-selections # Authorize gpg key with apt $SUDO apt-get update && $SUDO apt-get install -y curl gnupg2 lsb-release &&\ curl http://repo.ros2.org/repos.key | $SUDO apt-key add - # Add the repository 
to sources list $SUDO sh -c 'echo "deb [arch=amd64,arm64] http://packages.ros.org/ros2/ubuntu `lsb_release -cs` main" > /etc/apt/sources.list.d/ros2-latest.list' # Install development tools and ROS tools $SUDO apt-get update && $SUDO apt-get install -y \ python-rosdep \ python3-vcstool \ python3-colcon-common-extensions # Install ROS 2 packages echo "install $ros2_version" $SUDO apt-get update && $SUDO apt-get install -y ros-${ros2_version}-desktop ================================================ FILE: docker/script/10_eigen_install.sh ================================================ #!/bin/bash set -e DEPS_DIR=${DEPS_PATH} eigen_version=https://gitlab.com/libeigen/eigen/-/archive/3.2/eigen-3.2.tar.gz SUDO=$1 if [ "$SUDO" == "sudo" ];then SUDO="sudo" else SUDO="" fi cd $DEPS_DIR $SUDO apt-get install -y gfortran wget -t 3 -c $eigen_version tar -xvf eigen-3.2.tar.gz cd eigen-3.2 &&mkdir -p build && cd build cmake -DCMAKE_BUILD_TYPE=Release .. make -j4 $SUDO make install $SUDO rm -rf /usr/include/eigen3/ $SUDO ln -sf /usr/local/include/eigen3 /usr/include/ $SUDO make install ================================================ FILE: docker/script/11_libpcl_install.sh ================================================ #!/bin/bash set -e DEPS_DIR=${DEPS_PATH} pcl_version=https://github.com/PointCloudLibrary/pcl/archive/pcl-1.8.1.tar.gz SUDO=$1 if [ "$SUDO" == "sudo" ];then SUDO="sudo" else SUDO="" fi cd $DEPS_DIR $SUDO apt-get install -y libhdf5-dev python3-h5py python3-pip wget -t 3 -c $pcl_version tar -xvf pcl-1.8.1.tar.gz cd pcl-pcl-1.8.1 &&mkdir -p build && cd build cmake -DCMAKE_BUILD_TYPE=Release .. 
make -j4 $SUDO make install ================================================ FILE: docker/script/12_opencv_install.sh ================================================ #!/bin/bash DEPS_DIR=${DEPS_PATH} opencv_version=4.1.2 SUDO=$1 if [ "$SUDO" == "sudo" ];then SUDO="sudo" else SUDO="" fi #install opencv cd $DEPS_DIR $SUDO apt-get update && $SUDO apt-get install -y build-essential \ libgtk2.0-dev \ pkg-config \ libavcodec-dev \ libavformat-dev \ libswscale-dev \ python-dev \ python-numpy \ libtbb2 \ libtbb-dev \ libjpeg-dev \ libpng-dev \ libtiff-dev \ libdc1394-22-dev git clone --depth 1 https://github.com/opencv/opencv.git -b $opencv_version git clone --depth 1 https://github.com/opencv/opencv_contrib.git -b $opencv_version cd $DEPS_DIR/opencv mkdir -p build && cd build cd $DEPS_DIR/opencv/build cmake -D OPENCV_EXTRA_MODULES_PATH=$DEPS_DIR/opencv_contrib/modules \ -D CMAKE_BUILD_TYPE=Release \ -D CMAKE_INSTALL_PREFIX=/usr/local \ -D BUILD_EXAMPLES=ON \ -D BUILD_opencv_xfeatures2d=OFF \ .. 
make -j4 $SUDO make install echo "/usr/local/lib" | $SUDO tee /etc/ld.so.conf.d/opencv.conf $SUDO ldconfig ================================================ FILE: docker/script/13_openvino_install.sh ================================================ #!/bin/bash DEPS_DIR=${DEPS_PATH} MKL_URL=https://github.com/intel/mkl-dnn/releases/download/v0.19/mklml_lnx_2019.0.5.20190502.tgz MKL_VERSION=mklml_lnx_2019.0.5.20190502 OPENVINO_VERSION=2019_R3.1 SUDO=$1 if [ "$SUDO" == "" ];then SUDO="sudo" fi # install mkl 2019.0.5.20190502 $SUDO apt-get update && $SUDO apt-get install -y wget cd $DEPS_DIR wget -t 3 -c ${MKL_URL} &&\ tar -xvf ${MKL_VERSION}.tgz &&\ cd ${MKL_VERSION} &&\ $SUDO mkdir -p /usr/local/lib/mklml &&\ $SUDO cp -rf ./lib /usr/local/lib/mklml &&\ $SUDO cp -rf ./include /usr/local/lib/mklml &&\ $SUDO touch /usr/local/lib/mklml/version.info #install opencl 19.41.14441 cd $DEPS_DIR mkdir -p opencl && cd opencl &&\ wget -t 3 -c https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-gmmlib_19.3.2_amd64.deb &&\ wget -t 3 -c https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-igc-core_1.0.2597_amd64.deb &&\ wget -t 3 -c https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-igc-opencl_1.0.2597_amd64.deb &&\ wget -t 3 -c https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-opencl_19.41.14441_amd64.deb &&\ wget -t 3 -c https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-ocloc_19.41.14441_amd64.deb &&\ $SUDO dpkg -i *.deb #install cmake 3.14.3 if not already present if [ "$(cmake --version|grep "version"|awk '{print $3}')" != "3.14.3" ];then cd $DEPS_DIR wget -t 3 -c https://www.cmake.org/files/v3.14/cmake-3.14.3.tar.gz && \ tar xf cmake-3.14.3.tar.gz && \ (cd cmake-3.14.3 && ./bootstrap --parallel=$(nproc --all) && make --jobs=$(nproc --all) && $SUDO make install) && \ rm -rf cmake-3.14.3 cmake-3.14.3.tar.gz fi #install openvino 2019_R3.1 cd $DEPS_DIR $SUDO apt-get update && $SUDO 
apt-get install -y git git clone --depth 1 https://github.com/openvinotoolkit/openvino -b ${OPENVINO_VERSION} cd $DEPS_DIR/openvino/inference-engine git submodule update --init --recursive &&\ chmod +x install_dependencies.sh &&\ $SUDO ./install_dependencies.sh mkdir -p build && cd build &&\ cmake -DCMAKE_BUILD_TYPE=Release \ -DCMAKE_INSTALL_PREFIX=/usr/local \ -DGEMM=MKL -DMKLROOT=/usr/local/lib/mklml \ -DTHREADING=OMP \ -DENABLE_MKL_DNN=ON \ -DENABLE_CLDNN=ON \ -DENABLE_OPENCV=OFF \ .. cd $DEPS_DIR/openvino/inference-engine/build make -j8 cd $DEPS_DIR/openvino/inference-engine/build $SUDO mkdir -p /usr/share/InferenceEngine &&\ $SUDO cp InferenceEngineConfig*.cmake /usr/share/InferenceEngine &&\ $SUDO cp targets.cmake /usr/share/InferenceEngine &&\ echo `pwd`/../bin/intel64/Release/lib | $SUDO tee -a /etc/ld.so.conf.d/openvino.conf &&\ $SUDO ldconfig $SUDO mkdir -p /opt/openvino_toolkit && $SUDO ln -sf $DEPS_DIR/openvino /opt/openvino_toolkit/openvino ================================================ FILE: docker/script/20_librealsense_install.sh ================================================ #!/bin/bash DEPS_DIR=${DEPS_PATH} librealsense_version=2.31.0-0~realsense0.1791 SUDO=$1 if [ "$SUDO" == "sudo" ];then SUDO="sudo" else SUDO="" fi # install librealsense 2.31.0-0~realsense0.1791 echo "install librealsense ${librealsense_version}" cd $DEPS_DIR if [ "$http_proxy" == "" ];then $SUDO apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-key F6E65AC044F831AC80A06380C8B3A55A6F3EFCDE else $SUDO apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --keyserver-options http-proxy=$http_proxy --recv-key F6E65AC044F831AC80A06380C8B3A55A6F3EFCDE fi $SUDO sh -c 'echo "deb http://realsense-hw-public.s3.amazonaws.com/Debian/apt-repo `lsb_release -cs` main" > /etc/apt/sources.list.d/librealsense.list' $SUDO apt-get update && $SUDO apt-get install -y librealsense2=${librealsense_version} \ librealsense2-dev=${librealsense_version} \ librealsense2-udev-rules=${librealsense_version} \ 
librealsense2-gl=${librealsense_version} \ librealsense2-utils=${librealsense_version} \ librealsense2-dbg=${librealsense_version} \ librealsense2-dkms ================================================ FILE: docker/script/30_gpg_install.sh ================================================ #!/bin/bash DEPS_DIR=${DEPS_PATH} SUDO=$1 if [ "$SUDO" == "sudo" ];then SUDO="sudo" else SUDO="" fi # install gpg cd $DEPS_DIR wget -t 3 -c https://github.com/atenpas/gpg/archive/3dcd656d70f095ad1bda3d2fb597a994198466ab.zip unzip 3dcd656d70f095ad1bda3d2fb597a994198466ab.zip cd gpg-3dcd656d70f095ad1bda3d2fb597a994198466ab mkdir -p build && cd build cmake .. && make $SUDO make install ls /usr/local/lib/libgrasp_candidates_generator.so ================================================ FILE: docker/script/31_gpd_install.sh ================================================ #!/bin/bash DEPS_DIR=${DEPS_PATH} SUDO=$1 if [ "$SUDO" == "sudo" ];then SUDO="sudo" else SUDO="" fi # install gpd cd $DEPS_DIR git clone --depth 1 https://github.com/sharronliu/gpd.git -b libgpd cd gpd/src/gpd mkdir -p build && cd build cmake -DUSE_OPENVINO=ON .. && make $SUDO make install ================================================ FILE: docker/script/32_ur_modern_driver_install.sh ================================================ #!/bin/bash DEPS_DIR=${DEPS_PATH} SUDO=$1 if [ "$SUDO" == "sudo" ];then SUDO="sudo" else SUDO="" fi # install ur_modern_driver cd $DEPS_DIR git clone --depth 1 https://github.com/RoboticsYY/ur_modern_driver.git -b libur_modern_driver cd ur_modern_driver/libur_modern_driver mkdir -p build && cd build cmake -DCMAKE_BUILD_TYPE=Release .. 
&& make
$SUDO make install

================================================
FILE: docker/script/50_ros2_deps.sh
================================================
#!/bin/bash
SUDO=$1
if [ "$SUDO" == "sudo" ]; then SUDO="sudo"; else SUDO=""; fi

$SUDO apt-get install -y ros-dashing-object-msgs \
  python3-scipy \
  ros-dashing-eigen3-cmake-module

WORK_DIR=${DEPS_PATH}/../ros2_ws
mkdir -p $WORK_DIR/src && cd $WORK_DIR/src
git clone --depth 1 https://github.com/RoboticsYY/ros2_ur_description.git
git clone --depth 1 https://github.com/RoboticsYY/handeye
git clone --depth 1 https://github.com/RoboticsYY/criutils.git
git clone --depth 1 https://github.com/RoboticsYY/baldor.git
git clone --depth 1 https://github.com/intel/ros2_intel_realsense.git -b refactor
git clone --depth 1 https://github.com/intel/ros2_grasp_library.git
cd $WORK_DIR
source /opt/ros/dashing/setup.sh
export InferenceEngine_DIR=/opt/openvino_toolkit/openvino/inference-engine/build/
export CPU_EXTENSION_LIB=/opt/openvino_toolkit/openvino/inference-engine/bin/intel64/Release/lib/libcpu_extension.so
export GFLAGS_LIB=/opt/openvino_toolkit/openvino/inference-engine/bin/intel64/Release/lib/libgflags_nothreads.a
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$InferenceEngine_DIR/../bin/intel64/Release/lib:/usr/local/lib/mklml/lib
colcon build --symlink-install

================================================
FILE: docker/script/install_ros2_grasp_library.sh
================================================
#!/bin/bash
set -e
deps_path=$1
if [ -z "$deps_path" ]; then
  echo -e "warning:\n install_ros2_grasp_library.sh <deps-path>"
  echo -e "If you want to use 'sudo': install_ros2_grasp_library.sh <deps-path> sudo"
  exit 0
fi
shift

# mkdir deps-path
echo "DEPS_PATH = $deps_path"
mkdir -p $deps_path
export DEPS_PATH=$deps_path

CURRENT_DIR=$(dirname "$(readlink -f "${BASH_SOURCE[0]}")")
echo "CURRENT_DIR = ${CURRENT_DIR}"

# install ros2 dashing
bash ${CURRENT_DIR}/00_ros2_install.sh $@
# install eigen 3.2
bash \
${CURRENT_DIR}/10_eigen_install.sh $@
# install libpcl 1.8.1
bash ${CURRENT_DIR}/11_libpcl_install.sh $@
# install opencv 4.1.2
bash ${CURRENT_DIR}/12_opencv_install.sh $@
# install openvino 2019_R3.1
bash ${CURRENT_DIR}/13_openvino_install.sh $@
# install librealsense 2.31
bash ${CURRENT_DIR}/20_librealsense_install.sh $@
# install gpg
bash ${CURRENT_DIR}/30_gpg_install.sh $@
# install gpd
bash ${CURRENT_DIR}/31_gpd_install.sh $@
# install ur_modern_driver
bash ${CURRENT_DIR}/32_ur_modern_driver_install.sh $@
# build ros2 other deps
bash ${CURRENT_DIR}/50_ros2_deps.sh $@

================================================
FILE: docker/script/ros_entrypoint.sh
================================================
#!/bin/bash
set -e

# setup ros2 environment
source /opt/ros/dashing/setup.bash
source /root/ros2_ws/install/setup.bash
exec "$@"

================================================
FILE: docker/script/ros_env.sh
================================================
#!/bin/bash
ROS_PATH=$(pwd)

# setup ros2 environment
source /opt/ros/dashing/setup.bash
source ${ROS_PATH}/install/setup.bash
export ROS_DOMAIN_ID=100 # robot_group_id
export InferenceEngine_DIR=/opt/openvino_toolkit/openvino/inference-engine/build/
export CPU_EXTENSION_LIB=/opt/openvino_toolkit/openvino/inference-engine/bin/intel64/Release/lib/libcpu_extension.so
export GFLAGS_LIB=/opt/openvino_toolkit/openvino/inference-engine/bin/intel64/Release/lib/libgflags_nothreads.a
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$InferenceEngine_DIR/../bin/intel64/Release/lib:/usr/local/lib/mklml/lib

================================================
FILE: docker/setup_docker_display.sh
================================================
#!/bin/bash
set -e

# setup docker display
XSOCK=/tmp/.X11-unix
XAUTH=/tmp/.docker.xauth
touch $XAUTH
xauth nlist $DISPLAY | sed -e 's/^..../ffff/' | xauth -f $XAUTH nmerge -

================================================
FILE: grasp_apps/draw_x/CMakeLists.txt
================================================
# Copyright (c) 2019 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

cmake_minimum_required(VERSION 3.5)
project(draw_x)

if(NOT CMAKE_CXX_STANDARD)
  set(CMAKE_CXX_STANDARD 14)
endif()
if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
  add_compile_options(-Wall -Wextra -Wpedantic)
endif()

if(CMAKE_BUILD_TYPE STREQUAL "Release")
  message(STATUS "Create Release Build.")
  set(CMAKE_CXX_FLAGS "-O2 ${CMAKE_CXX_FLAGS}")
else()
  message(STATUS "Create Debug Build.")
endif()

find_package(ament_cmake REQUIRED)
find_package(rclcpp REQUIRED)
find_package(robot_interface REQUIRED)

include_directories(
  include
  ${rclcpp_INCLUDE_DIRS}
  ${robot_interface_INCLUDE_DIRS}
)

# draw_x app
add_executable(${PROJECT_NAME}
  src/draw_x.cpp
)
ament_target_dependencies(${PROJECT_NAME}
  "rclcpp"
  "robot_interface"
)
target_link_libraries(${PROJECT_NAME}
  ${ament_LIBRARIES}
  ${robot_interface_LIBRARIES}
)

# Install binaries
install(TARGETS ${PROJECT_NAME}
  RUNTIME DESTINATION bin
)
install(TARGETS ${PROJECT_NAME}
  DESTINATION lib/${PROJECT_NAME}
)

# Install launch files.
install(DIRECTORY launch
  DESTINATION share/${PROJECT_NAME}/
)

# Flags
if(UNIX OR APPLE)
  # Linker flags.
  if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU" OR ${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
    # GCC specific flags. ICC is compatible with them.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} -z noexecstack -z relro -z now")
    set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -z noexecstack -z relro -z now")
  elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
    # In Clang, -z flags are not compatible; they need to be passed to the linker via -Wl.
    set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} \
      -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
    set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} \
      -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
  endif()

  # Compiler flags.
  if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU")
    # GCC specific flags.
    if(CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.9 OR CMAKE_CXX_COMPILER_VERSION VERSION_EQUAL 4.9)
      set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector-strong")
    else()
      set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
    endif()
  elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
    # Clang is compatible with some of the flags.
    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
  elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
    # Same as above, with the exception that ICC compilation crashes with the -fPIE option, even
    # though it uses the -pie linker option that requires -fPIE during compilation. Checksec
    # shows that it generates a correct PIE anyway if only -pie is provided.
    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fstack-protector")
  endif()

  # Generic flags.
  set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIC -fno-operator-names -Wformat -Wformat-security \
    -Wall -fopenmp")
  set(CUDA_PROPAGATE_HOST_FLAGS OFF)
  set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} -D_FORTIFY_SOURCE=2")
  set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -pie")
endif()

ament_package()

================================================
FILE: grasp_apps/draw_x/launch/draw_x.launch.py
================================================
# Copyright (c) 2019 Intel Corporation.
All Rights Reserved # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import os import launch import launch.actions import launch.substitutions import launch_ros.actions from ament_index_python.packages import get_package_share_directory def generate_launch_description(): # .yaml file for configuring the parameters yaml = os.path.join( get_package_share_directory('draw_x'), 'launch', 'draw_x.yaml' ) return launch.LaunchDescription([ launch_ros.actions.Node( package='draw_x', node_executable='draw_x', output='screen', arguments=['__params:='+yaml]), ]) ================================================ FILE: grasp_apps/draw_x/launch/draw_x.yaml ================================================ robot_control: ros__parameters: host: "192.168.0.5" shutdown_on_disconnect: true joint_names: ["shoulder_pan_joint", "shoulder_lift_joint", "elbow_joint", "wrist_1_joint", "wrist_2_joint", "wrist_3_joint"] ================================================ FILE: grasp_apps/draw_x/package.xml ================================================ draw_x 0.5.0 A demo app for draw_x Yu Yan Yu Yan Apache License 2.0 ament_cmake rclcpp robot_interface rclcpp robot_interface ament_lint_auto ament_lint_common ament_cmake ================================================ FILE: grasp_apps/draw_x/src/draw_x.cpp ================================================ #include #include #include /* pose in joint values*/ static const std::vector HOME = {0.87, -1.44, 1.68, -1.81, -1.56, 0}; /* pose in [x, y, z, R, P, Y]*/ 
static const std::vector CORNER1_POSE = { 0.1, -0.65, 0.15, 3.14, 0, -3.14}; static const std::vector CORNER2_POSE = {-0.1, -0.45, 0.15, 3.14, 0, -3.14}; static const std::vector CORNER3_POSE = {-0.1, -0.65, 0.15, 3.14, 0, -3.14}; static const std::vector CORNER4_POSE = { 0.1, -0.45, 0.15, 3.14, 0, -3.14}; int main(int argc, char **argv) { rclcpp::init(argc, argv); // init robot control auto robot = std::make_shared("robot_control", rclcpp::NodeOptions().automatically_declare_parameters_from_overrides(true)); robot->parseArgs(); robot->startLoop(); rclcpp::sleep_for(2s); // Move to home robot->moveToJointValues(HOME, 1.05, 1.4); // Move to the first corner robot->moveToTcpPose(CORNER1_POSE[0], CORNER1_POSE[1], CORNER1_POSE[2], CORNER1_POSE[3], CORNER1_POSE[4], CORNER1_POSE[5], 1.05, 1.4); robot->moveToTcpPose(CORNER1_POSE[0], CORNER1_POSE[1], CORNER1_POSE[2] - 0.05, CORNER1_POSE[3], CORNER1_POSE[4], CORNER1_POSE[5], 1.05, 1.4); // Move to the second corner robot->moveToTcpPose(CORNER2_POSE[0], CORNER2_POSE[1], CORNER2_POSE[2] - 0.05, CORNER2_POSE[3], CORNER2_POSE[4], CORNER2_POSE[5], 1.05, 1.4); robot->moveToTcpPose(CORNER2_POSE[0], CORNER2_POSE[1], CORNER2_POSE[2], CORNER2_POSE[3], CORNER2_POSE[4], CORNER2_POSE[5], 1.05, 1.4); // Move to the third corner robot->moveToTcpPose(CORNER3_POSE[0], CORNER3_POSE[1], CORNER3_POSE[2], CORNER3_POSE[3], CORNER3_POSE[4], CORNER3_POSE[5], 1.05, 1.4); robot->moveToTcpPose(CORNER3_POSE[0], CORNER3_POSE[1], CORNER3_POSE[2] - 0.05, CORNER3_POSE[3], CORNER3_POSE[4], CORNER3_POSE[5], 1.05, 1.4); // Move to the fourth corner robot->moveToTcpPose(CORNER4_POSE[0], CORNER4_POSE[1], CORNER4_POSE[2] - 0.05, CORNER4_POSE[3], CORNER4_POSE[4], CORNER4_POSE[5], 1.05, 1.4); robot->moveToTcpPose(CORNER4_POSE[0], CORNER4_POSE[1], CORNER4_POSE[2], CORNER4_POSE[3], CORNER4_POSE[4], CORNER4_POSE[5], 1.05, 1.4); // Move back to home robot->moveToJointValues(HOME, 1.05, 1.4); rclcpp::shutdown(); return 0; } 
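The draw_x app above passes TCP goals as [x, y, z, R, P, Y], while other apps in this repository (e.g. fixed_position_pick) pass position-plus-quaternion poses. A minimal sketch of the RPY-to-quaternion conversion in Python, assuming the common convention of roll, pitch, yaw applied about the fixed X, Y, Z axes — the exact convention robot_interface expects is an assumption here, so verify against the driver before relying on it:

```python
import math

def rpy_to_quaternion(roll, pitch, yaw):
    """Convert roll/pitch/yaw angles (radians, fixed-axis XYZ convention)
    to a unit quaternion (qx, qy, qz, qw)."""
    cr, sr = math.cos(roll / 2.0), math.sin(roll / 2.0)
    cp, sp = math.cos(pitch / 2.0), math.sin(pitch / 2.0)
    cy, sy = math.cos(yaw / 2.0), math.sin(yaw / 2.0)
    qx = sr * cp * cy - cr * sp * sy
    qy = cr * sp * cy + sr * cp * sy
    qz = cr * cp * sy - sr * sp * cy
    qw = cr * cp * cy + sr * sp * sy
    return qx, qy, qz, qw

# The R, P, Y used by CORNER1_POSE above (3.14, 0, -3.14):
q = rpy_to_quaternion(3.14, 0.0, -3.14)
norm = math.sqrt(sum(c * c for c in q))  # a valid rotation quaternion has norm 1
```

This is only the forward conversion; the reverse (quaternion to RPY) is what random_pick.cpp does at runtime via `tf2::Matrix3x3::getRPY`.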
================================================
FILE: grasp_apps/fixed_position_pick/CMakeLists.txt
================================================
# Copyright (c) 2019 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

cmake_minimum_required(VERSION 3.5)
project(fixed_position_pick)

if(NOT CMAKE_CXX_STANDARD)
  set(CMAKE_CXX_STANDARD 14)
endif()
if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
  add_compile_options(-Wall -Wextra -Wpedantic)
endif()

if(CMAKE_BUILD_TYPE STREQUAL "Release")
  message(STATUS "Create Release Build.")
  set(CMAKE_CXX_FLAGS "-O2 ${CMAKE_CXX_FLAGS}")
else()
  message(STATUS "Create Debug Build.")
endif()

find_package(ament_cmake REQUIRED)
find_package(rclcpp REQUIRED)
find_package(robot_interface REQUIRED)

include_directories(
  include
  ${rclcpp_INCLUDE_DIRS}
  ${robot_interface_INCLUDE_DIRS}
)

# fixed_position_pick app
add_executable(${PROJECT_NAME}
  src/fixed_position_pick.cpp
)
ament_target_dependencies(${PROJECT_NAME}
  "rclcpp"
  "robot_interface"
)
target_link_libraries(${PROJECT_NAME}
  ${ament_LIBRARIES}
  ${robot_interface_LIBRARIES}
)

# Install binaries
install(TARGETS ${PROJECT_NAME}
  RUNTIME DESTINATION bin
)
install(TARGETS ${PROJECT_NAME}
  DESTINATION lib/${PROJECT_NAME}
)

# Install launch files.
install(DIRECTORY launch
  DESTINATION share/${PROJECT_NAME}/
)

# Flags
if(UNIX OR APPLE)
  # Linker flags.
  if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU" OR ${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
    # GCC specific flags. ICC is compatible with them.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} -z noexecstack -z relro -z now")
    set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -z noexecstack -z relro -z now")
  elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
    # In Clang, -z flags are not compatible; they need to be passed to the linker via -Wl.
    set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} \
      -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
    set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} \
      -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
  endif()

  # Compiler flags.
  if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU")
    # GCC specific flags.
    if(CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.9 OR CMAKE_CXX_COMPILER_VERSION VERSION_EQUAL 4.9)
      set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector-strong")
    else()
      set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
    endif()
  elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
    # Clang is compatible with some of the flags.
    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
  elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
    # Same as above, with the exception that ICC compilation crashes with the -fPIE option, even
    # though it uses the -pie linker option that requires -fPIE during compilation. Checksec
    # shows that it generates a correct PIE anyway if only -pie is provided.
    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fstack-protector")
  endif()

  # Generic flags.
  set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIC -fno-operator-names -Wformat -Wformat-security \
    -Wall -fopenmp")
  set(CUDA_PROPAGATE_HOST_FLAGS OFF)
  set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} -D_FORTIFY_SOURCE=2")
  set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -pie")
endif()

ament_package()

================================================
FILE: grasp_apps/fixed_position_pick/launch/fixed_position_pick.launch.py
================================================
# Copyright (c) 2019 Intel Corporation.
All Rights Reserved # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import os import launch import launch.actions import launch.substitutions import launch_ros.actions from ament_index_python.packages import get_package_share_directory def generate_launch_description(): # .yaml file for configuring the parameters yaml = os.path.join( get_package_share_directory('fixed_position_pick'), 'launch', 'fixed_position_pick.yaml' ) return launch.LaunchDescription([ launch_ros.actions.Node( package='fixed_position_pick', node_executable='fixed_position_pick', output='screen', arguments=['__params:='+yaml]), ]) ================================================ FILE: grasp_apps/fixed_position_pick/launch/fixed_position_pick.yaml ================================================ robot_control: ros__parameters: host: "192.168.0.5" shutdown_on_disconnect: true joint_names: ["shoulder_pan_joint", "shoulder_lift_joint", "elbow_joint", "wrist_1_joint", "wrist_2_joint", "wrist_3_joint"] ================================================ FILE: grasp_apps/fixed_position_pick/package.xml ================================================ fixed_position_pick 0.5.0 A demo app for draw_x Yu Yan Yu Yan Apache License 2.0 ament_cmake rclcpp robot_interface rclcpp robot_interface ament_lint_auto ament_lint_common ament_cmake ================================================ FILE: grasp_apps/fixed_position_pick/src/fixed_position_pick.cpp ================================================ #include #include #include /* 
pose in joint values*/ static const std::vector HOME = {0.87, -1.44, 1.68, -1.81, -1.56, 0}; /* pose in [x, y, z, qx, qy, qz, qw]*/ static const std::vector PICK_POSE = { -0.157402, -0.679509, 0.094437, 0.190600, 0.948295, 0.239947, 0.082662}; static const std::vector PLACE_POSE = {-0.350, -0.296, 0.145, -0.311507, 0.950216, -0.004305, 0.005879}; int main(int argc, char **argv) { rclcpp::init(argc, argv); // init robot control auto robot = std::make_shared("robot_control", rclcpp::NodeOptions().automatically_declare_parameters_from_overrides(true)); robot->parseArgs(); robot->startLoop(); rclcpp::sleep_for(2s); // Move to home robot->moveToJointValues(HOME, 1.05, 1.4); // Pick geometry_msgs::msg::PoseStamped pose_pick; pose_pick.header.frame_id = "base"; pose_pick.header.stamp = robot->now(); pose_pick.pose.position.x = PICK_POSE[0]; pose_pick.pose.position.y = PICK_POSE[1]; pose_pick.pose.position.z = PICK_POSE[2]; pose_pick.pose.orientation.x = PICK_POSE[3]; pose_pick.pose.orientation.y = PICK_POSE[4]; pose_pick.pose.orientation.z = PICK_POSE[5]; pose_pick.pose.orientation.w = PICK_POSE[6]; robot->pick(pose_pick, 1.05, 1.4, 0.5, 0.1); // Place geometry_msgs::msg::PoseStamped pose_place; pose_place.header.frame_id = "base"; pose_place.header.stamp = robot->now(); pose_place.pose.position.x = PLACE_POSE[0]; pose_place.pose.position.y = PLACE_POSE[1]; pose_place.pose.position.z = PLACE_POSE[2]; pose_place.pose.orientation.x = PLACE_POSE[3]; pose_place.pose.orientation.y = PLACE_POSE[4]; pose_place.pose.orientation.z = PLACE_POSE[5]; pose_place.pose.orientation.w = PLACE_POSE[6]; robot->place(pose_place, 1.05, 1.4, 0.5, 0.1); // Move back to home robot->moveToJointValues(HOME, 1.05, 1.4); rclcpp::shutdown(); return 0; } ================================================ FILE: grasp_apps/random_pick/CMakeLists.txt ================================================ # Copyright (c) 2019 Intel Corporation # # Licensed under the Apache License, Version 2.0 (the "License"); # 
you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

option(BUILD_RANDOM_PICK "build random_pick app" ON)
if(NOT BUILD_RANDOM_PICK STREQUAL "ON")
  return()
endif()

cmake_minimum_required(VERSION 3.5)
project(random_pick)

if(NOT CMAKE_CXX_STANDARD)
  set(CMAKE_CXX_STANDARD 14)
endif()
if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
  add_compile_options(-Wall -Wextra -Wpedantic)
endif()

if(CMAKE_BUILD_TYPE STREQUAL "Release")
  message(STATUS "Create Release Build.")
  set(CMAKE_CXX_FLAGS "-O2 ${CMAKE_CXX_FLAGS}")
else()
  message(STATUS "Create Debug Build.")
endif()

find_package(ament_cmake REQUIRED)
find_package(rclcpp REQUIRED)
find_package(moveit_msgs REQUIRED)
find_package(robot_interface REQUIRED)
find_package(tf2_ros REQUIRED)

include_directories(
  include
  ${rclcpp_INCLUDE_DIRS}
  ${moveit_msgs_INCLUDE_DIRS}
  ${robot_interface_INCLUDE_DIRS}
  ${tf2_ros_INCLUDE_DIRS}
)

# random_pick app
add_executable(${PROJECT_NAME}
  src/random_pick.cpp
)
ament_target_dependencies(${PROJECT_NAME}
  "rclcpp"
  "moveit_msgs"
  "robot_interface"
  "tf2_ros"
)
target_link_libraries(${PROJECT_NAME}
  ${ament_LIBRARIES}
  ${robot_interface_LIBRARIES}
)

# Install binaries
install(TARGETS ${PROJECT_NAME}
  RUNTIME DESTINATION bin
)
install(TARGETS ${PROJECT_NAME}
  DESTINATION lib/${PROJECT_NAME}
)

# Flags
if(UNIX OR APPLE)
  # Linker flags.
  if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU" OR ${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
    # GCC specific flags. ICC is compatible with them.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} -z noexecstack -z relro -z now")
    set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -z noexecstack -z relro -z now")
  elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
    # In Clang, -z flags are not compatible; they need to be passed to the linker via -Wl.
    set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} \
      -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
    set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} \
      -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
  endif()

  # Compiler flags.
  if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU")
    # GCC specific flags.
    if(CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.9 OR CMAKE_CXX_COMPILER_VERSION VERSION_EQUAL 4.9)
      set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector-strong")
    else()
      set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
    endif()
  elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
    # Clang is compatible with some of the flags.
    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
  elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
    # Same as above, with the exception that ICC compilation crashes with the -fPIE option, even
    # though it uses the -pie linker option that requires -fPIE during compilation. Checksec
    # shows that it generates a correct PIE anyway if only -pie is provided.
    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fstack-protector")
  endif()

  # Generic flags.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIC -fno-operator-names -Wformat -Wformat-security \ -Wall -fopenmp") set( CUDA_PROPAGATE_HOST_FLAGS OFF ) set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} -D_FORTIFY_SOURCE=2") set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -pie") endif() ament_package() ================================================ FILE: grasp_apps/random_pick/cfg/random_pick.yaml ================================================ robot_control: ros__parameters: host: "192.168.1.5" ================================================ FILE: grasp_apps/random_pick/package.xml ================================================ random_pick 0.5.0 A demo app for grasp detection, and random picking Sharron LIU Sharron LIU Apache License 2.0 ament_cmake rclcpp moveit_msgs people_msgs robot_interface tf2_ros rclcpp moveit_msgs people_msgs robot_interface tf2_ros ament_lint_auto ament_lint_common ament_cmake ================================================ FILE: grasp_apps/random_pick/src/random_pick.cpp ================================================ #include #include #include #include #include #include #include #include #define robot_enable using GraspPlanning = moveit_msgs::srv::GraspPlanning; /* pick position in [x, y, z, R, P, Y]*/ static std::vector pick_ = {0.0, -0.54, 0.145, 3.14, 0.0, 1.956}; /* place position in [x, y, z, R, P, Y]*/ static std::vector place_ = {-0.50, -0.30, 0.20, 3.14, 0.0, 1.956}; /* pre-pick position in joint values*/ static std::vector joint_values_pick = {1.065, -1.470, 1.477, -1.577, -1.556, 0}; /* place position in joint values*/ static std::vector joint_values_place = {0.385, -1.470, 1.477, -1.577, -1.556, 0}; static double vel_ = 0.9, acc_ = 0.9, vscale_ = 0.9, appr_ = 0.1; static std::shared_ptr robot_ = nullptr; static rclcpp::Node::SharedPtr node_ = nullptr; static std::shared_ptr result_ = nullptr; int main(int argc, char **argv) { rclcpp::init(argc, argv); // init robot control robot_ = 
std::make_shared("robot_control", rclcpp::NodeOptions().automatically_declare_parameters_from_overrides(true)); robot_->parseArgs(); robot_->startLoop(); rclcpp::sleep_for(2s); #ifdef robot_enable // reset joint robot_->moveToJointValues(joint_values_place, vel_, acc_); #endif // init random pick node node_ = rclcpp::Node::make_shared("random_pick"); tf2_ros::StaticTransformBroadcaster tfb(node_); // create client for grasp planning auto client = node_->create_client("plan_grasps"); // wait for service while (!client->wait_for_service(5s)) { RCLCPP_INFO(node_->get_logger(), "Wait for service"); } RCLCPP_INFO(node_->get_logger(), "Got service"); while(rclcpp::ok()) { // request grasp poses auto request = std::make_shared(); auto result_future = client->async_send_request(request); RCLCPP_INFO(node_->get_logger(), "Request sent"); // wait for response if (rclcpp::spin_until_future_complete(node_, result_future) != rclcpp::executor::FutureReturnCode::SUCCESS) { continue; } // get response if (moveit_msgs::msg::MoveItErrorCodes::SUCCESS == result_future.get()->error_code.val) { result_ = result_future.get(); RCLCPP_INFO(node_->get_logger(), "Response received %d", result_->error_code.val); } else continue; geometry_msgs::msg::PoseStamped p = result_->grasps[0].grasp_pose; // publish grasp pose tf2::Quaternion q(p.pose.orientation.x, p.pose.orientation.y, p.pose.orientation.z, p.pose.orientation.w); double roll, pitch, yaw; tf2::Matrix3x3 r; r.setRotation(q); r.getRPY(roll, pitch, yaw); RCLCPP_INFO(node_->get_logger(), "**********pick pose [position %f %f %f, quat %f %f %f %f, RPY %f %f %f]", p.pose.position.x, p.pose.position.y, p.pose.position.z, p.pose.orientation.x, p.pose.orientation.y, p.pose.orientation.z, p.pose.orientation.w, roll, pitch, yaw); geometry_msgs::msg::TransformStamped tf_msg; tf_msg.header = p.header; tf_msg.child_frame_id = "grasp_pose"; tf_msg.transform.translation.x = p.pose.position.x; tf_msg.transform.translation.y = p.pose.position.y; 
tf_msg.transform.translation.z = p.pose.position.z; tf_msg.transform.rotation = p.pose.orientation; tfb.sendTransform(tf_msg); #ifdef robot_enable // pick robot_->moveToJointValues(joint_values_pick, vel_, acc_); robot_->pick(p, vel_, acc_, vscale_, appr_); // place robot_->moveToJointValues(joint_values_place, vel_, acc_); robot_->place(place_[0], place_[1], place_[2], place_[3], place_[4], place_[5], vel_, acc_, vscale_, appr_); #endif } rclcpp::shutdown(); return 0; } ================================================ FILE: grasp_apps/recognize_pick/CMakeLists.txt ================================================ # Copyright (c) 2019 Intel Corporation # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
option(BUILD_RECOGNIZE_PICK "build recognize_pick app" OFF)
if(NOT BUILD_RECOGNIZE_PICK STREQUAL "ON")
  return()
endif()

cmake_minimum_required(VERSION 3.5)
project(recognize_pick)

if(NOT CMAKE_CXX_STANDARD)
  set(CMAKE_CXX_STANDARD 14)
endif()
if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
  add_compile_options(-Wall -Wextra -Wpedantic)
endif()

if(CMAKE_BUILD_TYPE STREQUAL "Release")
  message(STATUS "Create Release Build.")
  set(CMAKE_CXX_FLAGS "-O2 ${CMAKE_CXX_FLAGS}")
else()
  message(STATUS "Create Debug Build.")
endif()

find_package(ament_cmake REQUIRED)
find_package(rclcpp REQUIRED)
find_package(moveit_msgs REQUIRED)
find_package(people_msgs REQUIRED)
find_package(robot_interface REQUIRED)
find_package(tf2_ros REQUIRED)

include_directories(
  include
  ${rclcpp_INCLUDE_DIRS}
  ${moveit_msgs_INCLUDE_DIRS}
  ${people_msgs_INCLUDE_DIRS}
  ${robot_interface_INCLUDE_DIRS}
  ${tf2_ros_INCLUDE_DIRS}
)

# recognize_pick app
add_executable(${PROJECT_NAME}
  src/recognize_pick.cpp
)
ament_target_dependencies(${PROJECT_NAME}
  "rclcpp"
  "moveit_msgs"
  "people_msgs"
  "robot_interface"
  "tf2_ros"
)
target_link_libraries(${PROJECT_NAME}
  ${ament_LIBRARIES}
  ${robot_interface_LIBRARIES}
)

# place publisher app
set(PLACE_PUBLISHER place_publisher)
add_executable(${PLACE_PUBLISHER}
  src/place_publisher.cpp
)
ament_target_dependencies(${PLACE_PUBLISHER}
  "rclcpp"
  "moveit_msgs"
)
target_link_libraries(${PLACE_PUBLISHER}
  ${ament_LIBRARIES}
)

# Install binaries
install(TARGETS ${PROJECT_NAME} ${PLACE_PUBLISHER}
  RUNTIME DESTINATION bin
)
install(TARGETS ${PROJECT_NAME} ${PLACE_PUBLISHER}
  DESTINATION lib/${PROJECT_NAME}
)

# Flags
if(UNIX OR APPLE)
  # Linker flags.
  if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU" OR ${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
    # GCC specific flags. ICC is compatible with them.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} -z noexecstack -z relro -z now")
    set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -z noexecstack -z relro -z now")
  elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
    # In Clang, -z flags are not compatible; they need to be passed to the linker via -Wl.
    set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} \
      -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
    set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} \
      -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
  endif()

  # Compiler flags.
  if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU")
    # GCC specific flags.
    if(CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.9 OR CMAKE_CXX_COMPILER_VERSION VERSION_EQUAL 4.9)
      set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector-strong")
    else()
      set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
    endif()
  elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
    # Clang is compatible with some of the flags.
    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
  elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
    # Same as above, with the exception that ICC compilation crashes with the -fPIE option, even
    # though it uses the -pie linker option that requires -fPIE during compilation. Checksec
    # shows that it generates a correct PIE anyway if only -pie is provided.
    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fstack-protector")
  endif()

  # Generic flags.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIC -fno-operator-names -Wformat -Wformat-security \ -Wall -fopenmp") set( CUDA_PROPAGATE_HOST_FLAGS OFF ) set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} -D_FORTIFY_SOURCE=2") set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -pie") endif() ament_package() ================================================ FILE: grasp_apps/recognize_pick/cfg/recognize_pick.yaml ================================================ robot_control: ros__parameters: host: "192.168.1.5" ================================================ FILE: grasp_apps/recognize_pick/package.xml ================================================ recognize_pick 0.5.0 A demo app for object segmentation, grasp detection, and picking Sharron LIU Sharron LIU Apache License 2.0 ament_cmake rclcpp moveit_msgs people_msgs robot_interface tf2_ros rclcpp moveit_msgs people_msgs robot_interface tf2_ros ament_lint_auto ament_lint_common ament_cmake ================================================ FILE: grasp_apps/recognize_pick/src/place_publisher.cpp ================================================ // Copyright (c) 2019 Intel Corporation // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. 
#include <moveit_msgs/msg/place_location.hpp> #include <rclcpp/rclcpp.hpp> #include <string> #include <vector> int main(int argc, char ** argv) { std::vector<std::string> args = rclcpp::init_and_remove_ros_arguments(argc, argv); auto node = rclcpp::Node::make_shared("PlacePublisher"); auto pub = node->create_publisher<moveit_msgs::msg::PlaceLocation>("/recognize_pick/place", 10); rclcpp::Clock clock(RCL_ROS_TIME); moveit_msgs::msg::PlaceLocation p; if (args.size() < 5) { p.place_pose.pose.position.x = -0.45; p.place_pose.pose.position.y = -0.30; p.place_pose.pose.position.z = 0.125; } else { p.place_pose.pose.position.x = atof(args[2].c_str()); p.place_pose.pose.position.y = atof(args[3].c_str()); p.place_pose.pose.position.z = atof(args[4].c_str()); } if (args.size() < 2) { RCLCPP_INFO(node->get_logger(), "Place publisher: specify the object name and place position."); RCLCPP_INFO(node->get_logger(), "Usage: place_publisher object_name [x y z]"); RCLCPP_INFO(node->get_logger(), "Example: place_publisher sports_ball"); RCLCPP_INFO(node->get_logger(), "Example: place_publisher sports_ball -0.45 -0.30 0.125"); rclcpp::shutdown(); return 0; } else { p.id = args[1]; } RCLCPP_INFO(node->get_logger(), "place publisher %s [%f %f %f]", p.id.c_str(), p.place_pose.pose.position.x, p.place_pose.pose.position.y, p.place_pose.pose.position.z); while (rclcpp::ok()) { p.place_pose.header.stamp = clock.now(); p.place_pose.header.frame_id = "base"; pub->publish(p); rclcpp::Rate(0.5).sleep(); } rclcpp::shutdown(); return 0; } ================================================ FILE: grasp_apps/recognize_pick/src/recognize_pick.cpp ================================================ #include <geometry_msgs/msg/transform_stamped.hpp> #include <moveit_msgs/msg/place_location.hpp> #include <moveit_msgs/srv/grasp_planning.hpp> #include <rclcpp/rclcpp.hpp> #include <robot_interface/control_ur.hpp> #include <tf2/LinearMath/Matrix3x3.h> #include <tf2/LinearMath/Quaternion.h> #include <tf2_ros/static_transform_broadcaster.h> #include <chrono> #include <memory> #include <vector> #define robot_enable using GraspPlanning = moveit_msgs::srv::GraspPlanning; using namespace std::chrono_literals; /* pick position in [x, y, z, R, P, Y]*/ static std::vector<double> pick_ = {0.0, -0.54, 0.145, 3.14, 0.0, 1.956}; /* place position in [x, y, z, R, P, Y]*/ static std::vector<double> place_ = {-0.45, -0.30, 0.125, 3.14, 0.0, 1.956}; /* pre-pick position in joint values*/ static std::vector<double>
joint_values_pick = {1.065, -1.470, 1.477, -1.577, -1.556, 0}; /* place position in joint values*/ static std::vector<double> joint_values_place = {0.385, -1.470, 1.477, -1.577, -1.556, 0}; static double vel_ = 0.4, acc_ = 0.4, vscale_ = 0.5, appr_ = 0.1; static std::shared_ptr<URControl> robot_ = nullptr; static rclcpp::Node::SharedPtr node_ = nullptr; static std::shared_ptr<GraspPlanning::Response> result_ = nullptr; static moveit_msgs::msg::PlaceLocation::SharedPtr place_pose_ = nullptr; static void place_callback(const moveit_msgs::msg::PlaceLocation::SharedPtr msg) { place_pose_ = msg; } int main(int argc, char ** argv) { rclcpp::init(argc, argv); // init robot control (URControl is from robot_interface/control_ur.hpp) robot_ = std::make_shared<URControl>("robot_control", rclcpp::NodeOptions().automatically_declare_parameters_from_overrides(true)); robot_->parseArgs(); robot_->startLoop(); rclcpp::sleep_for(2s); #ifdef robot_enable // reset joint robot_->moveToJointValues(joint_values_place, vel_, acc_); #endif // init random pick node node_ = rclcpp::Node::make_shared("random_pick"); tf2_ros::StaticTransformBroadcaster tfb(node_); // subscribe place callback auto sub = node_->create_subscription<moveit_msgs::msg::PlaceLocation>( "/recognize_pick/place", rclcpp::QoS(rclcpp::KeepLast(0)), place_callback); // create client auto client = node_->create_client<GraspPlanning>("plan_grasps"); // wait for service while (!client->wait_for_service(5s)) { RCLCPP_INFO(node_->get_logger(), "Wait for service"); } RCLCPP_INFO(node_->get_logger(), "Got service"); while (rclcpp::ok()) { if (place_pose_ == nullptr) { RCLCPP_INFO(node_->get_logger(), "Wait for place mission"); rclcpp::spin_some(node_); rclcpp::sleep_for(std::chrono::seconds(2)); continue; } moveit_msgs::msg::PlaceLocation::SharedPtr place = place_pose_; RCLCPP_INFO(node_->get_logger(), "Place %s", place->id.c_str()); // get grasp poses auto request = std::make_shared<GraspPlanning::Request>(); request->target.id = place->id; auto result_future = client->async_send_request(request); RCLCPP_INFO(node_->get_logger(), "Request sent"); // wait for response if
(rclcpp::spin_until_future_complete(node_, result_future) != rclcpp::executor::FutureReturnCode::SUCCESS) { continue; } // get response if (moveit_msgs::msg::MoveItErrorCodes::SUCCESS == result_future.get()->error_code.val) { result_ = result_future.get(); RCLCPP_INFO(node_->get_logger(), "Response received %d", result_->error_code.val); } else continue; geometry_msgs::msg::PoseStamped p = result_->grasps[0].grasp_pose; // publish grasp pose tf2::Quaternion q(p.pose.orientation.x, p.pose.orientation.y, p.pose.orientation.z, p.pose.orientation.w); double roll, pitch, yaw; tf2::Matrix3x3 r; r.setRotation(q); r.getRPY(roll, pitch, yaw); RCLCPP_INFO(node_->get_logger(), "**********pick pose [position %f %f %f, quat %f %f %f %f, RPY %f %f %f]", p.pose.position.x, p.pose.position.y, p.pose.position.z, p.pose.orientation.x, p.pose.orientation.y, p.pose.orientation.z, p.pose.orientation.w, roll, pitch, yaw); geometry_msgs::msg::TransformStamped tf_msg; tf_msg.header = p.header; tf_msg.child_frame_id = "grasp_pose"; tf_msg.transform.translation.x = p.pose.position.x; tf_msg.transform.translation.y = p.pose.position.y; tf_msg.transform.translation.z = p.pose.position.z; tf_msg.transform.rotation = p.pose.orientation; tfb.sendTransform(tf_msg); #ifdef robot_enable // pick robot_->moveToJointValues(joint_values_pick, vel_, acc_); robot_->pick(p, vel_, acc_, vscale_, appr_); // place robot_->moveToJointValues(joint_values_place, vel_, acc_); robot_->place(place_[0], place_[1], place_[2], place_[3], place_[4], place_[5], vel_, acc_, vscale_, appr_); #endif rclcpp::spin_some(node_); place_pose_ = nullptr; } rclcpp::shutdown(); return 0; } ================================================ FILE: grasp_msgs/CMakeLists.txt ================================================ cmake_minimum_required(VERSION 3.5) project(grasp_msgs) if(NOT CMAKE_CXX_STANDARD) set(CMAKE_CXX_STANDARD 14) endif() if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang") add_compile_options(-Wall 
-Wextra -Wpedantic) endif() find_package(ament_cmake REQUIRED) find_package(rosidl_default_generators REQUIRED) find_package(builtin_interfaces REQUIRED) find_package(std_msgs REQUIRED) find_package(geometry_msgs REQUIRED) set(msg_files "msg/GraspConfig.msg" "msg/GraspConfigList.msg" "msg/SamplesMsg.msg" ) rosidl_generate_interfaces(${PROJECT_NAME} ${msg_files} DEPENDENCIES builtin_interfaces std_msgs geometry_msgs ADD_LINTER_TESTS ) ament_export_dependencies(rosidl_default_runtime) ament_package() ================================================ FILE: grasp_msgs/msg/CloudIndexed.msg ================================================ # This message holds a point cloud and a list of indices into the point cloud # at which to sample grasp candidates. # The point cloud. gpd/CloudSources cloud_sources # The indices into the point cloud at which to sample grasp candidates. std_msgs/Int64[] indices ================================================ FILE: grasp_msgs/msg/CloudSamples.msg ================================================ # This message holds a point cloud and a list of samples at which the grasp # detector should search for grasp candidates. # The point cloud. gpd/CloudSources cloud_sources # The samples, as (x,y,z) points, at which to search for grasp candidates. geometry_msgs/Point[] samples ================================================ FILE: grasp_msgs/msg/CloudSources.msg ================================================ # This message holds a point cloud that can be a combination of point clouds # from different camera sources (at least one). For each point in the cloud, # this message also stores the index of the camera that produced the point. # The point cloud. sensor_msgs/PointCloud2 cloud # For each point in the cloud, the index of the camera that acquired the point. std_msgs/Int64[] camera_source # A list of camera positions at which the point cloud was acquired. 
geometry_msgs/Point[] view_points ================================================ FILE: grasp_msgs/msg/GraspConfig.msg ================================================ # This message describes a 2-finger grasp configuration by its 6-DOF pose, # consisting of a 3-DOF position and 3-DOF orientation, and the opening # width of the robot hand. # Position geometry_msgs/Point bottom # centered bottom/base of the robot hand geometry_msgs/Point top # centered top/fingertip of the robot hand geometry_msgs/Point surface # centered position on object surface # Orientation represented as three axes (R = [approach binormal axis]) geometry_msgs/Vector3 approach # The grasp approach direction geometry_msgs/Vector3 binormal # The binormal geometry_msgs/Vector3 axis # The hand axis geometry_msgs/Point sample # Point at which the grasp was found std_msgs/Float32 width # Required aperture (opening width) of the robot hand std_msgs/Float32 score # Score assigned to the grasp by the classifier ================================================ FILE: grasp_msgs/msg/GraspConfigList.msg ================================================ # This message stores a list of grasp configurations. # The time of acquisition, and the coordinate frame ID. std_msgs/Header header # The list of grasp configurations. grasp_msgs/GraspConfig[] grasps # Name of the known object these grasps are associated with. string object_name ================================================ FILE: grasp_msgs/msg/SamplesMsg.msg ================================================ # This message describes a set of point samples at which to detect grasps. # Header std_msgs/Header header # The samples, as (x,y,z) points, at which to search for grasp candidates.
geometry_msgs/Point[] samples ================================================ FILE: grasp_msgs/package.xml ================================================ grasp_msgs 0.5.0 ROS2 messages definitions for grasp library Sharron LIU Apache License 2.0 ament_cmake rosidl_default_generators builtin_interfaces std_msgs geometry_msgs rosidl_default_runtime builtin_interfaces std_msgs geometry_msgs ament_lint_common rosidl_interface_packages ament_cmake ================================================ FILE: grasp_ros2/CMakeLists.txt ================================================ # Copyright (c) 2018 Intel Corporation # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
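The GraspConfig message defined earlier encodes the hand orientation as three orthonormal axes (R = [approach binormal axis]). Turning that rotation matrix into a quaternion, as a pose-oriented consumer of these messages would need to, is plain linear algebra. A self-contained sketch follows; `quaternion_from_axes` is an illustrative name, not an API from this repository:

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

// Build a unit quaternion (x, y, z, w) from the three orthonormal columns
// R = [approach binormal axis] of a GraspConfig message (standard
// rotation-matrix-to-quaternion conversion with branch selection for
// numerical stability).
std::array<double, 4> quaternion_from_axes(const Vec3 & a, const Vec3 & b, const Vec3 & c)
{
  // Matrix entries m[row][col]; columns are a (approach), b (binormal), c (axis).
  const double m00 = a[0], m01 = b[0], m02 = c[0];
  const double m10 = a[1], m11 = b[1], m12 = c[1];
  const double m20 = a[2], m21 = b[2], m22 = c[2];
  const double trace = m00 + m11 + m22;
  std::array<double, 4> q;  // x, y, z, w
  if (trace > 0.0) {
    double s = std::sqrt(trace + 1.0) * 2.0;  // s = 4 * w
    q[3] = 0.25 * s;
    q[0] = (m21 - m12) / s;
    q[1] = (m02 - m20) / s;
    q[2] = (m10 - m01) / s;
  } else if (m00 > m11 && m00 > m22) {
    double s = std::sqrt(1.0 + m00 - m11 - m22) * 2.0;  // s = 4 * x
    q[3] = (m21 - m12) / s;
    q[0] = 0.25 * s;
    q[1] = (m01 + m10) / s;
    q[2] = (m02 + m20) / s;
  } else if (m11 > m22) {
    double s = std::sqrt(1.0 + m11 - m00 - m22) * 2.0;  // s = 4 * y
    q[3] = (m02 - m20) / s;
    q[0] = (m01 + m10) / s;
    q[1] = 0.25 * s;
    q[2] = (m12 + m21) / s;
  } else {
    double s = std::sqrt(1.0 + m22 - m00 - m11) * 2.0;  // s = 4 * z
    q[3] = (m10 - m01) / s;
    q[0] = (m02 + m20) / s;
    q[1] = (m12 + m21) / s;
    q[2] = 0.25 * s;
  }
  return q;
}
```

In the actual nodes this conversion is delegated to tf2/Eigen; the sketch only makes explicit what the three axis fields of the message represent.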
cmake_minimum_required(VERSION 3.5) project(grasp_ros2) if(NOT CMAKE_CXX_STANDARD) set(CMAKE_CXX_STANDARD 14) endif() if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang") add_compile_options(-Wall -Wextra -Wpedantic) endif() # Note: string comparison needs STREQUAL; EQUAL is for numbers. if(CMAKE_BUILD_TYPE STREQUAL "Release") message(STATUS "Create Release Build.") set(CMAKE_CXX_FLAGS "-O2 ${CMAKE_CXX_FLAGS}") else() message(STATUS "Create Debug Build.") endif() if(BUILD_RECOGNIZE_PICK STREQUAL "ON") add_definitions("-DRECOGNIZE_PICK") endif() find_package(ament_cmake REQUIRED) find_package(builtin_interfaces REQUIRED) find_package(rclcpp REQUIRED) find_package(rclcpp_components REQUIRED) find_package(grasp_msgs REQUIRED) find_package(sensor_msgs REQUIRED) find_package(moveit_msgs) if(BUILD_RECOGNIZE_PICK STREQUAL "ON") find_package(people_msgs) endif() find_package(visualization_msgs) find_package(tf2) find_package(tf2_ros REQUIRED) find_package(tf2_geometry_msgs REQUIRED) find_package(trajectory_msgs REQUIRED) find_package(pcl_conversions REQUIRED) find_package(Eigen3 REQUIRED) # GPG find_library(gpg_LIBRARIES grasp_candidates_generator) find_path(gpg_INCLUDE_DIRS gpg/grasp.h) # GPD find_library(gpd_LIBRARIES grasp_pose_detection) find_path(gpd_INCLUDE_DIRS gpd/grasp_detector.h) # PCL find_package(PCL 1.8.1 EXACT) include_directories(${PCL_INCLUDE_DIRS}) link_directories(${PCL_LIBRARY_DIRS}) add_definitions(${PCL_DEFINITIONS}) include_directories( include ${rclcpp_INCLUDE_DIRS} ${builtin_interfaces_INCLUDE_DIRS} ${grasp_msgs_INCLUDE_DIRS} ${sensor_msgs_INCLUDE_DIRS} ${moveit_msgs_INCLUDE_DIRS} ${tf2_geometry_msgs_INCLUDE_DIRS} ${trajectory_msgs_INCLUDE_DIRS} ${visualization_msgs_INCLUDE_DIRS} ${pcl_conversions_INCLUDE_DIRS} ${tf2_INCLUDE_DIRS} ${gpg_INCLUDE_DIRS} ${gpd_INCLUDE_DIRS} ) # if() cannot appear inside a command invocation; add the conditional include dirs separately. if(BUILD_RECOGNIZE_PICK STREQUAL "ON") include_directories(${people_msgs_INCLUDE_DIRS}) endif() # create ament index resource which references the libraries in the binary dir set(node_plugins "") # grasp detect set(libgrasp_detect "grasp_detect")
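The GraspPlanner built in this package filters detected grasps by approach direction and workspace bounds, driven by the `grasp_approach`, `grasp_approach_angle`, and `grasp_boundry` parameters that appear in the cfg files of this package. The underlying checks are simple vector math; the following is a minimal sketch with hypothetical helper names (`approach_angle`, `in_boundary`), not the planner's actual member functions:

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

// Angle in radians between the expected approach direction (e.g. [0, 0, -1])
// and a grasp's actual approach vector; both assumed non-zero.
double approach_angle(const Vec3 & expected, const Vec3 & actual)
{
  double dot = 0.0, ne = 0.0, na = 0.0;
  for (int i = 0; i < 3; ++i) {
    dot += expected[i] * actual[i];
    ne += expected[i] * expected[i];
    na += actual[i] * actual[i];
  }
  double c = dot / (std::sqrt(ne) * std::sqrt(na));
  c = std::fmax(-1.0, std::fmin(1.0, c));  // guard acos against rounding
  return std::acos(c);
}

// True when point p lies inside the axis-aligned box given as
// [x_min, x_max, y_min, y_max, z_min, z_max] (the grasp_boundry layout).
bool in_boundary(const Vec3 & p, const std::array<double, 6> & b)
{
  return p[0] >= b[0] && p[0] <= b[1] &&
         p[1] >= b[2] && p[1] <= b[3] &&
         p[2] >= b[4] && p[2] <= b[5];
}
```

A grasp would pass when `approach_angle(...) <= grasp_approach_angle` and its position satisfies `in_boundary(...)`; grasps failing either check are skipped by the planner.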
add_library(${libgrasp_detect} SHARED src/consts.cpp src/ros_params.cpp src/grasp_detector_gpd.cpp) target_compile_definitions(${libgrasp_detect} PRIVATE "GRASP_ROS2_BUILDING_DLL") # if() cannot appear inside a command invocation; build the dependency list first. set(detect_deps "class_loader" "rclcpp" "rclcpp_components" "grasp_msgs" "sensor_msgs" "moveit_msgs" "visualization_msgs") if(BUILD_RECOGNIZE_PICK STREQUAL "ON") list(APPEND detect_deps "people_msgs") endif() ament_target_dependencies(${libgrasp_detect} ${detect_deps}) target_link_libraries(${libgrasp_detect} ${gpg_LIBRARIES} ${gpd_LIBRARIES} ${PCL_LIBRARIES} ) rclcpp_components_register_nodes(${libgrasp_detect} "grasp_ros2::GraspDetectorGPD") set(node_plugins "${node_plugins}grasp_ros2::GraspDetectorGPD;$<TARGET_FILE:${libgrasp_detect}>\n") # grasp plan set(libgrasp_plan "grasp_plan") add_library(${libgrasp_plan} SHARED src/consts.cpp src/ros_params.cpp src/grasp_planner.cpp) target_compile_definitions(${libgrasp_plan} PRIVATE "GRASP_ROS2_BUILDING_DLL") ament_target_dependencies(${libgrasp_plan} "class_loader" "rclcpp" "rclcpp_components" "grasp_msgs" "moveit_msgs" "tf2" "tf2_ros" "tf2_geometry_msgs" "trajectory_msgs") target_link_libraries(${libgrasp_plan} ) rclcpp_components_register_nodes(${libgrasp_plan} "grasp_ros2::GraspPlanner") set(node_plugins "${node_plugins}grasp_ros2::GraspPlanner;$<TARGET_FILE:${libgrasp_plan}>\n") add_executable(${PROJECT_NAME} src/grasp_composition.cpp ) ament_target_dependencies(${PROJECT_NAME} "rclcpp" "builtin_interfaces" "grasp_msgs" "sensor_msgs" "moveit_msgs" "visualization_msgs" "tf2" "tf2_ros" "tf2_geometry_msgs" "trajectory_msgs" "pcl_conversions" ) target_link_libraries(${PROJECT_NAME} ${ament_LIBRARIES} ${gpg_LIBRARIES} ${gpd_LIBRARIES} ${PCL_LIBRARIES} ${libgrasp_detect} ${libgrasp_plan} ) # Install libs install(TARGETS ${libgrasp_detect} ${libgrasp_plan} ARCHIVE DESTINATION lib LIBRARY DESTINATION lib RUNTIME DESTINATION bin) # Install binaries install(TARGETS ${PROJECT_NAME} RUNTIME DESTINATION bin ) install(TARGETS ${PROJECT_NAME} DESTINATION lib/${PROJECT_NAME} ) # Install header files install( DIRECTORY include/ DESTINATION include ) # Flags if(UNIX OR
APPLE) # Linker flags. if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU" OR ${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel") # GCC specific flags. ICC is compatible with them. set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} -z noexecstack -z relro -z now") set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -z noexecstack -z relro -z now") elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang") # In Clang, -z flags are not compatible; they need to be passed to the linker via -Wl. set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} \ -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now") set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} \ -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now") endif() # Compiler flags. if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU") # GCC specific flags. if(CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.9 OR CMAKE_CXX_COMPILER_VERSION VERSION_EQUAL 4.9) set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector-strong") else() set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector") endif() elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang") # Clang is compatible with some of the flags. set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector") elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel") # Same as above, with the exception that ICC compilation crashes with the -fPIE option, even # though it uses the -pie linker option that requires -fPIE during compilation. Checksec # shows that it generates a correct PIE anyway if only -pie is provided. set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fstack-protector") endif() # Generic flags.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIC -fno-operator-names -Wformat -Wformat-security \ -Wall -fopenmp") set( CUDA_PROPAGATE_HOST_FLAGS OFF ) set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} -D_FORTIFY_SOURCE=2") set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -pie") endif() if(BUILD_TESTING) find_package(ament_lint_auto REQUIRED) ament_lint_auto_find_test_dependencies() add_subdirectory(tests) endif() ament_package() ================================================ FILE: grasp_ros2/cfg/grasp_ros2_params.yaml ================================================ GraspDetectorGPD: ros__parameters: cloud_topic: /camera/pointcloud #cloud_topic: /mechmind/color_point_cloud rviz: true device: 0 # 0:CPU, 1:GPU, 2:VPU auto_mode: true plane_remove: true # grasp workspace in camera frames workspace: [-0.21, 0.29, -0.22, 0.15, 0.0, 1.0] # Realsense #workspace: [-0.16, 0.34, -0.26, 0.14, 1.4, 1.8] # Mechmind # gripper geometry parameters in metre # finger_width: the finger thickness # hand_outer_diameter: the maximum robot hand aperture # hand_depth: the hand depth (the finger length) # hand_height: the finger breadth finger_width: 0.005 hand_outer_diameter: 0.100 hand_depth: 0.038 hand_height: 0.020 GraspPlanner: ros__parameters: grasp_score_threshold: 20 grasp_frame_id: "camera_color_optical_frame" # Realsense #grasp_frame_id: "mechmind_camera" # Mechmind grasp_offset: [0.000, 0.000, 0.0] eef_offset: 0.174 eef_yaw_offset: 0.7854 # M_PI/4 finger_joint_names: ["panda_finger_joint1", "panda_finger_joint2"] ================================================ FILE: grasp_ros2/cfg/random_pick.yaml ================================================ GraspDetectorGPD: ros__parameters: #cloud_topic: /camera/pointcloud cloud_topic: /mechmind/color_point_cloud rviz: true device: 0 # 0:CPU, 1:GPU, 2:VPU auto_mode: true plane_remove: true # grasp workspace in camera frames #workspace: [-0.21, 0.29, -0.22, 0.15, 0.0, 1.0] # Realsense workspace: [-0.16, 0.28, -0.26, 0.14, 1.4, 
1.65] # Mechmind # gripper geometry parameters in metre # finger_width: the finger thickness # hand_outer_diameter: the maximum robot hand aperture # hand_depth: the hand depth (the finger length) # hand_height: the finger breadth finger_width: 0.005 hand_outer_diameter: 0.100 hand_depth: 0.038 hand_height: 0.020 num_samples: 200 GraspPlanner: ros__parameters: grasp_score_threshold: 0 grasp_frame_id: "base" grasp_approach: [0.0, 0.0, -1.0] grasp_approach_angle: 0.523 # 1.047=PI/3 # 0.785=PI/4 # 0.523=PI/6 # 0.345=PI/9 # acceptable approaching angle grasp_offset: [0.004, 0.000, 0.0] # grasp boundry in grasp_frame_id grasp_boundry: [-0.2, 0.2, -0.65, -0.30, -0.15, 0.15] eef_offset: 0.162 eef_yaw_offset: -0.7854 # M_PI/4 finger_joint_names: ["panda_finger_joint1", "panda_finger_joint2"] ================================================ FILE: grasp_ros2/cfg/recognize_pick.yaml ================================================ GraspDetectorGPD: ros__parameters: cloud_topic: /camera/pointcloud # cloud_topic: "/camera/aligned_depth_to_color/color/points" object_topic: "/ros2_openvino_toolkit/segmented_obejcts" rviz: true device: 1 # 0:CPU, 1:GPU, 2:VPU auto_mode: false plane_remove: true object_detect: true # grasp workspace in camera frames workspace: [-0.23, 0.23, -0.33, 0.05, 0.0, 1.0] # gripper geometry parameters in metre # finger_width: the finger thickness # hand_outer_diameter: the maximum robot hand aperture # hand_depth: the hand depth (the finger length) # hand_height: the finger breadth finger_width: 0.005 hand_outer_diameter: 0.100 hand_depth: 0.038 hand_height: 0.020 GraspPlanner: ros__parameters: grasp_score_threshold: 1 grasp_frame_id: "base" grasp_approach: [0.0, 0.0, -1.0] # expect approaching in -z axis grasp_approach_angle: 0.7 # 1.047=PI/3 # 0.785=PI/4 # 0.523=PI/6 # 0.345=PI/9 # acceptable approaching angle grasp_offset: [0.006, -0.003, 0.000] # grasp boundry in grasp_frame_id grasp_boundry: [-0.2, 0.2, -0.65, -0.30, -0.15, 0.15] eef_offset: 0.154 
eef_yaw_offset: 0.7854 # M_PI/4 finger_joint_names: ["panda_finger_joint1", "panda_finger_joint2"] ================================================ FILE: grasp_ros2/cfg/test_grasp_ros2.yaml ================================================ GraspDetectorGPD: ros__parameters: cloud_topic: /camera/pointcloud rviz: false device: 0 auto_mode: false GraspPlanner: ros__parameters: grasp_score_threshold: 0 grasp_frame_id: "camera_color_optical_frame" ================================================ FILE: grasp_ros2/include/grasp_library/ros2/consts.hpp ================================================ // Copyright (c) 2019 Intel Corporation. All Rights Reserved // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. #ifndef GRASP_LIBRARY__ROS2__CONSTS_HPP_ #define GRASP_LIBRARY__ROS2__CONSTS_HPP_ #include namespace grasp_ros2 { /** Consts class * * \brief A class containing global constant definitions for the grasp library.
* */ class Consts { public: /** Topic name of "PointCloud2" message published by an RGBD sensor.*/ static const char kTopicPointCloud2[]; /** Topic name of "detected objects" message published by Object Detector.*/ static const char kTopicDetectedObjects[]; /** Topic name of "detected grasps" message published by this Grasp Detector.*/ static const char kTopicDetectedGrasps[]; /** Topic name of "rviz grasps" message published by this Grasp Detector.*/ static const char kTopicVisualGrasps[]; /** Topic name of "tabletop pointcloud" message published by this Grasp Detector.*/ static const char kTopicTabletop[]; }; } // namespace grasp_ros2 #endif // GRASP_LIBRARY__ROS2__CONSTS_HPP_ ================================================ FILE: grasp_ros2/include/grasp_library/ros2/grasp_detector_base.hpp ================================================ // Copyright (c) 2019 Intel Corporation. All Rights Reserved // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. #ifndef GRASP_LIBRARY__ROS2__GRASP_DETECTOR_BASE_HPP_ #define GRASP_LIBRARY__ROS2__GRASP_DETECTOR_BASE_HPP_ #include <grasp_msgs/msg/grasp_config_list.hpp> #include <string> namespace grasp_ros2 { /** GraspCallback class * * \brief Abstract base class for grasp callback. * * A grasp planner that inherits from this class is called back with grasp detection results. */ class GraspCallback { public: /** * \brief Callback for grasp detection results. * * \param msg Pointer to grasp detection results.
*/ virtual void grasp_callback(const grasp_msgs::msg::GraspConfigList::SharedPtr msg) = 0; }; /** GraspDetectorBase class * * \brief A base class for detecting grasp poses from visual input. * * This class defines uniform interface for grasp library, regardless whichever algorithm * is used for grasp detection. */ class GraspDetectorBase { public: /** * \brief Constructor. */ GraspDetectorBase() : object_name_(""), grasp_cb_(nullptr) { } /** * \brief Destructor. */ ~GraspDetectorBase() { } /** * \brief Start grasp detection. * When this function is called, GraspDetector starts processing visual input. * \param name Name of the object for which to detect grasps */ void start(std::string name = "") { started_ = true; object_name_ = name; } /** * \brief Stop grasp detection. * When this function is called, GraspDetector stops processing visual input. */ void stop() { started_ = false; } /** * \brief Register grasp callback function. * * \param cb Callback function to be registered. */ void add_callback(GraspCallback * cb) { grasp_cb_ = cb; } protected: bool started_ = false; std::string object_name_; GraspCallback * grasp_cb_; }; } // namespace grasp_ros2 #endif // GRASP_LIBRARY__ROS2__GRASP_DETECTOR_BASE_HPP_ ================================================ FILE: grasp_ros2/include/grasp_library/ros2/grasp_detector_gpd.hpp ================================================ // Copyright (c) 2018 Intel Corporation. All Rights Reserved // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
// See the License for the specific language governing permissions and // limitations under the License. #ifndef GRASP_LIBRARY__ROS2__GRASP_DETECTOR_GPD_HPP_ #define GRASP_LIBRARY__ROS2__GRASP_DETECTOR_GPD_HPP_ // ROS2 #include <rclcpp/rclcpp.hpp> #include <sensor_msgs/msg/point_cloud2.hpp> #ifdef RECOGNIZE_PICK #include <people_msgs/msg/objects_in_masks.hpp> #endif #include <geometry_msgs/msg/point.hpp> #include <std_msgs/msg/header.hpp> #include <visualization_msgs/msg/marker.hpp> #include <visualization_msgs/msg/marker_array.hpp> // PCL #include <pcl/point_cloud.h> #include <pcl/point_types.h> #include <pcl_conversions/pcl_conversions.h> // eigen #include <Eigen/Dense> // GPG/GPD #include <gpg/cloud_camera.h> #include <gpd/grasp_detector.h> // this project (messages) #include <grasp_msgs/msg/grasp_config.hpp> #include <grasp_msgs/msg/grasp_config_list.hpp> #include <grasp_msgs/msg/samples_msg.hpp> // system #include <memory> #include <mutex> #include <string> #include <thread> #include <vector> #include "grasp_library/ros2/consts.hpp" #include "grasp_library/ros2/grasp_detector_base.hpp" #include "grasp_library/ros2/grasp_planner.hpp" namespace grasp_ros2 { typedef pcl::PointCloud<pcl::PointXYZRGBA> PointCloudRGBA; typedef pcl::PointCloud<pcl::PointNormal> PointCloudPointNormal; /** GraspDetectorGPD class * * \brief A ROS node that can detect grasp poses in a point cloud. * * This class is a ROS node that handles all the ROS topics. * */ class GraspDetectorGPD : public rclcpp::Node, public GraspDetectorBase { public: /** * \brief Constructor. */ explicit GraspDetectorGPD(const rclcpp::NodeOptions & options); /** * \brief Destructor. */ ~GraspDetectorGPD() { delete cloud_camera_; // todo stop and delete threads } private: /** * \brief Run the ROS node. Loops while waiting for incoming ROS messages. */ void onInit(); /** * \brief Detect grasp poses in a point cloud received from a ROS topic. * \return the list of grasp poses */ std::vector<Grasp> detectGraspPosesInTopic(); /** * \brief Callback function for the ROS topic that contains the input point cloud. * \param msg the incoming ROS message */ void cloud_callback(const sensor_msgs::msg::PointCloud2::SharedPtr msg); #ifdef RECOGNIZE_PICK /** * \brief Callback function for the ROS topic that contains the detected and segmented objects * \param msg The detected objects message */ void object_callback(const people_msgs::msg::ObjectsInMasks::SharedPtr msg); #endif /** * \brief Create a ROS message that contains a list of grasp poses from a list of handles.
* \param hands the list of grasps * \return the ROS message that contains the grasp poses */ grasp_msgs::msg::GraspConfigList createGraspListMsg(const std::vector<Grasp> & hands); /** * \brief Convert a GPD Grasp into a grasp message. * \param hand A GPD grasp * \return The Grasp message converted */ grasp_msgs::msg::GraspConfig convertToGraspMsg(const Grasp & hand); /** * \brief Convert GPD Grasps into visual grasp messages. */ visualization_msgs::msg::MarkerArray convertToVisualGraspMsg( const std::vector<Grasp> & hands, double outer_diameter, double hand_depth, double finger_width, double hand_height, const std::string & frame_id); /** * \brief Create finger marker for visual grasp messages */ visualization_msgs::msg::Marker createFingerMarker( const Eigen::Vector3d & center, const Eigen::Matrix3d & frame, double length, double width, double height, int id, const std::string & frame_id); /** * \brief Create hand base marker for visual grasp messages */ visualization_msgs::msg::Marker createHandBaseMarker( const Eigen::Vector3d & start, const Eigen::Vector3d & end, const Eigen::Matrix3d & frame, double length, double height, int id, const std::string & frame_id); /** Converts an Eigen Vector into a Point message. Todo ROS2 eigen_conversions*/ void pointEigenToMsg(const Eigen::Vector3d & e, geometry_msgs::msg::Point & m) { m.x = e(0); m.y = e(1); m.z = e(2); } /** Converts an Eigen Vector into a Vector message.
Todo ROS2 eigen_conversions*/ void vectorEigenToMsg(const Eigen::Vector3d & e, geometry_msgs::msg::Vector3 & m) { m.x = e(0); m.y = e(1); m.z = e(2); } Eigen::Vector3d view_point_; /**< (input) view point of the camera onto the point cloud*/ /** stores point cloud with (optional) camera information and surface normals*/ CloudCamera * cloud_camera_; std_msgs::msg::Header cloud_camera_header_; /**< stores header of the point cloud*/ /** status variables for received (input) messages*/ bool has_cloud_; std::string frame_; /**< point cloud frame*/ bool auto_mode_; /**< grasp detection mode*/ bool plane_remove_; /**< whether to enable plane removal*/ #ifdef RECOGNIZE_PICK /** the latest message on detected objects*/ people_msgs::msg::ObjectsInMasks::SharedPtr object_msg_; #endif std::vector<double> grasp_ws_; rclcpp::callback_group::CallbackGroup::SharedPtr callback_group_subscriber1_; rclcpp::callback_group::CallbackGroup::SharedPtr callback_group_subscriber2_; /** ROS2 subscriber for point cloud messages*/ rclcpp::Subscription<sensor_msgs::msg::PointCloud2>::SharedPtr cloud_sub_; #ifdef RECOGNIZE_PICK /** ROS2 subscriber for object messages*/ rclcpp::Subscription<people_msgs::msg::ObjectsInMasks>::SharedPtr object_sub_; #endif /** ROS2 publisher for grasp list messages*/ rclcpp::Publisher<grasp_msgs::msg::GraspConfigList>::SharedPtr grasps_pub_; /** ROS2 publisher for filtered point clouds*/ rclcpp::Publisher<sensor_msgs::msg::PointCloud2>::SharedPtr filtered_pub_; /** ROS2 publisher for grasps in rviz (visualization)*/ rclcpp::Publisher<visualization_msgs::msg::MarkerArray>::SharedPtr grasps_rviz_pub_; std::shared_ptr<GraspDetector> grasp_detector_; /**< used to run the grasp pose detection*/ GraspDetector::GraspDetectionParameters detection_param_; /**< grasp detector parameters*/ rclcpp::Logger logger_ = rclcpp::get_logger("GraspDetectorGPD"); std::thread * detector_thread_; /**< thread for grasp detection*/ }; } // namespace grasp_ros2 #endif // GRASP_LIBRARY__ROS2__GRASP_DETECTOR_GPD_HPP_ ================================================ FILE: grasp_ros2/include/grasp_library/ros2/grasp_planner.hpp ================================================ //
Copyright (c) 2018 Intel Corporation. All Rights Reserved // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. #ifndef GRASP_LIBRARY__ROS2__GRASP_PLANNER_HPP_ #define GRASP_LIBRARY__ROS2__GRASP_PLANNER_HPP_ #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include "grasp_library/ros2/grasp_detector_base.hpp" namespace grasp_ros2 { /** GraspPlanner class * * \brief A MoveIt grasp planner * * This class provides a ROS service for MoveIt grasp planning. Grasp Planner drives grasp detection * and takes the results from Grasp Detector.
 */
class GraspPlanner : public rclcpp::Node, public GraspCallback
{
public:
  struct GraspPlanningParameters
  {
    /** timeout in seconds for a service request waiting for grasp detection result*/
    int grasp_service_timeout_;
    /** minimum score expected for grasps returned from this service*/
    int grasp_score_threshold_;
    /** frame id expected for grasps returned from this service*/
    std::string grasp_frame_id_;
    /** approach direction in grasp_frame_id_ expected for grasps*/
    tf2::Vector3 grasp_approach_;
    /** maximum angle in radian acceptable between the expected 'approach_' and
     * the real approach returned from this service*/
    double grasp_approach_angle_;
    /** offset [x, y, z] in metres applied to the grasps detected*/
    std::vector<double> grasp_offset_;
    /** boundary cube in grasp_frame_id_ expected for grasps returned from this service*/
    std::vector<double> grasp_boundry_;
    /** offset in metres from the gripper base (finger root) to the parent link of the gripper*/
    double eef_offset;
    /** gripper yaw offset to its parent link, in radian (e.g. 0.0, or M_PI/4)*/
    double eef_yaw_offset;
    /** minimum distance in metres for a grasp to approach and retreat*/
    double grasp_min_distance_;
    /** desired distance in metres for a grasp to approach and retreat*/
    double grasp_desired_distance_;
    /** joint names of gripper fingers*/
    std::vector<std::string> finger_joint_names_;
    /** trajectory points in 'open' status, for joints in the same order as 'finger_joint_names_'*/
    trajectory_msgs::msg::JointTrajectoryPoint finger_points_open_;
    /** trajectory points in 'close' status, for joints in the same order as 'finger_joint_names_'*/
    trajectory_msgs::msg::JointTrajectoryPoint finger_points_close_;
  };

  /**
   * \brief Constructor.
   * \param options Node options for this node.
   * \param grasp_detector Grasp Detector used by this planner.
   */
  explicit GraspPlanner(
    const rclcpp::NodeOptions & options,
    GraspDetectorBase * grasp_detector = nullptr);

  /**
   * \brief Destructor.
   */
  ~GraspPlanner()
  {
    delete tfBuffer_;
  }

  void grasp_callback(const grasp_msgs::msg::GraspConfigList::SharedPtr msg);

  /**
   * \brief Grasp planning service handler.
   * When a grasp service request comes, Grasp Planner tells the Grasp Detector to start grasp
   * detection, waits for grasp callback arrival or till a configurable timeout period, then stops
   * grasp detection, skips grasps with low scores, transforms grasps into the specified frame_id
   * (if TF available), applies the configured offset, skips grasps out of boundary, and returns
   * the results via grasp service response.
   */
  void grasp_service(
    const std::shared_ptr<rmw_request_id_t> request_header,
    const std::shared_ptr<moveit_msgs::srv::GraspPlanning::Request> req,
    const std::shared_ptr<moveit_msgs::srv::GraspPlanning::Response> res);

private:
  /**
   * \brief Transform a grasp from the original frame to the 'grasp_frame_id_' frame.
   * Keep the 'to' grasp identical to the 'from' grasp in case the transform is missing or fails.
   * \param from The grasp to transform.
   * \param to The transformed output.
   * \param header Message header for the frame of the 'from' grasp.
   * \return true if the transformation succeeds, otherwise false.
   */
  bool transform(
    grasp_msgs::msg::GraspConfig & from,
    grasp_msgs::msg::GraspConfig & to,
    const std_msgs::msg::Header & header);

  /**
   * \brief Check if the grasp position is in boundary.
   * \param p Grasp position.
   * \return True if the grasp position is in boundary, otherwise False.
   */
  bool check_boundry(const geometry_msgs::msg::Point & p);

  /**
   * \brief Translate a grasp message to a MoveIt message.
   * 'Grasp.grasp_pose.pose.position' is translated from 'GraspConfig.bottom', which is the
   * position closest to the 'parent_link' of the end-effector.
   * \param grasp Grasp message to be translated.
   * \param header Message header for the frame where the 'grasp' was detected.
   * \return MoveIt message
   */
  moveit_msgs::msg::Grasp toMoveIt(
    grasp_msgs::msg::GraspConfig & grasp,
    const std_msgs::msg::Header & header);

  std::mutex m_;
  std::condition_variable cv_;
  GraspPlanningParameters param_;
  rclcpp::Logger logger_ = rclcpp::get_logger("GraspPlanner");
  /*buffer for grasps to be returned from this service*/
  std::vector<moveit_msgs::msg::Grasp> moveit_grasps_;
  rclcpp::callback_group::CallbackGroup::SharedPtr callback_group_subscriber3_;
  rclcpp::Service<moveit_msgs::srv::GraspPlanning>::SharedPtr grasp_srv_;  /*grasp service*/
  tf2_ros::Buffer * tfBuffer_;  /*buffer for transformation listener*/
  std::shared_ptr<tf2_ros::TransformListener> tfListener_;  /*transform listener*/
  tf2_ros::StaticTransformBroadcaster tfBroadcaster_;  /*grasp pose transformation broadcaster*/
  GraspDetectorBase * grasp_detector_;  /*grasp detector node*/
};

}  // namespace grasp_ros2
#endif  // GRASP_LIBRARY__ROS2__GRASP_PLANNER_HPP_

================================================
FILE: grasp_ros2/include/grasp_library/ros2/ros_params.hpp
================================================

// Copyright (c) 2018 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#ifndef GRASP_LIBRARY__ROS2__ROS_PARAMS_HPP_
#define GRASP_LIBRARY__ROS2__ROS_PARAMS_HPP_

// ROS2 core
#include
// ROS2 projects
#include
#include "grasp_library/ros2/grasp_planner.hpp"

namespace grasp_ros2
{

/** ROSParameters class
 *
 * \brief A class to bridge parameters passed from ROS.
 */
class ROSParameters
{
public:
  static void getDetectionParams(
    rclcpp::Node * node,
    GraspDetector::GraspDetectionParameters & param);

  static void getPlanningParams(
    rclcpp::Node * node,
    GraspPlanner::GraspPlanningParameters & param);
};

}  // namespace grasp_ros2
#endif  // GRASP_LIBRARY__ROS2__ROS_PARAMS_HPP_

================================================
FILE: grasp_ros2/package.xml
================================================

<?xml version="1.0"?>
<package format="2">
  <name>grasp_ros2</name>
  <version>0.5.0</version>
  <description>ROS2 grasp library as MoveIt plug-in</description>
  <maintainer>Sharron LIU</maintainer>
  <license>Apache License 2.0</license>

  <buildtool_depend>ament_cmake</buildtool_depend>
  <build_depend>rosidl_default_generators</build_depend>
  <build_depend>builtin_interfaces</build_depend>
  <build_depend>rclcpp</build_depend>
  <build_depend>rclcpp_components</build_depend>
  <build_depend>std_msgs</build_depend>
  <build_depend>sensor_msgs</build_depend>
  <build_depend>grasp_msgs</build_depend>
  <build_depend>moveit_msgs</build_depend>
  <build_depend>people_msgs</build_depend>
  <build_depend>visualization_msgs</build_depend>
  <build_depend>pcl_conversions</build_depend>
  <build_depend>tf2</build_depend>
  <build_depend>tf2_ros</build_depend>
  <build_depend>tf2_geometry_msgs</build_depend>
  <build_depend>trajectory_msgs</build_depend>

  <exec_depend>rosidl_default_runtime</exec_depend>
  <exec_depend>builtin_interfaces</exec_depend>
  <exec_depend>rclcpp</exec_depend>
  <exec_depend>rclcpp_components</exec_depend>
  <exec_depend>std_msgs</exec_depend>
  <exec_depend>sensor_msgs</exec_depend>
  <exec_depend>grasp_msgs</exec_depend>
  <exec_depend>moveit_msgs</exec_depend>
  <exec_depend>people_msgs</exec_depend>
  <exec_depend>visualization_msgs</exec_depend>
  <exec_depend>pcl_conversions</exec_depend>
  <exec_depend>tf2</exec_depend>
  <exec_depend>tf2_ros</exec_depend>
  <exec_depend>tf2_geometry_msgs</exec_depend>
  <exec_depend>trajectory_msgs</exec_depend>

  <test_depend>ament_lint_auto</test_depend>
  <test_depend>ament_lint_common</test_depend>

  <export>
    <build_type>ament_cmake</build_type>
  </export>
</package>

================================================
FILE: grasp_ros2/src/consts.cpp
================================================

// Copyright (c) 2019 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include "grasp_library/ros2/consts.hpp"

namespace grasp_ros2
{

const char Consts::kTopicPointCloud2[] = "/camera/pointcloud";
const char Consts::kTopicDetectedObjects[] = "/ros2_openvino_toolkit/segmented_obejcts";
const char Consts::kTopicDetectedGrasps[] = "/grasp_library/clustered_grasps";
const char Consts::kTopicVisualGrasps[] = "/grasp_library/grasps_rviz";
const char Consts::kTopicTabletop[] = "/grasp_library/tabletop_points";

}  // namespace grasp_ros2

================================================
FILE: grasp_ros2/src/grasp_composition.cpp
================================================

// Copyright (c) 2019 Intel Corporation
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include #include
#include "grasp_library/ros2/grasp_detector_gpd.hpp"
#include "grasp_library/ros2/grasp_planner.hpp"

using GraspDetectorGPD = grasp_ros2::GraspDetectorGPD;
using GraspDetectorBase = grasp_ros2::GraspDetectorBase;
using GraspPlanner = grasp_ros2::GraspPlanner;

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  rclcpp::executors::MultiThreadedExecutor exec;
  auto detect_node = std::make_shared<GraspDetectorGPD>(
    rclcpp::NodeOptions().automatically_declare_parameters_from_overrides(true));
  exec.add_node(detect_node);
  GraspDetectorBase * grasp_detector = dynamic_cast<GraspDetectorBase *>(detect_node.get());
  auto plan_node = std::make_shared<GraspPlanner>(
    rclcpp::NodeOptions().automatically_declare_parameters_from_overrides(true),
    grasp_detector);
  exec.add_node(plan_node);
  exec.spin();
  detect_node = nullptr;
  plan_node = nullptr;
  rclcpp::shutdown();
  return 0;
}

================================================
FILE: grasp_ros2/src/grasp_detector_gpd.cpp
================================================

// Copyright (c) 2019 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include #include #include #include #include #include #include #include #include
#include "grasp_library/ros2/grasp_detector_gpd.hpp"
#include "grasp_library/ros2/ros_params.hpp"

namespace grasp_ros2
{

GraspDetectorGPD::GraspDetectorGPD(const rclcpp::NodeOptions & options)
: Node("GraspDetectorGPD", options), GraspDetectorBase(),
  cloud_camera_(NULL), has_cloud_(false), frame_(""),
#ifdef RECOGNIZE_PICK
  object_msg_(nullptr), object_sub_(nullptr),
#endif
  filtered_pub_(nullptr), grasps_rviz_pub_(nullptr)
{
  std::vector<double> camera_position;
  this->get_parameter_or("camera_position", camera_position,
    std::vector<double>(std::initializer_list<double>({0, 0, 0})));
  view_point_ << camera_position[0], camera_position[1], camera_position[2];
  this->get_parameter_or("auto_mode", auto_mode_, true);
  std::string cloud_topic, grasp_topic, rviz_topic, tabletop_topic, object_topic;
  this->get_parameter_or("cloud_topic", cloud_topic, std::string(Consts::kTopicPointCloud2));
  bool rviz, object_detect;
  this->get_parameter_or("rviz", rviz, false);
  this->get_parameter_or("plane_remove", plane_remove_, false);
  this->get_parameter_or("object_detect", object_detect, false);

  callback_group_subscriber1_ = this->create_callback_group(
    rclcpp::callback_group::CallbackGroupType::MutuallyExclusive);
  auto sub1_opt = rclcpp::SubscriptionOptions();
  sub1_opt.callback_group = callback_group_subscriber1_;
  auto callback = [this](const sensor_msgs::msg::PointCloud2::SharedPtr msg) -> void {
      this->cloud_callback(msg);
    };
  cloud_sub_ = this->create_subscription<sensor_msgs::msg::PointCloud2>(
    cloud_topic, rclcpp::QoS(10), callback, sub1_opt);
  grasps_pub_ = this->create_publisher<grasp_msgs::msg::GraspConfigList>(
    Consts::kTopicDetectedGrasps, 10);
  if (rviz) {
    grasps_rviz_pub_ = this->create_publisher<visualization_msgs::msg::MarkerArray>(
      Consts::kTopicVisualGrasps, 10);
    filtered_pub_ = this->create_publisher<sensor_msgs::msg::PointCloud2>(
      Consts::kTopicTabletop, 10);
  }
#ifdef RECOGNIZE_PICK
  if (object_detect) {
    callback_group_subscriber2_ = this->create_callback_group(
      rclcpp::callback_group::CallbackGroupType::MutuallyExclusive);
    auto sub2_opt =
rclcpp::SubscriptionOptions();
    sub2_opt.callback_group = callback_group_subscriber2_;
    this->get_parameter_or("object_topic", object_topic,
      std::string(Consts::kTopicDetectedObjects));
    auto callback = [this](const people_msgs::msg::ObjectsInMasks::SharedPtr msg) -> void {
        this->object_callback(msg);
      };
    object_sub_ = this->create_subscription<people_msgs::msg::ObjectsInMasks>(
      object_topic, rclcpp::QoS(10), callback, sub2_opt);
  }
#endif
  // GraspDetector::GraspDetectionParameters detection_param;
  ROSParameters::getDetectionParams(this, detection_param_);
  grasp_detector_ = std::make_shared<GraspDetector>(detection_param_);
  RCLCPP_INFO(logger_, "ROS2 Grasp Library node up...");
  detector_thread_ = new std::thread(&GraspDetectorGPD::onInit, this);
  detector_thread_->detach();
}

void GraspDetectorGPD::onInit()
{
  rclcpp::Rate rate(100);
  RCLCPP_INFO(logger_, "Waiting for point cloud to arrive ...");
  while (rclcpp::ok()) {
    if (has_cloud_) {
      // detect grasps in point cloud
      std::vector<Grasp> grasps = detectGraspPosesInTopic();
      // visualize grasps in rviz
      if (grasps_rviz_pub_) {
        const HandSearch::Parameters & params = grasp_detector_->getHandSearchParameters();
        grasps_rviz_pub_->publish(convertToVisualGraspMsg(grasps, params.hand_outer_diameter_,
          params.hand_depth_, params.finger_width_, params.hand_height_, frame_));
      }
      // reset the system
      has_cloud_ = false;
      RCLCPP_INFO(logger_, "Waiting for point cloud to arrive ...");
    }
    // rclcpp::spin(shared_from_this());
    rate.sleep();
  }
}

std::vector<Grasp> GraspDetectorGPD::detectGraspPosesInTopic()
{
  // detect grasp poses
  std::vector<Grasp> grasps;
  {
    // preprocess the point cloud
    grasp_detector_->preprocessPointCloud(*cloud_camera_);
    // detect grasps in the point cloud
    grasps = grasp_detector_->detectGrasps(*cloud_camera_);
  }
  // Publish the selected grasps.
grasp_msgs::msg::GraspConfigList selected_grasps_msg = createGraspListMsg(grasps);
  if (grasp_cb_) {
    grasp_cb_->grasp_callback(
      std::make_shared<grasp_msgs::msg::GraspConfigList>(selected_grasps_msg));
  }
  grasps_pub_->publish(selected_grasps_msg);
  RCLCPP_INFO(logger_, "Published %zu highest-scoring grasps.", selected_grasps_msg.grasps.size());
  return grasps;
}

void GraspDetectorGPD::cloud_callback(const sensor_msgs::msg::PointCloud2::SharedPtr msg)
{
  if (!auto_mode_ && !started_) {return;}
#ifdef RECOGNIZE_PICK
  people_msgs::msg::ObjectsInMasks::SharedPtr object_msg;
  if (object_sub_) {
    if (object_name_.empty()) {
      RCLCPP_INFO(logger_, "Waiting for object name...");
      return;
    }
    object_msg = object_msg_;
    object_msg_ = nullptr;
    if (nullptr == object_msg || object_msg->objects_vector.empty()) {
      RCLCPP_INFO(logger_, "Waiting for object callback...");
      return;
    }
  }
#endif
  RCLCPP_DEBUG(logger_, "PCD callback...");
  if (!has_cloud_) {
    delete cloud_camera_;
    cloud_camera_ = NULL;
    Eigen::Matrix3Xd view_points(3, 1);
    view_points.col(0) = view_point_;
    if (msg->fields.size() == 6 && msg->fields[3].name == "normal_x" &&
      msg->fields[4].name == "normal_y" && msg->fields[5].name == "normal_z")
    {
      PointCloudPointNormal::Ptr cloud(new PointCloudPointNormal);
      pcl::fromROSMsg(*msg, *cloud);
      cloud_camera_ = new CloudCamera(cloud, 0, view_points);
      cloud_camera_header_ = msg->header;
    } else {
      PointCloudRGBA::Ptr cloud(new PointCloudRGBA);
      pcl::fromROSMsg(*msg, *cloud);
      // filter workspace
      for (uint32_t i = 0; i < cloud->size(); i++) {
        if (cloud->points[i].x > detection_param_.workspace_[0] &&
          cloud->points[i].x < detection_param_.workspace_[1] &&
          cloud->points[i].y > detection_param_.workspace_[2] &&
          cloud->points[i].y < detection_param_.workspace_[3] &&
          cloud->points[i].z > detection_param_.workspace_[4] &&
          cloud->points[i].z < detection_param_.workspace_[5])
        {
          continue;
        } else {
          cloud->points[i].x = std::numeric_limits<float>::quiet_NaN();
          cloud->points[i].y = std::numeric_limits<float>::quiet_NaN();
          cloud->points[i].z =
std::numeric_limits<float>::quiet_NaN();
        }
      }
      // remove table plane
      if (plane_remove_) {
        pcl::ModelCoefficients::Ptr coefficients(new pcl::ModelCoefficients);
        pcl::PointIndices::Ptr inliers(new pcl::PointIndices);
        pcl::SACSegmentation<pcl::PointXYZRGBA> seg;
        seg.setOptimizeCoefficients(true);
        seg.setModelType(pcl::SACMODEL_PLANE);
        seg.setMethodType(pcl::SAC_RANSAC);
        seg.setDistanceThreshold(0.025);
        seg.setInputCloud(cloud);
        seg.segment(*inliers, *coefficients);
        for (size_t i = 0; i < inliers->indices.size(); ++i) {
          cloud->points[inliers->indices[i]].x = std::numeric_limits<float>::quiet_NaN();
          cloud->points[inliers->indices[i]].y = std::numeric_limits<float>::quiet_NaN();
          cloud->points[inliers->indices[i]].z = std::numeric_limits<float>::quiet_NaN();
        }
      }
#ifdef RECOGNIZE_PICK
      // filter object location
      if (object_sub_) {
        bool found = false;
        for (auto obj : object_msg->objects_vector) {
          if (0 == obj.object_name.compare(object_name_)) {
            RCLCPP_INFO(logger_, "obj name %s prob %f roi [%u %u %u %u] %u %u",
              obj.object_name.c_str(), obj.probability, obj.roi.x_offset, obj.roi.y_offset,
              obj.roi.width, obj.roi.height, msg->width, msg->height);
            std::vector<int> indices;
            for (size_t i = 0; i < obj.roi.height; i++) {  // rows
              int idx = (i + obj.roi.y_offset) * msg->width + obj.roi.x_offset;
              for (size_t j = 0; j < obj.roi.width; j++) {  // columns
                // todo use mask_array from the object msg
                if (!isnan(cloud->points[idx + j].x) && !isnan(cloud->points[idx + j].y) &&
                  !isnan(cloud->points[idx + j].z))
                {
                  indices.push_back(idx + j);
                }
              }
            }
            pcl::ExtractIndices<pcl::PointXYZRGBA> filter;
            filter.setInputCloud(cloud);
            filter.setIndices(boost::make_shared<std::vector<int>>(indices));
            filter.filter(*cloud);
            Eigen::Matrix3Xf xyz = cloud->getMatrixXfMap(
              3, sizeof(pcl::PointXYZRGBA) / sizeof(float), 0);
            RCLCPP_INFO(logger_, "*************** %f %f, %f %f, %f %f",
              xyz.row(0).minCoeff(), xyz.row(0).maxCoeff(), xyz.row(1).minCoeff(),
              xyz.row(1).maxCoeff(), xyz.row(2).minCoeff(), xyz.row(2).maxCoeff());
            grasp_ws_ = {xyz.row(0).minCoeff(), xyz.row(0).maxCoeff(), xyz.row(1).minCoeff(),
              xyz.row(1).maxCoeff(), xyz.row(2).minCoeff(), xyz.row(2).maxCoeff()};
            found = true;
            break;
          }
        }
        if (!found) {return;}
      }
#endif
      if (filtered_pub_) {
        sensor_msgs::msg::PointCloud2 msg2;
        pcl::toROSMsg(*cloud, msg2);
        // workaround rviz rgba
        msg2.fields[3].name = "rgb";
        msg2.fields[3].datatype = 7;
        filtered_pub_->publish(msg2);
      }
      cloud_camera_ = new CloudCamera(cloud, 0, view_points);
      cloud_camera_header_ = msg->header;
    }
    RCLCPP_INFO(logger_, "Received cloud with %zu points and normals.",
      cloud_camera_->getCloudProcessed()->size());
    has_cloud_ = true;
    frame_ = msg->header.frame_id;
  }
}

#ifdef RECOGNIZE_PICK
void GraspDetectorGPD::object_callback(const people_msgs::msg::ObjectsInMasks::SharedPtr msg)
{
  RCLCPP_INFO(logger_, "Object callback *************************[%zu]",
    msg->objects_vector.size());
  for (auto obj : msg->objects_vector) {
    RCLCPP_INFO(logger_, "obj name %s prob %f roi[%u %u %u %u]",
      obj.object_name.c_str(), obj.probability,
      obj.roi.x_offset, obj.roi.y_offset, obj.roi.width, obj.roi.height);
    if (0 == obj.object_name.compare("orange")) {
      for (size_t i = 0; i < obj.roi.height; i++) {  // rows
        // std::cout << "\n";
        for (size_t j = 0; j < obj.roi.width; j++) {  // columns
          // int a = obj.mask_array[i * obj.roi.width + j] * 10;
          // if (a > 5) std::cout << a; else std::cout << "*";
        }
      }
    }
  }
  if (msg->objects_vector.size() > 0) {
    object_msg_ = msg;
  }
}
#endif

grasp_msgs::msg::GraspConfigList GraspDetectorGPD::createGraspListMsg(
  const std::vector<Grasp> & hands)
{
  grasp_msgs::msg::GraspConfigList msg;
  for (uint32_t i = 0; i < hands.size(); i++) {
    msg.grasps.push_back(convertToGraspMsg(hands[i]));
  }
  msg.header = cloud_camera_header_;
  msg.object_name = object_name_;
  return msg;
}

grasp_msgs::msg::GraspConfig GraspDetectorGPD::convertToGraspMsg(const Grasp & hand)
{
  grasp_msgs::msg::GraspConfig msg;
  pointEigenToMsg(hand.getGraspBottom(), msg.bottom);
  pointEigenToMsg(hand.getGraspTop(), msg.top);
  pointEigenToMsg(hand.getGraspSurface(), msg.surface);
  vectorEigenToMsg(hand.getApproach(), msg.approach);
  vectorEigenToMsg(hand.getBinormal(), msg.binormal);
  vectorEigenToMsg(hand.getAxis(), msg.axis);
  msg.width.data = hand.getGraspWidth();
  msg.score.data = hand.getScore();
  pointEigenToMsg(hand.getSample(), msg.sample);
  return msg;
}

visualization_msgs::msg::MarkerArray GraspDetectorGPD::convertToVisualGraspMsg(
  const std::vector<Grasp> & hands, double outer_diameter, double hand_depth,
  double finger_width, double hand_height, const std::string & frame_id)
{
  double width = outer_diameter;
  double hw = 0.5 * width;
  visualization_msgs::msg::MarkerArray marker_array;
  visualization_msgs::msg::Marker left_finger, right_finger, base, approach;
  Eigen::Vector3d left_bottom, right_bottom, left_top, right_top, left_center, right_center,
    approach_center, base_center;
  for (uint32_t i = 0; i < hands.size(); i++) {
    left_bottom = hands[i].getGraspBottom() - (hw - 0.5 * finger_width) * hands[i].getBinormal();
    right_bottom = hands[i].getGraspBottom() + (hw - 0.5 * finger_width) * hands[i].getBinormal();
    left_top = left_bottom + hand_depth * hands[i].getApproach();
    right_top = right_bottom + hand_depth * hands[i].getApproach();
    left_center = left_bottom + 0.5 * (left_top - left_bottom);
    right_center = right_bottom + 0.5 * (right_top - right_bottom);
    base_center = left_bottom + 0.5 * (right_bottom - left_bottom) - 0.01 * hands[i].getApproach();
    approach_center = base_center - 0.04 * hands[i].getApproach();
    base = createHandBaseMarker(left_bottom, right_bottom, hands[i].getFrame(), 0.02,
      hand_height, i, frame_id);
    left_finger = createFingerMarker(left_center, hands[i].getFrame(), hand_depth,
      finger_width, hand_height, i * 3, frame_id);
    right_finger = createFingerMarker(right_center, hands[i].getFrame(), hand_depth,
      finger_width, hand_height, i * 3 + 1, frame_id);
    approach = createFingerMarker(approach_center, hands[i].getFrame(), 0.08,
      finger_width, hand_height, i * 3 + 2, frame_id);
    marker_array.markers.push_back(left_finger);
    marker_array.markers.push_back(right_finger);
    marker_array.markers.push_back(approach);
    marker_array.markers.push_back(base);
  }
  return marker_array;
}

visualization_msgs::msg::Marker GraspDetectorGPD::createFingerMarker(
  const Eigen::Vector3d & center, const Eigen::Matrix3d & frame, double length, double width,
  double height, int id, const std::string & frame_id)
{
  visualization_msgs::msg::Marker marker;
  marker.header.frame_id = frame_id;
  marker.header.stamp = rclcpp::Clock(RCL_ROS_TIME).now();
  marker.ns = "finger";
  marker.id = id;
  marker.type = visualization_msgs::msg::Marker::CUBE;
  marker.action = visualization_msgs::msg::Marker::ADD;
  marker.pose.position.x = center(0);
  marker.pose.position.y = center(1);
  marker.pose.position.z = center(2);
  marker.lifetime = rclcpp::Duration(20.0, 0);
  // use orientation of hand frame
  Eigen::Quaterniond quat(frame);
  marker.pose.orientation.x = quat.x();
  marker.pose.orientation.y = quat.y();
  marker.pose.orientation.z = quat.z();
  marker.pose.orientation.w = quat.w();
  // these scales are relative to the hand frame (unit: meters)
  marker.scale.x = length;  // forward direction
  marker.scale.y = width;  // hand closing direction
  marker.scale.z = height;  // hand vertical direction
  marker.color.a = 0.5;
  marker.color.r = 0.0;
  marker.color.g = 0.0;
  marker.color.b = 0.5;
  return marker;
}

visualization_msgs::msg::Marker GraspDetectorGPD::createHandBaseMarker(
  const Eigen::Vector3d & start, const Eigen::Vector3d & end, const Eigen::Matrix3d & frame,
  double length, double height, int id, const std::string & frame_id)
{
  Eigen::Vector3d center = start + 0.5 * (end - start);
  visualization_msgs::msg::Marker marker;
  marker.header.frame_id = frame_id;
  marker.header.stamp = rclcpp::Clock(RCL_ROS_TIME).now();
  marker.ns = "hand_base";
  marker.id = id;
  marker.type = visualization_msgs::msg::Marker::CUBE;
  marker.action = visualization_msgs::msg::Marker::ADD;
  marker.pose.position.x = center(0);
  marker.pose.position.y = center(1);
  marker.pose.position.z =
center(2);
  marker.lifetime = rclcpp::Duration(20.0, 0);
  // use orientation of hand frame
  Eigen::Quaterniond quat(frame);
  marker.pose.orientation.x = quat.x();
  marker.pose.orientation.y = quat.y();
  marker.pose.orientation.z = quat.z();
  marker.pose.orientation.w = quat.w();
  // these scales are relative to the hand frame (unit: meters)
  marker.scale.x = length;  // forward direction
  marker.scale.y = (end - start).norm();  // hand closing direction
  marker.scale.z = height;  // hand vertical direction
  marker.color.a = 0.5;
  marker.color.r = 0.0;
  marker.color.g = 0.0;
  marker.color.b = 1.0;
  return marker;
}

}  // namespace grasp_ros2

#include "rclcpp_components/register_node_macro.hpp"
RCLCPP_COMPONENTS_REGISTER_NODE(grasp_ros2::GraspDetectorGPD)

================================================
FILE: grasp_ros2/src/grasp_planner.cpp
================================================

// Copyright (c) 2019 Intel Corporation
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include #include #include #include #include #include #include #include #include #include
#include "grasp_library/ros2/consts.hpp"
#include "grasp_library/ros2/grasp_planner.hpp"
#include "grasp_library/ros2/ros_params.hpp"

namespace grasp_ros2
{

using GraspPlanning = moveit_msgs::srv::GraspPlanning;

GraspPlanner::GraspPlanner(const rclcpp::NodeOptions & options, GraspDetectorBase * grasp_detector)
: Node("GraspPlanner", options), GraspCallback(), tfBroadcaster_(this),
  grasp_detector_(grasp_detector)
{
  ROSParameters::getPlanningParams(this, param_);
  callback_group_subscriber3_ = this->create_callback_group(
    rclcpp::callback_group::CallbackGroupType::MutuallyExclusive);
  auto service = [this](const std::shared_ptr<rmw_request_id_t> request_header,
      const std::shared_ptr<GraspPlanning::Request> req,
      const std::shared_ptr<GraspPlanning::Response> res) -> void {
      this->grasp_service(request_header, req, res);
    };
  grasp_srv_ = this->create_service<GraspPlanning>("plan_grasps", service,
    rmw_qos_profile_default, callback_group_subscriber3_);
  grasp_detector_->add_callback(this);
  tfBuffer_ = new tf2_ros::Buffer(std::make_shared<rclcpp::Clock>(RCL_ROS_TIME));
  tfListener_ = std::make_shared<tf2_ros::TransformListener>(*tfBuffer_);
  RCLCPP_INFO(logger_, "ROS2 Grasp Planning Service up...");
}

void GraspPlanner::grasp_callback(const grasp_msgs::msg::GraspConfigList::SharedPtr msg)
{
  RCLCPP_INFO(logger_, "Received grasp callback");
  rclcpp::Time rclcpp_time =
    std::make_shared<rclcpp::Clock>(RCL_ROS_TIME)->now() + rclcpp::Duration(6, 0);
  static bool tf_needed = (param_.grasp_frame_id_ != msg->header.frame_id);
  RCLCPP_INFO(logger_, "tf_needed %d", tf_needed);
  std_msgs::msg::Header header;
  header.frame_id = tf_needed ?
    param_.grasp_frame_id_ : msg->header.frame_id;
  header.stamp = msg->header.stamp;
  grasp_msgs::msg::GraspConfig to_grasp;
  for (auto from_grasp : msg->grasps) {
    // skip low score grasp
    if (from_grasp.score.data < param_.grasp_score_threshold_) {
      RCLCPP_INFO(logger_, "skip low score grasps %f", from_grasp.score.data);
      continue;
    }
    // transform grasp to grasp_frame_id
    if (tf_needed) {
      if (!transform(from_grasp, to_grasp, msg->header)) {
        // skip transformation failure
        continue;
      }
    }
    if (param_.grasp_approach_angle_ != M_PI) {
      // skip unacceptable approach
      tf2::Vector3 approach(to_grasp.approach.x, to_grasp.approach.y, to_grasp.approach.z);
      double ang = tf2::tf2Angle(param_.grasp_approach_, approach);
      if (std::isnan(ang) || ang < -param_.grasp_approach_angle_ ||
        ang > param_.grasp_approach_angle_)
      {
        RCLCPP_INFO(logger_, "skip unacceptable approach");
        continue;
      }
    }
    // apply grasp offset
    to_grasp.bottom.x += param_.grasp_offset_[0];
    to_grasp.bottom.y += param_.grasp_offset_[1];
    to_grasp.bottom.z += param_.grasp_offset_[2];
    // skip out of boundary grasps
    if (!tf_needed || check_boundry(to_grasp.bottom)) {
      // translate into moveit grasp
      moveit_msgs::msg::Grasp moveit_msg = toMoveIt(to_grasp, header);
      std::unique_lock<std::mutex> lock(m_);
      moveit_grasps_.push_back(moveit_msg);
    }
  }
}

bool GraspPlanner::transform(
  grasp_msgs::msg::GraspConfig & from, grasp_msgs::msg::GraspConfig & to,
  const std_msgs::msg::Header & header)
{
  geometry_msgs::msg::PointStamped from_top, to_top, from_surface, to_surface,
    from_bottom, to_bottom;
  geometry_msgs::msg::Vector3Stamped from_approach, to_approach, from_binormal, to_binormal,
    from_axis, to_axis;
  to = from;
  from_top.point = from.top;
  from_top.header = header;
  from_surface.point = from.surface;
  from_surface.header = header;
  from_bottom.point = from.bottom;
  from_bottom.header = header;
  from_approach.vector = from.approach;
  from_approach.header = header;
  from_binormal.vector = from.binormal;
  from_binormal.header = header;
  from_axis.vector = from.axis;
  from_axis.header = header;
  while (rclcpp::ok()) {
    try {
      tfBuffer_->transform(from_top, to_top, param_.grasp_frame_id_);
      tfBuffer_->transform(from_surface, to_surface, param_.grasp_frame_id_);
      tfBuffer_->transform(from_bottom, to_bottom, param_.grasp_frame_id_);
      tfBuffer_->transform(from_approach, to_approach, param_.grasp_frame_id_);
      tfBuffer_->transform(from_binormal, to_binormal, param_.grasp_frame_id_);
      tfBuffer_->transform(from_axis, to_axis, param_.grasp_frame_id_);
    } catch (tf2::TransformException & ex) {
      RCLCPP_WARN(logger_, "transform exception");
      rclcpp::Rate(1).sleep();
      continue;
    }
    break;
  }
  to.top = to_top.point;
  to.surface = to_surface.point;
  to.bottom = to_bottom.point;
  to.approach = to_approach.vector;
  to.binormal = to_binormal.vector;
  to.axis = to_axis.vector;
  return true;
}

bool GraspPlanner::check_boundry(const geometry_msgs::msg::Point & p)
{
  RCLCPP_INFO(logger_, "point [%f %f %f]", p.x, p.y, p.z);
  return p.x >= param_.grasp_boundry_[0] && p.x <= param_.grasp_boundry_[1] &&
         p.y >= param_.grasp_boundry_[2] && p.y <= param_.grasp_boundry_[3] &&
         p.z >= param_.grasp_boundry_[4] && p.z <= param_.grasp_boundry_[5];
}

moveit_msgs::msg::Grasp GraspPlanner::toMoveIt(
  grasp_msgs::msg::GraspConfig & grasp, const std_msgs::msg::Header & header)
{
  moveit_msgs::msg::Grasp msg;
  msg.grasp_pose.header = header;
  msg.grasp_quality = grasp.score.data;
  double offset = param_.eef_offset;
  // set grasp position, translation from hand-base to the parent-link of EEF
  msg.grasp_pose.pose.position.x = grasp.bottom.x - grasp.approach.x * offset;
  msg.grasp_pose.pose.position.y = grasp.bottom.y - grasp.approach.y * offset;
  msg.grasp_pose.pose.position.z = grasp.bottom.z - grasp.approach.z * offset;
  // rotation matrix https://github.com/atenpas/gpd/blob/master/tutorials/hand_frame.png
  tf2::Matrix3x3 r(
    grasp.binormal.x, grasp.axis.x, grasp.approach.x,
    grasp.binormal.y, grasp.axis.y, grasp.approach.y,
    grasp.binormal.z, grasp.axis.z, grasp.approach.z);
  tf2::Quaternion quat;
  r.getRotation(quat);
  // EEF yaw-offset to its parent-link (last link of arm)
  quat *= tf2::Quaternion(tf2::Vector3(0, 0, 1), param_.eef_yaw_offset);
  quat.normalize();
  // set grasp orientation
  msg.grasp_pose.pose.orientation = tf2::toMsg(quat);
  RCLCPP_INFO(logger_, "==============offset is %f quat [%f %f %f %f]", offset,
    msg.grasp_pose.pose.orientation.x, msg.grasp_pose.pose.orientation.y,
    msg.grasp_pose.pose.orientation.z, msg.grasp_pose.pose.orientation.w);
  // set pre-grasp approach
  msg.pre_grasp_approach.direction.header = header;
  msg.pre_grasp_approach.direction.vector = grasp.approach;
  msg.pre_grasp_approach.min_distance = param_.grasp_min_distance_;
  msg.pre_grasp_approach.desired_distance = param_.grasp_desired_distance_;
  // set post-grasp retreat
  msg.post_grasp_retreat.direction.header = header;
  msg.post_grasp_retreat.direction.vector.x = -grasp.approach.x;
  msg.post_grasp_retreat.direction.vector.y = -grasp.approach.y;
  msg.post_grasp_retreat.direction.vector.z = -grasp.approach.z;
  msg.post_grasp_retreat.min_distance = param_.grasp_min_distance_;
  msg.post_grasp_retreat.desired_distance = param_.grasp_desired_distance_;
  // set pre-grasp posture
  msg.pre_grasp_posture.joint_names = param_.finger_joint_names_;
  msg.pre_grasp_posture.points.push_back(param_.finger_points_open_);
  // set grasp posture
  msg.grasp_posture.joint_names = param_.finger_joint_names_;
  msg.grasp_posture.points.push_back(param_.finger_points_close_);
  return msg;
}

void GraspPlanner::grasp_service(
  const std::shared_ptr<rmw_request_id_t> request_header,
  const std::shared_ptr<GraspPlanning::Request> req,
  const std::shared_ptr<GraspPlanning::Response> res)
{
  (void)request_header;
  (void)req;
  RCLCPP_INFO(logger_, "Received Grasp Planning request");
  {
    std::unique_lock<std::mutex> lock(m_);
    moveit_grasps_.clear();
    grasp_detector_->start(req->target.id);
  }
  // blocking till grasps found
  while (moveit_grasps_.empty()) {
    rclcpp::Rate(20).sleep();
  }
  grasp_detector_->stop();
  res->grasps = moveit_grasps_;
  if (res->grasps.empty()) {
    RCLCPP_INFO(logger_, "No expected grasp found.");
    res->error_code.val = moveit_msgs::msg::MoveItErrorCodes::FAILURE;
  } else {
    RCLCPP_INFO(logger_, "%ld grasps found.", res->grasps.size());
    res->error_code.val = moveit_msgs::msg::MoveItErrorCodes::SUCCESS;
  }
}

}  // namespace grasp_ros2

#include "rclcpp_components/register_node_macro.hpp"
RCLCPP_COMPONENTS_REGISTER_NODE(grasp_ros2::GraspPlanner)

================================================
FILE: grasp_ros2/src/ros_params.cpp
================================================
// Copyright (c) 2018 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#include <string>
#include <vector>

#include "grasp_library/ros2/ros_params.hpp"

namespace grasp_ros2
{

void ROSParameters::getDetectionParams(
  rclcpp::Node * node, GraspDetector::GraspDetectionParameters & param)
{
  // Read hand geometry parameters.
  node->get_parameter_or("finger_width", param.hand_search_params.finger_width_, 0.005);
  node->get_parameter_or("hand_outer_diameter",
    param.hand_search_params.hand_outer_diameter_, 0.12);
  node->get_parameter_or("hand_depth", param.hand_search_params.hand_depth_, 0.06);
  node->get_parameter_or("hand_height", param.hand_search_params.hand_height_, 0.02);
  node->get_parameter_or("init_bite", param.hand_search_params.init_bite_, 0.01);
  // Read local hand search parameters.
  node->get_parameter_or("nn_radius", param.hand_search_params.nn_radius_frames_, 0.01);
  node->get_parameter_or("num_orientations", param.hand_search_params.num_orientations_, 8);
  node->get_parameter_or("num_samples", param.hand_search_params.num_samples_, 100);
  node->get_parameter_or("num_threads", param.hand_search_params.num_threads_, 4);
  node->get_parameter_or("rotation_axis", param.hand_search_params.rotation_axis_, 2);
  // Read plotting parameters.
  node->get_parameter_or("plot_samples", param.plot_samples_, false);
  node->get_parameter_or("plot_normals", param.plot_normals_, false);
  param.generator_params.plot_normals_ = param.plot_normals_;
  node->get_parameter_or("plot_filtered_grasps", param.plot_filtered_grasps_, false);
  node->get_parameter_or("plot_valid_grasps", param.plot_valid_grasps_, false);
  node->get_parameter_or("plot_clusters", param.plot_clusters_, false);
  node->get_parameter_or("plot_selected_grasps", param.plot_selected_grasps_, false);
  // Read general parameters.
  param.generator_params.num_samples_ = param.hand_search_params.num_samples_;
  param.generator_params.num_threads_ = param.hand_search_params.num_threads_;
  node->get_parameter_or("plot_candidates", param.generator_params.plot_grasps_, false);
  // Read preprocessing parameters.
  node->get_parameter_or("remove_outliers",
    param.generator_params.remove_statistical_outliers_, false);
  node->get_parameter_or("voxelize", param.generator_params.voxelize_, true);
  node->get_parameter_or("workspace", param.generator_params.workspace_,
    std::vector<double>(std::initializer_list<double>({-1.0, 1.0, -1.0, 1.0, -1.0, 1.0})));
  param.workspace_ = param.generator_params.workspace_;
  // Read classification parameters and create classifier.
  node->get_parameter_or("model_file", param.model_file_, std::string(""));
  node->get_parameter_or("trained_file", param.weights_file_, std::string(""));
  node->get_parameter_or("min_score_diff", param.min_score_diff_, 500.0);
  node->get_parameter_or("create_image_batches", param.create_image_batches_, false);
  node->get_parameter_or("device", param.device_, 0);
  // Read grasp image parameters.
  node->get_parameter_or("image_outer_diameter", param.image_params.outer_diameter_,
    param.hand_search_params.hand_outer_diameter_);
  node->get_parameter_or("image_depth", param.image_params.depth_,
    param.hand_search_params.hand_depth_);
  node->get_parameter_or("image_height", param.image_params.height_,
    param.hand_search_params.hand_height_);
  node->get_parameter_or("image_size", param.image_params.size_, 60);
  node->get_parameter_or("image_num_channels", param.image_params.num_channels_, 15);
  // Read learning parameters.
  node->get_parameter_or("remove_plane_before_image_calculation", param.remove_plane_, false);
  // Read grasp filtering parameters
  node->get_parameter_or("filter_grasps", param.filter_grasps_, false);
  node->get_parameter_or("filter_half_antipodal", param.filter_half_antipodal_, false);
  param.gripper_width_range_.push_back(0.03);
  param.gripper_width_range_.push_back(0.10);
  // node->get_parameter("gripper_width_range", param.gripper_width_range_);
  // Read clustering parameters
  node->get_parameter_or("min_inliers", param.min_inliers_, 1);
  // Read grasp selection parameters
  node->get_parameter_or("num_selected", param.num_selected_, 5);
}

void ROSParameters::getPlanningParams(
  rclcpp::Node * node, GraspPlanner::GraspPlanningParameters & param)
{
  node->get_parameter_or("grasp_service_timeout", param.grasp_service_timeout_, 0);
  node->get_parameter_or("grasp_score_threshold", param.grasp_score_threshold_, 200);
  node->get_parameter_or("grasp_frame_id", param.grasp_frame_id_, std::string("base"));
  std::vector<double> approach;
  node->get_parameter_or("grasp_approach", approach,
    std::vector<double>(std::initializer_list<double>({0.0, 0.0, -1.0})));
  param.grasp_approach_ = tf2::Vector3(approach[0], approach[1], approach[2]);
  node->get_parameter_or("grasp_approach_angle", param.grasp_approach_angle_, M_PI);
  node->get_parameter_or("grasp_offset", param.grasp_offset_,
    std::vector<double>(std::initializer_list<double>({0.0, 0.0, 0.0})));
  node->get_parameter_or("grasp_boundry", param.grasp_boundry_,
    std::vector<double>(std::initializer_list<double>({-1.0, 1.0, -1.0, 1.0, -1.0, 1.0})));
  node->get_parameter_or("eef_offset", param.eef_offset, 0.154);
  node->get_parameter_or("eef_yaw_offset", param.eef_yaw_offset, 0.0);
  node->get_parameter_or("grasp_min_distance", param.grasp_min_distance_, 0.06);
  node->get_parameter_or("grasp_desired_distance", param.grasp_desired_distance_, 0.1);
  // gripper parameters
  std::vector<double> finger_opens, finger_closes;
  node->get_parameter_or("finger_joint_names", param.finger_joint_names_,
    std::vector<std::string>(std::initializer_list<std::string>(
      {std::string("panda_finger_joint1"), std::string("panda_finger_joint2")})));
  node->get_parameter_or("finger_positions_open", param.finger_points_open_.positions,
    std::vector<double>(std::initializer_list<double>({-0.01, 0.01})));
  node->get_parameter_or("finger_positions_close", param.finger_points_close_.positions,
    std::vector<double>(std::initializer_list<double>({-0.0, 0.0})));
}

}  // namespace grasp_ros2

================================================
FILE: grasp_ros2/tests/CMakeLists.txt
================================================
find_package(ament_cmake REQUIRED)
find_package(ament_cmake_gtest REQUIRED)
find_package(rclcpp REQUIRED)
find_package(grasp_msgs REQUIRED)
find_package(moveit_msgs REQUIRED)
find_package(pcl_conversions REQUIRED)

set(TEST_NAME tgrasp_ros2)
ament_add_gtest(${TEST_NAME} tgrasp_ros2.cpp ${CMAKE_CURRENT_SOURCE_DIR}/../src/consts.cpp)
if(TARGET ${TEST_NAME})
  get_filename_component(RESOURCE_DIR "resource" ABSOLUTE)
  configure_file(tgrasp_ros2.h.in tgrasp_ros2.h)
  include_directories(${CMAKE_CURRENT_BINARY_DIR} ${PCL_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS}) target_include_directories(${TEST_NAME} PUBLIC ${grasp_library_INCLUDE_DIRS} ) ament_target_dependencies(${TEST_NAME} pcl_conversions rclcpp grasp_msgs moveit_msgs sensor_msgs ) target_link_libraries(${TEST_NAME} ${GTEST_LIBRARIES} ${PCL_LIBRARIES}) # Install binaries install(TARGETS ${TEST_NAME} RUNTIME DESTINATION bin ) install(TARGETS ${TEST_NAME} DESTINATION lib/${PROJECT_NAME} ) endif() ================================================ FILE: grasp_ros2/tests/resource/table_top.pcd ================================================ [File too large to display: 13.2 MB] ================================================ FILE: grasp_ros2/tests/tgrasp_ros2.cpp ================================================ // Copyright (c) 2019 Intel Corporation // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. 
#include <gtest/gtest.h>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl_conversions/pcl_conversions.h>
#include <rclcpp/rclcpp.hpp>
#include <chrono>
#include <memory>
#include <string>
#include <thread>
#include <grasp_msgs/msg/grasp_config_list.hpp>
#include <moveit_msgs/srv/grasp_planning.hpp>
#include <sensor_msgs/msg/point_cloud2.hpp>
#include "grasp_library/ros2/consts.hpp"
#include "./tgrasp_ros2.h"

using Consts = grasp_ros2::Consts;
using GraspPlanning = moveit_msgs::srv::GraspPlanning;

static bool received_topic = false;
static int num_grasps = 0;
static bool pcd_stop = false;
static rclcpp::Node::SharedPtr node = nullptr;
static std::shared_ptr<GraspPlanning::Response> result = nullptr;

static void pcd_publisher()
{
  char path[512];
  snprintf(path, sizeof(path), "%s/table_top.pcd", RESOURCE_DIR);
  pcl::PointCloud<pcl::PointXYZRGBA> cloud;
  if (0 != pcl::io::loadPCDFile(path, cloud)) {
    return;
  }
  sensor_msgs::msg::PointCloud2 msg;
  pcl::toROSMsg(cloud, msg);
  msg.header.frame_id = "camera_color_optical_frame";
  auto pcd_node = rclcpp::Node::make_shared("PCDPublisher");
  auto pcd_pub = pcd_node->create_publisher<sensor_msgs::msg::PointCloud2>(
    Consts::kTopicPointCloud2, 10);
  rclcpp::Rate loop_rate(30);
  while (!pcd_stop && rclcpp::ok()) {
    pcd_pub->publish(msg);
    loop_rate.sleep();
  }
}

static void topic_cb(const grasp_msgs::msg::GraspConfigList::SharedPtr msg)
{
  RCLCPP_INFO(node->get_logger(), "Topic received");
  received_topic = true;
  num_grasps = msg->grasps.size();
}

TEST(GraspLibraryTests, TestGraspService) {
  EXPECT_TRUE(result->error_code.val == moveit_msgs::msg::MoveItErrorCodes::SUCCESS);
  EXPECT_GT(result->grasps.size(), uint32_t(0));
}

TEST(GraspLibraryTests, TestGraspTopic) {
  rclcpp::Rate(1).sleep();
  EXPECT_TRUE(received_topic);
  EXPECT_GT(num_grasps, 0);
}

int main(int argc, char * argv[])
{
  rclcpp::init(argc, argv);
  std::thread pcd_thread(pcd_publisher);
  pcd_thread.detach();
  node = rclcpp::Node::make_shared("GraspLibraryTest");
  auto sub = node->create_subscription<grasp_msgs::msg::GraspConfigList>(
    Consts::kTopicDetectedGrasps, rclcpp::QoS(rclcpp::KeepLast(1)), topic_cb);
  auto client = node->create_client<GraspPlanning>("plan_grasps");
  while (!client->wait_for_service(std::chrono::seconds(1))) {
    if (!rclcpp::ok()) {
      RCLCPP_ERROR(node->get_logger(), "Client interrupted");
      return 1;
    }
    RCLCPP_INFO(node->get_logger(), "Wait for service");
  }
  auto request = std::make_shared<GraspPlanning::Request>();
  auto result_future = client->async_send_request(request);
  RCLCPP_INFO(node->get_logger(), "Request sent");
  if (rclcpp::spin_until_future_complete(node, result_future) !=
    rclcpp::executor::FutureReturnCode::SUCCESS)
  {
    RCLCPP_ERROR(node->get_logger(), "Request failed");
    return 1;
  }
  result = result_future.get();
  RCLCPP_INFO(node->get_logger(), "Response received %d", result->error_code.val);
  testing::InitGoogleTest(&argc, argv);
  int ret = RUN_ALL_TESTS();
  pcd_stop = true;
  rclcpp::Rate(3).sleep();
  // pcd_thread.join() disabled. It causes runtest exit abnormally
  node = nullptr;
  rclcpp::shutdown();
  return ret;
}

================================================
FILE: grasp_ros2/tests/tgrasp_ros2.h.in
================================================
// Copyright (c) 2018 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#ifndef GRASP_LIBRARY__TGRASP_LIBRARY_H_
#define GRASP_LIBRARY__TGRASP_LIBRARY_H_

#include
#include

#define RESOURCE_DIR "@RESOURCE_DIR@"

#endif  // GRASP_LIBRARY__TGRASP_LIBRARY_H_

================================================
FILE: grasp_tutorials/CMakeLists.txt
================================================
# Copyright (c) 2019 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

cmake_minimum_required(VERSION 3.5)
project(grasp_tutorials)
find_package(ament_cmake REQUIRED)
ament_package()

================================================
FILE: grasp_tutorials/README.md
================================================
# ROS2 Grasp Library Tutorials

These tutorials introduce how to

* Install, build, and launch the ROS2 Grasp Planner and Detector
* Use launch options to customize in a new workspace
* Bring up the intelligent visual grasp solution on a new robot
* Do hand-eye calibration for a new camera setup
* Launch the example applications

## Build and test the tutorials

```bash
cd ros2_grasp_library/grasp_tutorials
sphinx-build . build
# check the outputs in the ./build/ folder
cd ros2_grasp_library/grasp_utils/robot_interface
doxygen Doxyfile
# check the outputs in the ./build/ folder
```

================================================
FILE: grasp_tutorials/conf.py
================================================
# -*- coding: utf-8 -*-
#
# app_tutorials documentation build configuration file, created by
# sphinx-quickstart on Thu Oct 18 17:31:36 2018.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.

import sys
import os

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here.
If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. #sys.path.insert(0, os.path.abspath('.')) # -- General configuration ------------------------------------------------ # If your documentation needs a minimal Sphinx version, state it here. #needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. extensions = [ 'sphinx.ext.autodoc', 'sphinx.ext.doctest', 'sphinx.ext.intersphinx', 'sphinx.ext.todo', 'sphinx.ext.coverage', 'sphinx.ext.mathjax', 'sphinx.ext.ifconfig', 'sphinx.ext.viewcode', ] # Add any paths that contain templates here, relative to this directory. # templates_path = [] # The suffix(es) of source filenames. # You can specify multiple suffix as a list of string: # source_suffix = ['.rst', '.md'] source_suffix = '.rst' # The encoding of source files. #source_encoding = 'utf-8-sig' # The master toctree document. master_doc = 'index' # General information about the project. project = u'ROS2 Grasp Library Tutorials' copyright = u'2019, Intel Corporation' author = u'sharron, liu; yu, yan' # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The short X.Y version. version = u'0.5.0' # The full version, including alpha/beta/rc tags. release = u'0.5.0' # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. # # This is also used if you do content translation via gettext catalogs. # Usually you set "language" from the command line for these cases. language = None # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: #today = '' # Else, today_fmt is used as the format for a strftime call. 
#today_fmt = '%B %d, %Y' # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. # exclude_patterns = ['doc/ur5_setup_with_moveit.rst', 'doc/franka_setup_with_moveit.rst'] # The reST default role (used for this markup: `text`) to use for all # documents. #default_role = None # If true, '()' will be appended to :func: etc. cross-reference text. #add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). #add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. #show_authors = False # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' # A list of ignored prefixes for module index sorting. #modindex_common_prefix = [] # If true, keep warnings as "system message" paragraphs in the built documents. #keep_warnings = False # If true, `todo` and `todoList` produce output, else they produce nothing. todo_include_todos = True # -- Options for HTML output ---------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. html_theme = 'sphinx_rtd_theme' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. #html_theme_options = {} # Add any paths that contain custom themes here, relative to this directory. #html_theme_path = [] # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". #html_title = None # A shorter title for the navigation bar. Default is the same as html_title. #html_short_title = None # The name of an image file (relative to this directory) to place at the top # of the sidebar. 
#html_logo = None # The name of an image file (relative to this directory) to use as a favicon of # the docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. #html_favicon = None # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['_static'] # Add any extra paths that contain custom files (such as robots.txt or # .htaccess) here, relative to this directory. These files are copied # directly to the root of the documentation. #html_extra_path = [] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. #html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. #html_use_smartypants = True # Custom sidebar templates, maps document names to template names. #html_sidebars = {} # Additional templates that should be rendered to pages, maps page names to # template names. #html_additional_pages = {} # If false, no module index is generated. #html_domain_indices = True # If false, no index is generated. #html_use_index = True # If true, the index is split into individual pages for each letter. #html_split_index = False # If true, links to the reST sources are added to the pages. #html_show_sourcelink = True # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. #html_show_sphinx = True # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. #html_show_copyright = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. #html_use_opensearch = '' # This is the file name suffix for HTML files (e.g. ".xhtml"). 
#html_file_suffix = None # Language to be used for generating the HTML full-text search index. # Sphinx supports the following languages: # 'da', 'de', 'en', 'es', 'fi', 'fr', 'hu', 'it', 'ja' # 'nl', 'no', 'pt', 'ro', 'ru', 'sv', 'tr' #html_search_language = 'en' # A dictionary with options for the search language support, empty by default. # Now only 'ja' uses this config value #html_search_options = {'type': 'default'} # The name of a javascript file (relative to the configuration directory) that # implements a search results scorer. If empty, the default will be used. #html_search_scorer = 'scorer.js' # Output file base name for HTML help builder. htmlhelp_basename = 'grasp_tutorialsdoc' # -- Options for LaTeX output --------------------------------------------- latex_elements = { # The paper size ('letterpaper' or 'a4paper'). #'papersize': 'letterpaper', # The font size ('10pt', '11pt' or '12pt'). #'pointsize': '10pt', # Additional stuff for the LaTeX preamble. #'preamble': '', # Latex figure (float) alignment #'figure_align': 'htbp', } # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, # author, documentclass [howto, manual, or own class]). latex_documents = [ (master_doc, 'grasp_tutorials.tex', u'grasp\\_tutorials Documentation', u'Intel Corporation', 'manual'), ] # The name of an image file (relative to this directory) to place at the top of # the title page. #latex_logo = None # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. #latex_use_parts = False # If true, show page references after internal links. #latex_show_pagerefs = False # If true, show URL addresses after external links. #latex_show_urls = False # Documents to append as an appendix to all manuals. #latex_appendices = [] # If false, no module index is generated. #latex_domain_indices = True # -- Options for manual page output --------------------------------------- # One entry per manual page. 
List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
    (master_doc, 'grasp_tutorials', u'grasp_tutorials Documentation',
     [author], 1)
]

# If true, show URL addresses after external links.
#man_show_urls = False

# -- Options for Texinfo output -------------------------------------------

# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
#  dir menu entry, description, category)
texinfo_documents = [
    (master_doc, 'grasp_tutorials', u'grasp_tutorials Documentation',
     author, 'grasp_tutorials', 'One line description of project.',
     'Miscellaneous'),
]

# Documents to append as an appendix to all manuals.
#texinfo_appendices = []

# If false, no module index is generated.
#texinfo_domain_indices = True

# How to display URL addresses: 'footnote', 'no', or 'inline'.
#texinfo_show_urls = 'footnote'

# If true, do not generate a @detailmenu in the "Top" node's menu.
#texinfo_no_detailmenu = False

# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {'https://docs.python.org/': None}

================================================
FILE: grasp_tutorials/doc/bringup_robot.rst
================================================
Bring up a New Robot
====================

This tutorial explains what is expected when bringing up this ROS2 Grasp Library on a new robot.

.. _GraspPlanning: http://docs.ros.org/api/moveit_msgs/html/srv/GraspPlanning.html
.. _GPD: https://github.com/atenpas/gpd
.. _OpenVINO™: https://software.intel.com/en-us/openvino-toolkit
.. _Grasp: http://docs.ros.org/api/moveit_msgs/html/msg/Grasp.html

Minimum APIs to Implement
-------------------------

- `moveToTcpPose` - Move the TCP (tool center point, usually the end effector of the robot arm, not the hand) to a pose specified with position [x, y, z] and orientation [alpha, beta, gamma] (also called [roll, pitch, yaw]).
  This function returns when the robot `moved` to the specified pose.
- `moveToJointValues` - Move each joint of the robot to the specified values (usually angles). This function differs from `moveToTcpPose` since the same TCP pose may be reached with various joint-value solutions. It is used when the application expects the robot's joints in a specific state that is proper for a subsequent pick or place action. This function returns when the robot `moved` to the specified joint values.
- `open` - Open the gripper.
- `close` - Close the gripper.
- `startLoop` - Start a loop to read and publish the robot state. Robot states are subscribed by RViz for visualization.

Optional implementation and possible extensions
-----------------------------------------------

- Optionally you may implement the `pick` and `place` interfaces to customize the pick-and-place pipeline, or even plug in collision-avoidance motion planning.
- Python extension is not supported yet. It's possible to implement the Robot Interface in Python and bind it to C++.

Refer to the `Robot Interface API <../api/html/index.html>`_ for a more detailed definition.

Example UR5 Implementation
--------------------------

Refer to the UR5 `example `_ implementation of the Robot Interface.

Test Your Implementation
------------------------

It's important to test your implementation before integrating it with the other components of the ROS2 Grasp Library. Refer to the `UR5 tests `_ and adapt them to your robot.

Bring up Robot Control Applications
-----------------------------------

Once testing is finished, you may start to bring up the `Draw X `_ app or the `fixed position pick and place `_ app on your new robot. These applications do not require a camera; they control the robot only.
================================================
FILE: grasp_tutorials/doc/draw_x.rst
================================================
Draw X
======

Overview
--------------

This demo shows how to use the robot interface to draw the letter ``X`` at fixed positions with a UR5 robot arm.

Requirement
------------

Before running the code, make sure you have followed the instructions below to set up the robot correctly.

- Hardware

  - Host running ROS2
  - `UR5`_

- Software

  - `ROS2 Dashing`_ Desktop
  - `robot_interface`_

.. _UR5: https://www.universal-robots.com/products/ur5-robot
.. _ROS2 Dashing: https://index.ros.org/doc/ros2/Installation/Dashing/Linux-Install-Debians/
.. _robot_interface: https://github.com/intel/ros2_grasp_library/tree/master/grasp_utils/robot_interface

Download and Build the Example Code
------------------------------------

Within your ROS2 workspace, download and compile the example code:

::

  cd /src
  git clone https://github.com/intel/ros2_grasp_library.git
  cd ..
  colcon build --base-paths src/ros2_grasp_library/grasp_apps/draw_x

Launch the Application
----------------------

- Launch the application

  ::

    ros2 launch draw_x draw_x.launch.py

.. note:: Please make sure the emergency button on the teach pendant is in your hand, in case there is any accident.

- Expected Outputs:

  1. The robot moves its arm to the home pose
  2. The robot moves its arm to the pose above the first corner of X
  3. The robot moves its arm down to the first corner of X
  4. The robot moves its arm to the second corner of X
  5. The robot moves its arm up to the pose above the second corner of X
  6. The robot moves its arm to the pose above the third corner of X
  7. The robot moves its arm down to the third corner of X
  8. The robot moves its arm to the fourth corner of X
  9. The robot moves its arm up to the pose above the fourth corner of X
  10.
The robot moves its arm to the home pose again

================================================
FILE: grasp_tutorials/doc/fixed_position_pick.rst
================================================
Fixed Position Pick
====================

Overview
--------------

This demo shows how to use the robot interface to pick and place an object at a predefined location with a UR5 robot arm.

Requirement
------------

Before running the code, make sure you have followed the instructions below to set up the robot correctly.

- Hardware

  - Host running ROS2
  - `UR5`_
  - `Robot Gripper`_

- Software

  - `ROS2 Dashing`_ Desktop
  - `robot_interface`_

.. _UR5: https://www.universal-robots.com/products/ur5-robot
.. _ROS2 Dashing: https://index.ros.org/doc/ros2/Installation/Dashing/Linux-Install-Debians/
.. _robot_interface: https://github.com/intel/ros2_grasp_library/tree/master/grasp_utils/robot_interface
.. _Robot Gripper: https://www.universal-robots.com/plus/end-effectors/hitbot-electric-gripper

Download and Build the Example Code
------------------------------------

Within your ROS2 workspace, download and compile the example code:

::

  cd /src
  git clone https://github.com/intel/ros2_grasp_library.git
  cd ..
  colcon build --base-paths src/ros2_grasp_library/grasp_apps/fixed_position_pick

Launch the Application
----------------------

- Launch the application

  ::

    ros2 launch fixed_position_pick fixed_position_pick.launch.py

.. note:: Please make sure the emergency button on the teach pendant is in your hand, in case there is any accident.

- Expected Outputs:

  1. The robot moves to the home pose
  2. The robot picks up an object from the predefined location
  3. The robot places the object to another location
  4. The robot moves back to the home pose

================================================
FILE: grasp_tutorials/doc/getting_start.rst
================================================
Getting Started
===============

This tutorial introduces how to get started with this ROS2 Grasp Library.

..
_GraspPlanning: http://docs.ros.org/api/moveit_msgs/html/srv/GraspPlanning.html
.. _GPD: https://github.com/atenpas/gpd
.. _OpenVINO™: https://software.intel.com/en-us/openvino-toolkit
.. _Grasp: http://docs.ros.org/api/moveit_msgs/html/msg/Grasp.html

ROS2 Grasp Planner and Detector
-------------------------------

In this section, you will start with an RGBD sensor connected to an Ubuntu host machine. The grasp detection relies on the OpenVINO™ toolkit. Follow this `grasp_planner `_ instruction to install the toolkit, then build and install the ROS2 Grasp Planner and Detector with your camera. After launching the grasp planner, in RViz you will see grasp detection results highlighted as blue markers.

.. image:: ./grasp_ros2/img/ros2_grasp_library.png
   :width: 643 px
   :height: 435 px
   :align: center

Use Launch Options for Customization
------------------------------------

ROS2 parameters are supported to customize the Grasp Detector and Grasp Planner for your local workspace: for example, the topic name of the point cloud from the RGBD sensor, the camera workspace (in the frame_id of the point cloud image), the grasp approach direction and angle, and the grasp boundary (in the frame_id of the robot base).

Robot Interface
---------------

In this section, you will bring up your robot by implementing the Robot Interface. Currently the robot interface is defined in C++; a Python version is still work in progress. The Robot Interface is the minimum set of APIs a robot should provide to enable this solution. Follow this `robot_interface `_ instruction to implement the required `move`, `open`, `close`, and `startLoop` interfaces. Then make sure your implementation passes the Robot Interface tests, to guarantee later integration with the example applications. You may also try the "Robot Control Applications" (like Draw X, fixed position pick and place) to verify that your implementation works well.

Hand-eye Calibration
--------------------

Now start to generate the transformation between the camera and the robot.
Follow this `handeye_calibration `_ instruction to finish the procedure of hand-eye calibration. The calibration procedure needs to be done when the camera is set up. The resulting transformation will be remembered in your local environment, for later publishing when launching the applications. Launch Intelligent Visual Grasp Applications -------------------------------------------- At this step, you may start to launch the applications. `Random Picking `_ runs OpenVINO grasp detection on GPU, and sends requests to the ROS2 MoveIt Grasp Planner for grasp planning and detection. The most likely successful grasps are returned by the Grasp Pose Detection from CNN inference, taking 3D point cloud inputs from the camera. The picking order is not pre-defined, hence the name random picking. `Recognition Picking `_ runs OpenVINO grasp detection on GPU, and runs OpenVINO object segmentation on CPU or Movidius VPU. The masks of recognized objects are returned from the `mask_rcnn` model. The `place_publisher` publishes the name of the object to pick and the position to place it, hence the name recognition picking. ================================================ FILE: grasp_tutorials/doc/grasp_api.rst ================================================ ROS2 Grasp Library APIs ======================= .. _GraspPlanning: http://docs.ros.org/api/moveit_msgs/html/srv/GraspPlanning.html .. _GPD: https://github.com/atenpas/gpd .. _OpenVINO™: https://software.intel.com/en-us/openvino-toolkit .. _Grasp: http://docs.ros.org/api/moveit_msgs/html/msg/Grasp.html .. _PointCloud2: https://github.com/ros2/common_interfaces/blob/master/sensor_msgs/msg/PointCloud2.msg .. _ObjectsInMasks: https://github.com/intel/ros2_openvino_toolkit/blob/master/people_msgs/msg/ObjectsInMasks.msg .. _Image: https://github.com/ros2/common_interfaces/blob/master/sensor_msgs/msg/Image.msg ..
_TransformStamped: https://github.com/ros2/common_interfaces/blob/master/geometry_msgs/msg/TransformStamped.msg Grasp Planning ROS2 Interfaces ------------------------------ - Subscribed Topics - PointCloud2 topic from RGBD sensor (sensor_msgs::msg::`PointCloud2`_) - Segmented object topic (people_msgs::msg::`ObjectsInMasks`_) - Delivered Services - plan_grasps (moveit_msgs::srv::`GraspPlanning`_) Hand-Eye Calibration ROS2 Interfaces ------------------------------------ - Subscribed Topics - RGB image from sensor (sensor_msgs::msg::`Image`_) - Broadcasted Transforms - Static transform between the camera and the robot (geometry_msgs::msg::`TransformStamped`_) Robot Interface API ------------------- - `API <../api/html/index.html>`_ ================================================ FILE: grasp_tutorials/doc/grasp_planner.rst ================================================ Grasp Planner ============= Tutorials --------- - `Install OpenVINO™ toolkit`_ .. _Install OpenVINO™ toolkit: https://github.com/intel/ros2_grasp_library/tree/master/grasp_tutorials/doc/grasp_ros2/install_openvino.md - `Launch ROS2 Grasp Planner and Detector`_ .. _Launch ROS2 Grasp Planner and Detector: https://github.com/intel/ros2_grasp_library/tree/master/grasp_tutorials/doc/grasp_ros2/tutorials_1_grasp_ros2_with_camera.md - `Launch tests`_ .. _Launch tests: https://github.com/intel/ros2_grasp_library/tree/master/grasp_tutorials/doc/grasp_ros2/tutorials_2_grasp_ros2_test.md - `Use launch options`_ .. _Use launch options: https://github.com/intel/ros2_grasp_library/tree/master/grasp_tutorials/doc/grasp_ros2/tutorials_3_grasp_ros2_launch_options.md ================================================ FILE: grasp_tutorials/doc/grasp_ros2/install_gpd.md ================================================ Installation guide for Grasp Pose Detection ### Install [GPG](https://github.com/atenpas/gpg) 1. Get the code

```bash
git clone https://github.com/atenpas/gpg.git
cd gpg
```

2.
Build the library

```bash
mkdir build && cd build
cmake ..
make
sudo make install
# by default, "libgrasp_candidates_generator.so" shall be installed to "/usr/local/lib"
```

### Install [GPD](https://github.com/sharronliu/gpd) 1. Get the code, originally derived from [GPD](https://github.com/atenpas/gpd) tag 1.5.0

```bash
git clone https://github.com/sharronliu/gpd.git
cd gpd
git checkout libgpd
cd src/gpd
```

2. Build the library

```bash
mkdir build && cd build
cmake -DUSE_OPENVINO=ON ..
make
sudo make install
# by default, "libgrasp_pose_detection.so" shall be installed to "/usr/local/lib"
# and header files installed to "/usr/local/include/gpd"
```

================================================ FILE: grasp_tutorials/doc/grasp_ros2/install_openvino.md ================================================ # Intel® DLDT toolkit and Intel® OpenVINO™ toolkit This tutorial introduces the DLDT toolkit and OpenVINO toolkit. Intel® [DLDT](https://github.com/opencv/dldt) is a Deep Learning Deployment Toolkit common to all architectures. The toolkit allows developers to convert pre-trained deep learning models into optimized Intermediate Representation (IR) models, then deploy the IR models through a high-level C++ Inference Engine API integrated with application logic. Additionally, [Open Model Zoo](https://github.com/opencv/open_model_zoo) provides more than 100 pre-trained optimized deep learning models and a set of demos to expedite development of high-performance deep learning inference applications. Online tutorials are available for * [Inference Engine Build Instructions](https://github.com/opencv/dldt/blob/2019/inference-engine/README.md) Intel® [OpenVINO™](https://software.intel.com/en-us/openvino-toolkit) (Open Visual Inference & Neural Network Optimization) toolkit enables CNN-based deep learning inference at the edge, extends workloads across Intel® hardware (including accelerators) and maximizes performance.
The toolkit supports heterogeneous execution across various computer vision devices -- CPU, GPU, Intel® Movidius™ NCS, and FPGA -- using a common API. Online tutorials are available for * [Model Optimizer Developer Guide](https://software.intel.com/en-us/articles/OpenVINO-ModelOptimizer) * [Inference Engine Developer Guide](https://software.intel.com/en-us/articles/OpenVINO-InferEngine) * [Intel® Neural Compute Stick 2](https://software.intel.com/en-us/neural-compute-stick/get-started) ## Install DLDT and OpenVINO It's recommended to refer to the online documents of the toolkits for the latest installation instructions. Below are the detailed steps we verified with Ubuntu 18.04 on an Intel NUC6i7KYK, for your reference. 1. Build and install the Inference Engine

```bash
git clone https://github.com/opencv/dldt.git
cd dldt
git checkout 2019_R3
# follow the instructions below to install all dependencies, including mklml, opencl, etc.
# https://github.com/opencv/dldt/blob/2019_R3/inference-engine/README.md#build-on-linux-systems
# build
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=/usr/local -DGEMM=MKL -DMKLROOT=/usr/local/lib/mklml -DENABLE_MKL_DNN=ON -DENABLE_CLDNN=ON ..
make -j8
sudo make install
```

2. Share the CMake configuration files so the Inference Engine can be found by other packages

```bash
sudo mkdir /usr/share/InferenceEngine
sudo cp InferenceEngineConfig*.cmake /usr/share/InferenceEngine
sudo cp targets.cmake /usr/share/InferenceEngine
```

Then the Inference Engine will be found by any package adding "find_package(InferenceEngine)" to its CMakeLists.txt 3. Configure the library path for dynamic loading

```bash
echo `pwd`/../bin/intel64/Release/lib | sudo tee -a /etc/ld.so.conf.d/openvino.conf
sudo ldconfig
```

4.
Optionally install plug-ins for Inference Engine deployment on heterogeneous devices * Install the [plug-in](https://github.com/opencv/dldt/blob/2019_R3/inference-engine/README.md#optional-additional-installation-steps-for-the-intel-movidius-neural-compute-stick-and-neural-compute-stick-2) for deployment on Intel Movidius Neural Compute Sticks (Myriad X). ================================================ FILE: grasp_tutorials/doc/grasp_ros2/tutorials_1_grasp_ros2_with_camera.md ================================================ # OpenVINO Grasp Library with RGBD Camera This tutorial introduces the OpenVINO environment setup, and how to build and launch the Grasp Library with an RGBD camera. ## Requirements ### Hardware * Host running ROS2/ROS * RGBD sensor ### Software We verified the software with Ubuntu 18.04 Bionic and the ROS2 Dashing release. Verification with ROS2 MoveIt is still work in progress. Before this, we have verified the grasp detection with the MoveIt Melodic branch (tag 0.10.8) and our visual pick & place application to be shared as [MoveIt Example Apps](https://github.com/ros-planning/moveit_example_apps). * Install ROS2 packages [ros-dashing-desktop](https://index.ros.org/doc/ros2/Installation/Dashing/Linux-Install-Debians) * Install non-ROS packages

```bash
sudo apt-get install libpcl-dev libeigen3-dev
```

* Install [Intel OpenVINO Toolkit](install_openvino.md) * Install [GPD](install_gpd.md) ## Build Grasp Library

```bash
# get the source codes
mkdir ~/ros2_ws/src -p
cd ~/ros2_ws/src
git clone https://github.com/intel/ros2_grasp_library.git
# copy GPD models
cp -a /models ros2_grasp_library/gpd
# build
cd ..
source /opt/ros/dashing/setup.bash
colcon build --symlink-install --packages-select grasp_msgs moveit_msgs grasp_ros2
source ./install/local_setup.bash
```

## Launch Grasp Library

```bash
# Terminal 1, optionally launch Rviz2 to illustrate detection results.
ros2 run rviz2 rviz2 -d src/ros2_grasp_library/grasp_ros2/rviz2/grasp.rviz
# Note: you may customize the ".rviz" file for your own camera, for example:
# To change the fixed frame: "Global Options -> Fixed Frame"
# To change the point cloud topic: "Point Cloud 2 -> Topic"

# Terminal 2, launch RGBD camera
# e.g. launch [ROS2 Realsense](https://github.com/intel/ros2_intel_realsense/tree/refactor)
# or, with a ros-bridge, launch any ROS OpenNI RGBD cameras, like [ROS Realsense](https://github.com/intel-ros/realsense)
ros2 run realsense_node realsense_node

# Terminal 3, launch Grasp Library
ros2 run grasp_ros2 grasp_ros2 __params:=src/ros2_grasp_library/grasp_ros2/cfg/grasp_ros2_params.yaml
```

================================================ FILE: grasp_tutorials/doc/grasp_ros2/tutorials_2_grasp_ros2_test.md ================================================ # Grasp Library Tests and Examples This tutorial documents the Grasp Library tests, which also serve as example codes for the usage of the Grasp Library. ## Grasp Library Tests Test suites enabled: * ROS2 built-in tests for static code scanning, like copyright tests, cppcheck tests, cpplint tests, lint_cmake tests, uncrustify tests, xmllint tests. * Grasp ROS2 basic functional tests: tgrasp_ros2, basic tests covering the ROS2 topic and ROS2 service of the Grasp Library. Before testing, make sure you have set up the environment to build the Grasp Library, following the tutorial [Grasp Library with RGBD Camera](tutorials_1_grasp_ros2_with_camera.md). The tests take inputs from a pre-stored PointCloud file (.pcd). Thus it's unnecessary to launch an RGBD camera.

```bash
# Terminal 1, launch Grasp Library
ros2 run grasp_ros2 grasp_ros2 __params:=src/ros2_grasp_library/grasp_ros2/cfg/test_grasp_ros2.yaml

# Terminal 2, run tests
colcon test --packages-select grasp_msgs grasp_ros2
```

For failed cases, check detailed logs at "log/latest_test/grasp_ros2/stdout.log".
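The tests take their input from a pre-stored PCD file under `grasp_ros2/tests/resource/`. As a rough illustration of that input format (not part of the test suite; the tiny parser and the three-point sample cloud below are made up for this sketch, and a real consumer should use PCL instead), an ASCII PCD file can be inspected like this:

```python
# Minimal ASCII PCD header parser, for illustration only.
# Real code should load .pcd files with PCL (or open3d in Python).

def parse_pcd_header(text):
    """Return a dict mapping PCD header keywords to their value tokens."""
    header = {}
    for line in text.splitlines():
        if line.startswith("#") or not line.strip():
            continue  # skip comments and blank lines
        key, _, value = line.partition(" ")
        header[key] = value.split()
        if key == "DATA":  # the header ends at the DATA line
            break
    return header

# A made-up three-point cloud in the standard ASCII PCD layout
sample = """\
# .PCD v0.7 - Point Cloud Data file format
VERSION 0.7
FIELDS x y z
SIZE 4 4 4
TYPE F F F
COUNT 1 1 1
WIDTH 3
HEIGHT 1
POINTS 3
DATA ascii
0.0 0.0 0.0
0.1 0.0 0.0
0.0 0.1 0.0
"""

hdr = parse_pcd_header(sample)
print(hdr["POINTS"])  # number of points in the cloud
```

The `POINTS` field tells how many points the cloud contains; the real `table_top.pcd` used by the tests is of course much larger.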
## Grasp Library Examples The [grasp test codes](../grasp_ros2/tests/tgrasp_ros2.cpp) also demonstrate how to use this Grasp Library for grasp detection and grasp planning. ### Grasp Detection Example (Non-MoveIt App) This example creates a ROS2 subscription to the "Detected Grasps" topic and gets the detection results from a callback. The Grasp Library is expected to work in 'auto_mode=true', sensor-driven grasp detection, see example launch options [here](../grasp_ros2/cfg/grasp_ros2_params.yaml).

```cpp
#include <rclcpp/rclcpp.hpp>
#include <grasp_msgs/msg/grasp_config_list.hpp>
#include "grasp_ros2/consts.hpp"

static rclcpp::Node::SharedPtr node = nullptr;

static void topic_cb(const grasp_msgs::msg::GraspConfigList::SharedPtr msg)
{
  RCLCPP_INFO(node->get_logger(), "Grasp Callback Received");
}

int main(int argc, char * argv[])
{
  // init ROS2
  rclcpp::init(argc, argv);
  // create ROS2 node
  node = rclcpp::Node::make_shared("GraspDetectionExample");
  // subscribe to the "Detected Grasps" topic
  auto sub = node->create_subscription<grasp_msgs::msg::GraspConfigList>(
    Consts::kTopicDetectedGrasps, rclcpp::QoS(rclcpp::KeepLast(1)), topic_cb);
  // create ROS2 executor to process any pending in/out messages
  rclcpp::spin(node);
  node = nullptr;
  rclcpp::shutdown();
  return 0;
}
```

### Grasp Planning Example (MoveIt App) This example creates a ROS2 client for the "plan_grasps" service and gets the planning results from the async service response. The Grasp Library is expected to work in 'auto_mode=false', service-driven grasp detection, see the launch option example [here](../grasp_ros2/cfg/test_grasp_ros2.yaml).
```cpp
#include <chrono>
#include <memory>
#include <rclcpp/rclcpp.hpp>
#include <moveit_msgs/srv/grasp_planning.hpp>
#include "grasp_ros2/consts.hpp"

static rclcpp::Node::SharedPtr node = nullptr;
static std::shared_ptr<moveit_msgs::srv::GraspPlanning::Response> result = nullptr;

int main(int argc, char * argv[])
{
  // init ROS2
  rclcpp::init(argc, argv);
  // create ROS2 node
  node = rclcpp::Node::make_shared("GraspPlanningExample");
  // create ROS2 client for MoveIt "plan_grasps" service
  auto client = node->create_client<moveit_msgs::srv::GraspPlanning>("plan_grasps");
  // wait for ROS2 service ready
  while (!client->wait_for_service(std::chrono::seconds(1))) {
    if (!rclcpp::ok()) {
      RCLCPP_ERROR(node->get_logger(), "Client interrupted");
      return 1;
    }
    RCLCPP_INFO(node->get_logger(), "Wait for service");
  }
  // fill in a request
  auto request = std::make_shared<moveit_msgs::srv::GraspPlanning::Request>();
  // send async request
  auto result_future = client->async_send_request(request);
  RCLCPP_INFO(node->get_logger(), "Request sent");
  // wait for response
  if (rclcpp::spin_until_future_complete(node, result_future) !=
    rclcpp::executor::FutureReturnCode::SUCCESS)
  {
    RCLCPP_ERROR(node->get_logger(), "Request failed");
    return 1;
  }
  // get grasp planning results from response
  result = result_future.get();
  RCLCPP_INFO(node->get_logger(), "Response received %d", result->error_code.val);
  node = nullptr;
  rclcpp::shutdown();
  return 0;
}
```

================================================ FILE: grasp_tutorials/doc/grasp_ros2/tutorials_3_grasp_ros2_launch_options.md ================================================ # Grasp Library Launch Options and Customization Notes This tutorial documents the launch options which are used for customization. Each option will be introduced in the following format: * **option_name** [**default_value**|other_values]: Description of this option. Customization Notes. ## GraspDetectorGPD Launch Options * **cloud_topic** [**"/camera/depth_registered/points"**|"string"]: Name of the point cloud topic as input to the grasp detection; the default value is compliant with an RGBD OpenNI camera.
* **device** [**0**|1|2|3]: Configure the device for grasp pose inference to execute on: 0 for CPU, 1 for GPU, 2 for VPU, 3 for FPGA. In case OpenVINO plug-ins are installed ([tutorial](install_openvino.md)), this option deploys the CNN-based deep learning inference onto the target device. Deploying the inference onto **GPU or VPU** will save CPU load for other computation tasks. * **auto_mode** [false|**true**]: Configure the grasp detection mode. When auto_mode is true, the Grasp Library works in sensor-driven mode, processing grasp detection when a point cloud message arrives. When auto_mode is false, the Grasp Library works in service-driven mode, processing grasp detection when a service request arrives. Configuring the **service-driven** mode saves most of the CPU load compared with the sensor-driven mode. * **plane_remove** [**false**|true]: Configure whether or not to remove planes (like the table plane) from the point cloud input. Enabling this helps to avoid generating grasp poses across the table. * **workspace** [**[-1.0, 1.0, -1.0, 1.0, -1.0, 1.0]**|[1*6 double]]: Configure a boundary cube in the camera frame for grasp generation and detection. *This needs to be customized according to the user's setup.* * **finger_width** [**0.005**|double]: The finger thickness in metres. *This needs to be customized according to the user's robot hand.* * **hand_outer_diameter** [**0.12**|double]: The maximum robot hand aperture in metres. *This needs to be customized according to the user's robot hand.* * **hand_depth** [**0.06**|double]: The hand depth (the finger length) in metres. Tuning this parameter will affect the "GraspConfig::bottom" field (the hand base) in the grasp detection results. *This needs to be customized according to the user's robot hand.* * **hand_height** [**0.02**|double]: The finger breadth in metres.
*This needs to be customized according to the user's robot hand.* ## GraspPlanner Launch Options * **grasp_service_timeout** [**5**|double]: Timeout in seconds for a service request waiting for a grasp detection result. The Grasp Planner will not take point cloud inputs from the history buffer. Instead, after receiving a service request, the Grasp Planner will start grasp detection on the coming point cloud input. This parameter configures the timeout period for the Grasp Planner to wait for the grasp detection result. Usually this is the sum of the maximum latencies of the RGBD sensor, the Grasp Detector, the Grasp Planner, and any other nodes in the pipeline, plus an estimated worst-case delay of the system. * **grasp_score_threshold** [**200**|integer]: Minimum score expected for grasps returned from this service. * **grasp_frame_id** [**"base"**|"string"]: Frame id expected for grasps returned from this service. When this parameter is specified, the Grasp Planner tries to transform the grasp from the original frame (usually a camera's color frame) to this target frame, given that the TF is available. * **grasp_approach** [**[0.0, 0.0, -1.0]**|[1*3 double]]: Specify the expected approach direction in the target frame specified by 'grasp_frame_id'. The Grasp Planner will return grasp poses with an approach direction approximate to this parameter. This is useful when a MoveIt application wants to constrain the approach direction. *This needs to be customized according to the user's setup.* * **grasp_approach_angle** [**M_PI**|3.14|1.57|double]: Maximum angle in radians acceptable between the expected 'grasp_approach' and the real approach returned from this service. Default is [-M_PI, M_PI], which implies that any approach direction is acceptable. *This needs to be customized according to the user's setup.* * **grasp_offset** [**[0.0, 0.0, 0.0]**|[1*3 double]]: Offset [x, y, z] in metres applied to the grasps detected.
This offset allows adjustment of the final grasp position, to overcome errors that might be accumulated from camera calibration, hand-eye calibration, grasp pose detection, etc. *This needs to be customized according to the user's setup.* * **grasp_boundry** [**[-1.0, 1.0, -1.0, 1.0, -1.0, 1.0]**|[1*6 double]]: Boundary cube in grasp_frame_id expected for grasps returned from this service. This parameter takes effect only after transformation into the target frame specified by "grasp_frame_id". When the transformation is unavailable, boundary checking will be skipped, and in such a case the "GraspDetectorGPD::workspace" parameter still takes effect. *This needs to be customized according to the user's setup.* * **eef_offset** [**0.16**|double]: Offset in metres from the gripper base (finger root) to the parent link of the gripper. The parent link is usually the end of the robot arm. *This needs to be customized according to the gripper geometry.* * **eef_yaw_offset** [**0.0**|double]: Gripper yaw offset to its parent link, in radians. *This needs to be customized if the gripper has a yaw offset to the TCP (tool center point) of the robot arm.* * **finger_joint_names** [**["panda_finger_joint1", "panda_finger_joint2"]**|[1*2 "string"]]: Joint names of the gripper fingers. Joint names are filled into MoveIt's grasp interface, to control the posture of the hand for 'pre_grasp_posture' and 'grasp_posture' (see [moveit_msgs::msg::Grasp](http://docs.ros.org/api/moveit_msgs/html/msg/Grasp.html)). Joint names are usually defined in the URDF of the robot hand. *This needs to be customized according to the user's setup.* For other ROS parameters not mentioned here, refer to [ros_params.cpp](../grasp_ros2/src/ros_params.cpp) for details. ## Customization Notes * **Model training for grasp detection**: It depends on which back-end grasp detection algorithm is used.
For [Grasp Pose Detection](https://github.com/atenpas/gpd), the model was trained with 185K labeled grasps and 55 object models from [bigBIRD](http://rll.berkeley.edu/bigbird). If re-training is necessary, please refer to the discussion [#49](https://github.com/atenpas/gpd/issues/49) in the upstream project. ================================================ FILE: grasp_tutorials/doc/handeye_calibration.rst ================================================ Hand-eye Calibration ===================== Hand-eye calibration is used to get the camera pose with respect to the robot. - `handeye_target_detection `_ - `handeye_dashboard `_ ================================================ FILE: grasp_tutorials/doc/overview.rst ================================================ Overview ======== .. _GraspPlanning: http://docs.ros.org/api/moveit_msgs/html/srv/GraspPlanning.html .. _GPD: https://github.com/atenpas/gpd .. _OpenVINO™: https://software.intel.com/en-us/openvino-toolkit .. _Grasp: http://docs.ros.org/api/moveit_msgs/html/msg/Grasp.html ROS2 Grasp Library consists of .. image:: ../_static/images/ros2_grasp_library.png :width: 523 px :height: 311 px :align: center - A ROS2 Grasp Planner providing a grasp planning service, as an extensible capability of MoveIt (moveit_msgs::srv::`GraspPlanning`_), translating grasp detection results into MoveIt Interfaces (moveit_msgs::msg::`Grasp`_).
- A ROS2 Grasp Detector abstracting interfaces for grasp detection results - A ROS2 hand-eye calibration module generating the transformation from the camera frame to the robot frame - Robot interfaces controlling the physical robot to move, pick, and place, as well as to feed back robot states - ROS2 example applications demonstrating how to use this ROS2 Grasp Library in advanced industrial usages for intelligent visual grasp ================================================ FILE: grasp_tutorials/doc/random_pick.rst ================================================ Random Pick (OpenVINO Grasp Detection) ====================================== Overview -------- A simple application demonstrating how to pick up objects from cluttered scenes with an industrial robot arm. The application interacts with the Grasp Planner and Robot Interface from this Grasp Library. The Grasp Planner takes grasp detection results from `OpenVINO GPD `_, transforms the grasp pose from the camera view to the robot view with the `Hand-Eye Calibration`_, and translates the grasp pose into a `moveit_msgs Grasp `_. The Robot Interface takes the grasp poses and place poses to pick and place the object. Watch this `demo_video `_ to see the output of this application. .. raw:: html Requirement ----------- Before running the code, make sure you have followed the instructions below to set up the environment. - Hardware - Host running ROS2 - RGBD sensor - `Robot Arm `_ - `Robot Gripper`_ - Software - `ROS2 `_ - `Grasp Planner `_ - `Robot Interface `_ - `Hand-Eye Calibration `_ - RGBD Sensor - `realsense `_ Download and Build the Application ---------------------------------- Within your ROS2 workspace, download and compile the example code :: cd /src git clone https://github.com/intel/ros2_grasp_library.git cd ..
colcon build --symlink-install - Build Options - BUILD_RANDOM_PICK (**ON** | OFF) Switch on/off building of this application Launch the Application with Real Robot and Camera ------------------------------------------------- - Publish the handeye transform, refer to `Hand-Eye Calibration`_ - Launch UR description :: ros2 launch ur_description view_ur5_ros2.launch.py # load the rviz2 config file "src/ros2_grasp_library/grasp_apps/random_pick/rviz2/random_pick.rviz" - Launch RGBD sensor :: ros2 run realsense_node realsense_node - Launch random pick app :: ros2 run random_pick random_pick - Launch grasp planner :: ros2 run grasp_ros2 grasp_ros2 __params:=src/ros2_grasp_library/grasp_ros2/cfg/random_pick.yaml ================================================ FILE: grasp_tutorials/doc/recognize_pick.rst ================================================ Recognize Pick (OpenVINO Grasp Detection + OpenVINO Object Segmentation) ======================================================================== Overview -------- A simple application demonstrating how to pick up recognized objects with an industrial robot arm. The application interacts with the Grasp Planner and Robot Interface from this Grasp Library. Compared with the `random picking `_ application, this recognition picking takes place commands published from the `place_publisher`, which specify the name of the object to pick and the position to place it. The Grasp Detector then takes the object segmentation results from the `OpenVINO Mask-rcnn `_ to identify the location of the object in the point cloud image and generates grasp poses for that specific object. Watch this `demo_video `_ to see the output of this application. .. raw:: html Requirement ----------- Before running the code, make sure you have followed the instructions below to set up the environment.
- Hardware - Host running ROS2 - RGBD sensor - `Robot Arm `_ - `Robot Gripper`_ - Software - `ROS2 `_ - `Grasp Planner `_ - `Robot Interface `_ - `Hand-Eye Calibration `_ - `ROS2 OpenVINO `_ - RGBD Sensor - `realsense `_ Download and Build the Application ---------------------------------- Within your ROS2 workspace, download and compile the example code :: cd /src git clone https://github.com/intel/ros2_grasp_library.git cd .. colcon build --symlink-install - Build Options - BUILD_RECOGNIZE_PICK (**ON** | OFF) Switch on/off building of this application Launch the Application with Real Robot and Camera ------------------------------------------------- - Publish the handeye transform, refer to `Hand-Eye Calibration`_ - Publish place object :: ros2 run recognize_pick place_publisher sports_ball - Launch UR description :: ros2 launch ur_description view_ur5_ros2.launch.py # load the rviz2 config file "src/ros2_grasp_library/grasp_apps/recognize_pick/rviz2/recognize_pick.rviz" - Launch RGBD sensor :: ros2 run realsense_node realsense_node - Launch object segmentation :: ros2 launch dynamic_vino_sample pipeline_segmentation.launch.py # close the rviz2 window - Launch recognize pick app :: ros2 run recognize_pick recognize_pick - Launch grasp planner :: ros2 run grasp_ros2 grasp_ros2 __params:=src/ros2_grasp_library/grasp_ros2/cfg/recognize_pick.yaml ================================================ FILE: grasp_tutorials/doc/robot_interface.rst ================================================ Robot Interface =============== - `Robot Interface `_ Robot Control Apps ------------------- .. toctree:: :maxdepth: 2 ./draw_x ./fixed_position_pick ================================================ FILE: grasp_tutorials/doc/template.rst ================================================ [App Tutorial Template] ======================= Overview -------- *(Describe what this application is in one topic sentence, followed by a paragraph telling what this application does in detail.
E.g.)* A template of application tutorial. The application tutorial contains an overview of the application, requirements on hardware and software, guidance to download/build/launch the application, expected output from the application, and customization notes. Requirements ------------ *(Describe the hardware and software required to set up the environment for this application. Provide hyperlinks to the procurement info or installation guides. E.g.)* - Hardware - Host running ROS2 - `Robot Arm `_ (optional) - Software - `ROS2 Dashing `_ Desktop - `robot_interface`_ .. _robot_interface: https://github.com/intel/ros2_grasp_library/tree/master/grasp_utils/robot_interface Download and Build the Application ---------------------------------- *(Describe how to download and build the application. List build options specific to this application. E.g.)* :: cd /src git clone https://github.com/intel/ros2_grasp_library.git cd .. colcon build --symlink-install --ament-cmake-args -DBUILD_RANDOM_PICK=ON - Build Options - BUILD_RANDOM_PICK (ON | **OFF** ) Switch on/off building of this application Launch the Application ---------------------- *(Describe how to launch the application. Provide hyperlinks to launch robot controllers. List launch options specific to this application. E.g.)* - Launch this application :: ros2 launch template template - Launch Options - grasp_xyz (double | **"0.545 0.107 0.15"**) Specify the pick position in the "base" frame - place_xyz (double | **"-0.107 -0.545 -0.10"**) Specify the place position in the "base" frame Expected Outputs ---------------- *(Describe expected outputs from this application. Illustrate with a screen snapshot when necessary. E.g.)* You should see Rviz output like this: .. image:: ../_static/images/pick_place.png Customization Notes ------------------- *(List possible customization items. Guide how to customize the application in a new environment and on new robots.
E.g.)* - **Change the pick position** Use launch option "grasp_xyz" to change the pick position. - **Change the place position** Use launch option "place_xyz" to change the place position. ================================================ FILE: grasp_tutorials/index.rst ================================================ Welcome to ROS2 Grasp Library Tutorials ======================================= ROS2 Grasp Library is a ROS2 intelligent visual grasp solution for advanced industrial usages, with OpenVINO™ grasp detection and MoveIt Grasp Planning. These tutorials aim to help you quickly bring up the solution in a new working environment. The tutorials introduce how to - Install, build, and launch the ROS2 Grasp Planner and Detector - Use launch options to customize in a new workspace - Bring up the intelligent visual grasp solution on a new robot - Do hand-eye calibration for a new camera setup - Launch the example applications Contents: --------- .. toctree:: :maxdepth: 2 doc/overview doc/getting_start doc/grasp_planner doc/robot_interface doc/bringup_robot doc/handeye_calibration doc/random_pick doc/recognize_pick doc/grasp_api doc/template ================================================ FILE: grasp_tutorials/package.xml ================================================ grasp_tutorials 0.5.0 Instructions and demo code for developing intelligent manipulation app with this ROS2 Grasp Library Apache License 2.0 Sharron Liu Yu Yan Sharron Liu Yu Yan ament_cmake rclcpp rclcpp ament_lint_auto ament_lint_common ament_cmake ================================================ FILE: grasp_utils/handeye_dashboard/README.md ================================================ # handeye_dashboard ## 1. Prerequisites * System install * Install [ROS2 Dashing](https://index.ros.org/doc/ros2/Installation/Dashing/Linux-Install-Debians/) * Install [handeye](https://github.com/RoboticsYY/handeye) * Install [criutils](https://github.com/RoboticsYY/criutils) * Install
[baldor](https://github.com/RoboticsYY/baldor) * Install [handeye_tf_service](https://github.com/intel/ros2_grasp_library/tree/master/grasp_utils/handeye_tf_service) * Install [handeye_target_detection](https://github.com/intel/ros2_grasp_library/tree/master/grasp_utils/handeye_target_detection) ## 2. Build and install

```shell
sudo apt install python3-numpy python3-scipy
```

Build with the `ros2_grasp_library` package. For installation instructions, refer to [here](https://github.com/intel/ros2_grasp_library/blob/master/grasp_tutorials/doc/grasp_ros2/tutorials_1_grasp_ros2_with_camera.md). ## 3. Run ### 3.1 Bring up realsense camera

```shell
ros2 run realsense_node realsense_node __params:=`ros2 pkg prefix realsense_examples`/share/realsense_examples/config/d435.yaml
```

> Note: other cameras can be used, just make sure that the image topic and camera info topic are published ### 3.2 Bring up calibration board detection

```shell
ros2 launch handeye_target_detection pose_estimation.launch.py
```

For detailed information, please refer to the package [handeye_target_detection](https://github.com/intel/ros2_grasp_library/tree/master/grasp_utils/handeye_target_detection) If running successfully, the detection result is displayed on the left panel of Rviz2.
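The board pose estimated in this step is essentially a rotation/translation pair, broadly in the style of OpenCV's `solvePnP` (rvec/tvec). As a hedged sketch of how such a pair maps to the 4x4 homogeneous transform that later enters the TF tree (the helper functions below are made up for illustration and assume the axis-angle rvec convention):

```python
import math

def rodrigues(rvec):
    """Axis-angle rotation vector -> 3x3 rotation matrix (pure Python)."""
    theta = math.sqrt(sum(v * v for v in rvec))
    if theta < 1e-12:
        return [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    kx, ky, kz = (v / theta for v in rvec)  # unit rotation axis
    c, s = math.cos(theta), math.sin(theta)
    C = 1.0 - c
    return [
        [c + kx * kx * C,      kx * ky * C - kz * s, kx * kz * C + ky * s],
        [ky * kx * C + kz * s, c + ky * ky * C,      ky * kz * C - kx * s],
        [kz * kx * C - ky * s, kz * ky * C + kx * s, c + kz * kz * C],
    ]

def to_homogeneous(rvec, tvec):
    """Pack (rvec, tvec) into a 4x4 camera-to-board transform."""
    R = rodrigues(rvec)
    return [R[i] + [tvec[i]] for i in range(3)] + [[0.0, 0.0, 0.0, 1.0]]

# Example: board rotated 90 degrees about z, 0.5 m in front of the camera
T = to_homogeneous([0.0, 0.0, math.pi / 2], [0.0, 0.0, 0.5])
```

Each snapshot taken by the dashboard in the next steps is, conceptually, one such 4x4 transform.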
### 3.3 Bring up UR5 robot

```shell
# Terminal 1 (robot frames tf update)
ros2 launch robot_interface ur_test.launch.py move:=false

# Terminal 2 (robot state display in Rviz2)
ros2 launch ur_description view_ur5_ros2.launch.py
```

The realtime robot state is displayed:

> Note: any robot can be used; just ensure that the robot ROS2 driver publishes the joint states and link TFs at a rate of at least 125 Hz.

### 3.4 Bring up calibration dashboard

```shell
ros2 launch handeye_dashboard handeye_dashboard.launch.py
```

If running successfully, a rqt dashboard similar to the photo below should show up:

On the panel of the dashboard, the user can input the names of `Camera-Frame`, `Object-Frame`, `Robot-Base-Frame` and `End-Effector-Frame`. The calibration will look up the TF transforms:

* From `Camera-Frame` to `Object-Frame`
* From `Robot-Base-Frame` to `End-Effector-Frame`

The calibration process is controlled by the four buttons on the left panel of the dashboard:

* Step 0: Select the camera mount type, `attached on robot` or `fixed beside robot`.
* Step 1: Use the first button to take snapshots of the two transforms.
* Step 2: After enough samples are taken, use the second button to save the snapshots and perform the AX=XB calculation.
* Step 3: Use the fourth button to publish the static TF transform between `Camera-Frame` and `Robot-Base-Frame`. Check the TF or point cloud in Rviz2 to make sure the camera pose is published.

> Note: Be careful with the third button; it clears the snapshots and the calculation result.
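The AX=XB relation that Step 2 solves can be illustrated with a small numpy sketch. All transforms below are synthetic stand-ins (not dashboard output), chosen only to show why snapshot pairs of (base→end-effector, camera→object) constrain the unknown hand-eye transform X in the `attached on robot` setup:

```python
import numpy as np

def rot_z(theta):
    """4x4 homogeneous transform: rotation about the Z axis."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def trans(x, y, z):
    """4x4 homogeneous transform: pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Unknown hand-eye transform X (end-effector -> camera): what calibration recovers
X = rot_z(0.3) @ trans(0.05, 0.0, 0.1)

# Fixed board pose in the robot base frame (board does not move between snapshots)
bTo = trans(0.6, 0.1, 0.0) @ rot_z(1.0)

# Two robot poses (base -> end-effector); the camera sees the same board from each
bTe = [rot_z(a) @ trans(0.4, 0.0, 0.3) for a in (0.2, 0.9)]

# What the board detector would report (camera -> object), since bTe_i @ X @ cTo_i = bTo
cTo = [np.linalg.inv(bTe_i @ X) @ bTo for bTe_i in bTe]

# Relative motions between the two snapshots give the classic AX = XB equation
A = np.linalg.inv(bTe[1]) @ bTe[0]
B = cTo[1] @ np.linalg.inv(cTo[0])
print(np.allclose(A @ X, X @ B))  # True
```

With noisy real snapshots, more than two sample pairs are needed; the dashboard delegates the actual solve to the `handeye` package.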
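One pitfall when reusing the result that Step 3 writes out: the dashboard stores the quaternion w-first ([w, x, y, z], the `baldor` convention), while the `ros2 run tf2_ros static_transform_publisher` command line takes qx qy qz qw. A minimal sketch of the reorder (the numeric values are the sample calibration result from `/tmp/camera-robot.txt` in this README, used purely for illustration):

```python
# Quaternion as stored by the dashboard in /tmp/camera-robot.txt: [w, x, y, z]
stored = [0.9997471812284859, 0.01090594636560865,
          -0.009141740972837598, -0.0174086912647742]

w, x, y, z = stored
ros_cli_order = [x, y, z, w]  # qx qy qz qw, as static_transform_publisher expects

translation = [-0.032727495589941216, -0.09304065368400717, 0.0003508296697299189]

# Assemble the full command line: x y z qx qy qz qw parent_frame child_frame
args = [str(v) for v in translation + ros_cli_order] + ["base", "camera_link"]
print("ros2 run tf2_ros static_transform_publisher " + " ".join(args))
```

If the published point cloud looks mirrored or wildly rotated, a swapped quaternion order is the first thing to check.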
### 3.5 Publish calibration result

Check the result at `/tmp/camera-robot.txt`:

```yaml
camera-robot pose:
  Translation: [-0.032727495589941216, -0.09304065368400717, 0.0003508296697299189]
  Rotation: in Quaternion [0.9997471812284859, 0.01090594636560865, -0.009141740972837598, -0.0174086912647742]
```

The result can be published from the command line, without launching the whole GUI:

```shell
# usage: ros2 run tf2_ros static_transform_publisher x y z qx qy qz qw parent_frame child_frame
# NOTE the quaternion stored in "camera_robot.txt" is in [w, x, y, z] order;
# static_transform_publisher expects it reordered to qx qy qz qw
ros2 run tf2_ros static_transform_publisher -0.032727495589941216 -0.09304065368400717 0.0003508296697299189 0.01090594636560865 -0.009141740972837598 -0.0174086912647742 0.9997471812284859 base camera_link
```

## 4. Result

A result with the camera mounted on the robot end-effector looks like this:

A video of the calibration process can be found at: [handeye_calibration_demo](https://videoportal.intel.com/media/Industrial-robot-hand-eye-calibration/0_8ddlp0p1)

###### *Any security issue should be reported using process at https://01.org/security*

================================================ FILE: grasp_utils/handeye_dashboard/config/Default.perspective ================================================ { "keys": {}, "groups": { "mainwindow": { "keys": { "geometry": { "repr(QByteArray.hex)": "QtCore.QByteArray(b'01d9d0cb000200000000044a0000004e0000070b000003620000044a0000006c0000070b0000036200000000000000000780')", "type": "repr(QByteArray.hex)", "pretty-print": " J N b J l b " }, "state": { "repr(QByteArray.hex)":
"QtCore.QByteArray(b'000000ff00000000fd0000000100000003000002c2000002cdfc010000000afb0000006a006300720069006e00730070006500630074005f00640061007300680062006f006100720064005f005f00450064006700650044006500740065006300740069006f006e005f005f0031005f005f004500640067006500200044006500740065006300740069006f006e0100000000000002580000000000000000fb0000003c007200710074005f00700079005f0063006f006e0073006f006c0065005f005f005000790043006f006e0073006f006c0065005f005f0031005f005f0100000000000002580000000000000000fb0000006c007200710074005f007200650063006f006e006600690067007500720065005f005f0050006100720061006d005f005f0031005f005f005f0070006c007500670069006e0063006f006e007400610069006e00650072005f0074006f0070005f00770069006400670065007401000000000000073f0000000000000000fb00000044007200710074005f00670072006100700068005f005f0052006f007300470072006100700068005f005f0031005f005f0052006f007300470072006100700068005500690100000000000007800000000000000000fb0000005a007200710074005f0069006d006100670065005f0076006900650077005f005f0049006d0061006700650056006900650077005f005f0031005f005f0049006d00610067006500560069006500770057006900640067006500740100000000000007800000000000000000fb0000008c00680061006e0064006500790065005f00640061007300680062006f006100720064005f005f00480061006e006400450079006500430061006c006900620072006100740069006f006e005f005f0031005f005f0020004300520049002000470072006f00750070003a00200042006f007200640065007200200044006500740065006300740069006f006e01000000000000073f0000000000000000fb000000a600680061006e0064006500790065005f00640061007300680062006f006100720064005f005f00480061006e006400450079006500430061006c006900620072006100740069006f006e005f005f0031005f005f00200049006e00740065006c0020004f0054004300200052006f0062006f0074006900630073003a002000480061006e0064002d004500790065002000430061006c006900620072006100740069006f006e0100000000000002c20000017200fffffffb00000026007200710074005f007200760069007a005f005f005200560069007a005f005f0031005f005f0100000000000007800000000000000000fb0000004c007
200710074005f00740066005f0074007200650065005f005f0052006f0073005400660054007200650065005f005f0031005f005f0052006f0073005400660054007200650065005500690100000000000007800000000000000000fb00000058007200710074005f007000750062006c00690073006800650072005f005f005000750062006c00690073006800650072005f005f0031005f005f005000750062006c006900730068006500720057006900640067006500740100000000000007800000000000000000000002c20000000000000004000000040000000800000008fc00000001000000030000000100000036004d0069006e0069006d0069007a006500640044006f0063006b00570069006400670065007400730054006f006f006c0062006100720000000000ffffffff0000000000000000')", "type": "repr(QByteArray.hex)", "pretty-print": " jcrinspect_dashboard__EdgeDetection__1__Edge Detection handeye_dashboard 0.1.0 The handeye_dashboard package Yu Yan Apache License 2.0 Yu Yan rclpy rqt_gui rqt_gui_py python_qt_binding tf2 tf2_ros handeye ament_python ================================================ FILE: grasp_utils/handeye_dashboard/plugin.xml ================================================ Dashboard for the Hand-Eye Calibration images/Intel.png Plugins created by Intel OTC-Robotics images/tool-calibration.png Dashboard for the hand-eye calibration ================================================ FILE: grasp_utils/handeye_dashboard/resource/handeye_dashboard ================================================ ================================================ FILE: grasp_utils/handeye_dashboard/setup.py ================================================ from glob import glob from setuptools import setup from setuptools import find_packages package_name = 'handeye_dashboard' setup( name=package_name, version='0.1.0', packages=find_packages('src', exclude=['test']), data_files=[ ('share/ament_index/resource_index/packages', ['resource/' + package_name]), ('share/' + package_name, ['package.xml']), ('share/' + package_name, ['plugin.xml']), ('share/' + package_name + '/images', glob('images/*.png')), ('share/' + package_name + '/data', 
glob('data/*.json')), ('share/' + package_name + '/launch', glob('launch/*.launch.py')), ('share/' + package_name + '/config', glob('config/*.perspective')), ], install_requires=['setuptools'], zip_safe=True, author='Yu Yan', author_email='yu.yan@intel.com', maintainer='Yu Yan', maintainer_email='yu.yan@intel.com', keywords=['ROS'], classifiers=[ 'Intended Audience :: Developers', 'License :: OSI Approved :: BSD 3-Clause License', 'Programming Language :: Python', 'Topic :: Software Development', ], description=( 'The handeye_dashboard package.' ), license='Apache License 2.0', tests_require=['pytest'], package_dir={'':'src'}, entry_points={ 'console_scripts': [ 'handeye_dashboard = handeye_dashboard.main:main', ], }, ) ================================================ FILE: grasp_utils/handeye_dashboard/src/handeye_dashboard/__init__.py ================================================ ================================================ FILE: grasp_utils/handeye_dashboard/src/handeye_dashboard/handeye_calibration.py ================================================ #!/usr/bin/env python import json import rclpy import numpy as np import baldor as br from rclpy.clock import ROSClock from geometry_msgs.msg import TransformStamped from handeye_tf_service.srv import HandeyeTF from ament_index_python.resources import get_resource # Rqt widgets from rqt_gui_py.plugin import Plugin from python_qt_binding import QtCore from python_qt_binding.QtGui import QIcon, QImage, QPixmap, QStandardItem, \ QIntValidator, QStandardItemModel from python_qt_binding.QtWidgets import (QComboBox, QAction, QToolBar, QStatusBar, QLineEdit, QWidget, QVBoxLayout, QLabel, QTextEdit, QFrame, QHBoxLayout, QTreeView) class bcolors: HEADER = '\033[95m' OKBLUE = '\033[94m' OKGREEN = '\033[92m' WARNING = '\033[93m' FAIL = '\033[91m' ENDC = '\033[0m' BOLD = '\033[1m' UNDERLINE = '\033[4m' def save_samples_to_file(samples, file_name='dataset.json', pkg='handeye_dashboard'): """ Saving transform samples to a 
disc file Parameters ------------- samples: list A list of transforms. file_name: string The destination file. Returns -------- Success: bool Execution status """ success = False samples_list = [] for sample in samples: samples_list +=[[sample[0].tolist(), sample[1].tolist()]] _, path_pkg = get_resource('packages', pkg) import json # If the file name exists, write a JSON string into the file. if file_name != None: filename = '/tmp/' + file_name # Writing JSON data with open(filename, 'w') as f: json.dump(samples_list, f) success = True return success class HandEyeCalibration(Plugin): PLUGIN_TITLE = ' Intel OTC Robotics: Hand-Eye Calibration' def __init__(self, context): super(HandEyeCalibration, self).__init__(context) self.context = context self.node = context.node self.widget = QWidget() self.widget.setObjectName(self.PLUGIN_TITLE) self.widget.setWindowTitle(self.PLUGIN_TITLE) # Data self.Tsamples = [] # Toolbar _, path_pkg = get_resource('packages', 'handeye_dashboard') print("{}".format(path_pkg)) self.snapshot_action = QAction(QIcon.fromTheme('camera-photo'), 'Take a snapshot', self.widget) path = path_pkg + '/share/handeye_dashboard/images/capture.png' self.calibrate_action = QAction(QIcon(QPixmap.fromImage(QImage(path))), 'Get the camera/robot transform', self.widget) self.clear_action = QAction(QIcon.fromTheme('edit-clear'), 'Clear the record data.', self.widget) path = path_pkg + '/share/handeye_dashboard/images/UR5.png' self.execut_action = QAction(QIcon(QPixmap.fromImage(QImage(path))), 'EStart the publishing the TF.', self.widget) self.toolbar = QToolBar() self.toolbar.addAction(self.snapshot_action) self.toolbar.addAction(self.calibrate_action) self.toolbar.addAction(self.clear_action) self.toolbar.addAction(self.execut_action) # Toolbar0 self.l0 = QLabel(self.widget) self.l0.setText("Camera-Mount-Type: ") self.l0.setFixedWidth(150) self.l0.setAlignment(QtCore.Qt.AlignRight | QtCore.Qt.AlignVCenter) self.combobox = QComboBox(self.widget) 
self.combobox.addItem('attached on robot') self.combobox.addItem('fixed beside robot') self.toolbar0 = QToolBar() self.toolbar0.addWidget(self.l0) self.toolbar0.addWidget(self.combobox) # Toolbar1 self.l1 = QLabel(self.widget) self.l1.setText("Camera-Frame: ") self.l1.setFixedWidth(150) self.l1.setAlignment(QtCore.Qt.AlignRight | QtCore.Qt.AlignVCenter) self.camera_frame = QLineEdit(self.widget) self.camera_frame.setText("camera_link") self.toolbar1 = QToolBar() self.toolbar1.addWidget(self.l1) self.toolbar1.addWidget(self.camera_frame) # Toolbar2 self.l2 = QLabel(self.widget) self.l2.setText("Object-Frame: ") self.l2.setFixedWidth(150) self.l2.setAlignment(QtCore.Qt.AlignRight | QtCore.Qt.AlignVCenter) self.object_frame = QLineEdit(self.widget) self.object_frame.setText("calib_board") self.toolbar2 = QToolBar() self.toolbar2.addWidget(self.l2) self.toolbar2.addWidget(self.object_frame) # Toolbar3 self.l3 = QLabel(self.widget) self.l3.setText("Robot-Base-Frame: ") self.l3.setFixedWidth(150) self.l3.setAlignment(QtCore.Qt.AlignRight | QtCore.Qt.AlignVCenter) self.base_frame = QLineEdit(self.widget) self.base_frame.setText("base") self.toolbar3 = QToolBar() self.toolbar3.addWidget(self.l3) self.toolbar3.addWidget(self.base_frame) # Toolbar4 self.l4 = QLabel(self.widget) self.l4.setText("End-Effector-Frame: ") self.l4.setFixedWidth(150) self.l4.setAlignment(QtCore.Qt.AlignRight | QtCore.Qt.AlignVCenter) self.endeffector_frame = QLineEdit(self.widget) self.endeffector_frame.setText("tool0") self.toolbar4 = QToolBar() self.toolbar4.addWidget(self.l4) self.toolbar4.addWidget(self.endeffector_frame) # Toolbar5 self.l5 = QLabel(self.widget) self.l5.setText("Sample-Number: ") self.l5.setFixedWidth(150) self.l5.setAlignment(QtCore.Qt.AlignRight | QtCore.Qt.AlignVCenter) self.le5 = QLineEdit(self.widget) self.le5.setValidator(QIntValidator()) self.le5.setText('10') self.le5.setReadOnly(True) self.toolbar5 = QToolBar() self.toolbar5.addWidget(self.l5) 
self.toolbar5.addWidget(self.le5) # TreeView self.treeview = QTreeView() self.treeview.setAlternatingRowColors(True) self.model = QStandardItemModel(self.treeview) self.treeview.setModel(self.model) self.treeview.setHeaderHidden(True) # TextEdit self.textedit = QTextEdit(self.widget) self.textedit.setReadOnly(True) # Layout self.layout = QVBoxLayout() self.layout.addWidget(self.toolbar0) self.layout.addWidget(self.toolbar1) self.layout.addWidget(self.toolbar2) self.layout.addWidget(self.toolbar3) self.layout.addWidget(self.toolbar4) self.layout.addWidget(self.toolbar5) self.layout.addWidget(self.toolbar) self.layoutH = QHBoxLayout() self.layoutH.addWidget(self.treeview) self.layoutH.addWidget(self.textedit) self.layout.addLayout(self.layoutH) self.widget.setLayout(self.layout) # Add the widget to the user interface if context.serial_number() > 1: self.widget.setWindowTitle(self.widget.windowTitle() + (' (%d)' % context.serial_number())) context.add_widget(self.widget) # Make the connections self.snapshot_action.triggered.connect(self.take_snapshot) self.calibrate_action.triggered.connect(self.calibration) self.clear_action.triggered.connect(self.clear) self.execut_action.triggered.connect(self.execution) # Package path self.path_pkg = path_pkg # Set up TF self.cli = self.node.create_client(HandeyeTF, 'handeye_tf_service') while not self.cli.wait_for_service(timeout_sec=1.0): self.node.get_logger().info('service not available, waiting again...') self.req = HandeyeTF.Request() def clear(self): # >>> Clear the recorded samples self.textedit.append('Clearing the recorded data ...') self.textedit.clear() self.Tsamples = [] self.model.clear() def get_tf_transform(self, frame_id, child_frame_id): self.req.transform.header.frame_id = frame_id self.req.transform.child_frame_id = child_frame_id self.req.publish.data = False future = self.cli.call_async(self.req) rclpy.spin_until_future_complete(self.node, future) transform = TransformStamped() try: result = future.result() 
except Exception as e: self.node.get_logger().info('Service call failed %r' % (e,)) else: transform = result.tf_lookup_result return transform def publish_tf_transform(self, transform_to_publish): self.req.publish.data = True self.req.transform = transform_to_publish future = self.cli.call_async(self.req) rclpy.spin_until_future_complete(self.node, future) try: future.result() except Exception as e: self.node.get_logger().info('Service call failed %r' % (e,)) else: self.node.get_logger().info('Send the camera-robot transform :\n\tfrom `{}` to `{}`.'. format(self.req.transform.header.frame_id, self.req.transform.child_frame_id)) def take_snapshot(self): # >>> Take the snapshot self.textedit.append('Taking snapshot ...') # Get the transform from `tool0` to `base_link` T = self.get_tf_transform(self.base_frame.text(), self.endeffector_frame.text()) bTe = np.zeros((4,4)) q = [T.transform.rotation.w, T.transform.rotation.x, T.transform.rotation.y, T.transform.rotation.z] bTe = br.quaternion.to_transform(q) bTe[:3, 3] = np.array([T.transform.translation.x, T.transform.translation.y, T.transform.translation.z]) self.textedit.append('Lookup transform: from `{}` to `{}`.'. format(self.base_frame.text(), self.endeffector_frame.text())) self.node.get_logger().info(bcolors.OKGREEN + 'bTe:' + bcolors.ENDC + '\n{}'.format(bTe)) # Get the transform from `calib_board` to `camera_link` T = self.get_tf_transform(self.camera_frame.text(), self.object_frame.text()) cTo = np.zeros((4,4)) q = [T.transform.rotation.w, T.transform.rotation.x, T.transform.rotation.y, T.transform.rotation.z] cTo = br.quaternion.to_transform(q) cTo[:3, 3] = np.array([T.transform.translation.x, T.transform.translation.y, T.transform.translation.z]) self.textedit.append('Lookup transform: from `{}` to `{}`.'. 
format(self.camera_frame.text(), self.object_frame.text())) self.node.get_logger().info(bcolors.OKGREEN + 'cTo:' + bcolors.ENDC + '\n{}'.format(cTo)) parent = QStandardItem('Snapshot {}'.format(len(self.Tsamples))) child_1 = QStandardItem('bTe:\n{}\n{}\n{}\n{}'.format(bTe[0, :], bTe[1, :], bTe[2, :], bTe[3, :])) child_2 = QStandardItem('cTo:\n{}\n{}\n{}\n{}'.format(cTo[0, :], cTo[1, :], cTo[2, :], cTo[3, :])) parent.appendRow(child_1) parent.appendRow(child_2) self.model.appendRow(parent) self.Tsamples.append((bTe, cTo)) self.le5.setText(str(len(self.Tsamples))) def calibration(self): # >>> Compute the calibration self.textedit.append('Making the calibration ...') if len(self.Tsamples) == 0: self.textedit.append('No transform recorded, please take snapshots.') return # save samples to `dataset.json` file save_samples_to_file(self.Tsamples) import handeye if self.combobox.currentIndex() == 0: solver_cri = handeye.calibrator.HandEyeCalibrator(setup='Moving') if self.combobox.currentIndex() == 1: solver_cri = handeye.calibrator.HandEyeCalibrator(setup='Fixed') for sample in self.Tsamples: solver_cri.add_sample(sample[0], sample[1]) try: bTc = solver_cri.solve(method=handeye.solver.Daniilidis1999) # save the calibration result to 'camera-robot.json' file file_output = '/tmp/' + 'camera-robot.json' with open(file_output, 'w') as f: json.dump(bTc.tolist(), f) except Exception: self.textedit.append("Failed to solve the hand-eye calibration.") def execution(self): # >>> Publish the camera-robot transform self.textedit.append('Publishing the camera TF ...') file_input = '/tmp/' + 'camera-robot.json' with open(file_input, 'r') as f: datastore = json.load(f) to_frame = self.camera_frame.text() if self.combobox.currentIndex() == 0: from_frame = self.endeffector_frame.text() if self.combobox.currentIndex() == 1: from_frame = self.base_frame.text() bTc = np.array(datastore) static_transformStamped = TransformStamped() static_transformStamped.header.stamp = 
ROSClock().now().to_msg() static_transformStamped.header.frame_id = from_frame static_transformStamped.child_frame_id = to_frame static_transformStamped.transform.translation.x = bTc[0,3] static_transformStamped.transform.translation.y = bTc[1,3] static_transformStamped.transform.translation.z = bTc[2,3] q = br.transform.to_quaternion(bTc) static_transformStamped.transform.rotation.x = q[1] static_transformStamped.transform.rotation.y = q[2] static_transformStamped.transform.rotation.z = q[3] static_transformStamped.transform.rotation.w = q[0] self.publish_tf_transform(static_transformStamped) output_string = "camera-robot pose:\n" output_string += " Translation: [{}, {}, {}]\n".format(bTc[0,3], bTc[1,3], bTc[2,3]) output_string += " Rotation: in Quaternion [{}, {}, {}, {}]".format(q[0], q[1], q[2], q[3]) file_path = '/tmp/' + 'camera-robot.txt' with open(file_path, 'w') as f: f.write(output_string) def shutdown_plugin(self): """ Unregister subscribers when the plugin shutdown """ pass def save_settings(self, plugin_settings, instance_settings): # Nothing to be done here pass def restore_settings(self, plugin_settings, instance_settings): # Nothing to be done here pass ================================================ FILE: grasp_utils/handeye_dashboard/src/handeye_dashboard/main.py ================================================ #!/usr/bin/env python3 import sys from rqt_gui.main import Main def main(): sys.exit(Main().main(sys.argv, standalone='handeye_dashboard.plugin.HandEyeCalibration')) if __name__ == '__main__': main() ================================================ FILE: grasp_utils/handeye_dashboard/src/handeye_dashboard/plugin.py ================================================ #!/usr/bin/env python from .handeye_calibration import HandEyeCalibration ================================================ FILE: grasp_utils/handeye_target_detection/.clang-format ================================================ --- BasedOnStyle: Google AccessModifierOffset: -2 
ConstructorInitializerIndentWidth: 2 AlignEscapedNewlinesLeft: false AlignTrailingComments: true AllowAllParametersOfDeclarationOnNextLine: false AllowShortIfStatementsOnASingleLine: false AllowShortLoopsOnASingleLine: false AllowShortFunctionsOnASingleLine: None AllowShortLoopsOnASingleLine: false AlwaysBreakTemplateDeclarations: true AlwaysBreakBeforeMultilineStrings: false BreakBeforeBinaryOperators: false BreakBeforeTernaryOperators: false BreakConstructorInitializersBeforeComma: true BinPackParameters: true ColumnLimit: 120 ConstructorInitializerAllOnOneLineOrOnePerLine: true DerivePointerBinding: false PointerBindsToType: true ExperimentalAutoDetectBinPacking: false IndentCaseLabels: true MaxEmptyLinesToKeep: 1 NamespaceIndentation: None ObjCSpaceBeforeProtocolList: true PenaltyBreakBeforeFirstCallParameter: 19 PenaltyBreakComment: 60 PenaltyBreakString: 100 PenaltyBreakFirstLessLess: 1000 PenaltyExcessCharacter: 1000 PenaltyReturnTypeOnItsOwnLine: 70 SpacesBeforeTrailingComments: 2 Cpp11BracedListStyle: false Standard: Auto IndentWidth: 2 TabWidth: 2 UseTab: Never IndentFunctionDeclarationAfterType: false SpacesInParentheses: false SpacesInAngles: false SpaceInEmptyParentheses: false SpacesInCStyleCastParentheses: false SpaceAfterControlStatementKeyword: true SpaceBeforeAssignmentOperators: true ContinuationIndentWidth: 4 SortIncludes: false SpaceAfterCStyleCast: false # Configure each individual brace in BraceWrapping BreakBeforeBraces: Custom # Control of individual brace wrapping cases BraceWrapping: AfterClass: 'true' AfterControlStatement: 'true' AfterEnum : 'true' AfterFunction : 'true' AfterNamespace : 'true' AfterStruct : 'true' AfterUnion : 'true' BeforeCatch : 'true' BeforeElse : 'true' IndentBraces : 'false' ... 
================================================ FILE: grasp_utils/handeye_target_detection/CMakeLists.txt ================================================ # Copyright (c) 2019 Intel Corporation # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. cmake_minimum_required(VERSION 3.5) project(handeye_target_detection) # Default to C99 if(NOT CMAKE_C_STANDARD) set(CMAKE_C_STANDARD 99) endif() # Default to C++14 if(NOT CMAKE_CXX_STANDARD) set(CMAKE_CXX_STANDARD 14) endif() if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang") add_compile_options(-Wall -Wextra -Wpedantic) endif() set (WITH_OPENCL OFF) # find dependencies find_package(ament_cmake REQUIRED) find_package(tf2 REQUIRED) find_package(tf2_msgs REQUIRED) find_package(tf2_ros REQUIRED) find_package(std_msgs REQUIRED) find_package(cv_bridge REQUIRED) find_package(sensor_msgs REQUIRED) find_package(image_transport REQUIRED) find_package(OpenCV REQUIRED) if((${OpenCV_VERSION} LESS 3.3)) message(WARNING "handeye_target_detection works better with OpenCV version >= 3.3") endif() # Set include directory path include_directories( include ${rclcpp_INCLUDE_DIRS} ${Boost_INCLUDE_DIRS} ) set(SOURCES_POSEESTIMATION src/pose_estimation_node.cpp src/pose_estimator.cpp ) add_executable(pose_estimation ${SOURCES_POSEESTIMATION}) ament_target_dependencies(pose_estimation rclcpp tf2 tf2_msgs tf2_ros std_msgs sensor_msgs cv_bridge image_transport) target_link_libraries(pose_estimation ${rclcpp_LIBRARIES} ${OpenCV_LIBS} ) # Install 
target files install( TARGETS pose_estimation DESTINATION lib/${PROJECT_NAME}/ ) # Install header files install( DIRECTORY include/ DESTINATION include ) # Install launch files. install(DIRECTORY launch cfg data DESTINATION share/${PROJECT_NAME}/ ) if(BUILD_TESTING) find_package(ament_lint_auto REQUIRED) ament_lint_auto_find_test_dependencies() endif() ament_package()

================================================ FILE: grasp_utils/handeye_target_detection/README.md ================================================

# handeye_target_detection

## 1. Introduction

This package estimates the pose of calibration patterns. Currently four kinds of OpenCV calibration patterns are supported: CHESSBOARD, ASYMMETRIC_CIRCLES_GRID, CHARUCO, ARUCO.

## 2. Prerequisite

* Download and print the calibration pattern on an A4 paper without border shrink:
  * [Chessboard](./data/pattern/chessboard_9X6.png)
    * Width * Height: 9X6
    * Square size: 0.026
  * [Asymmetric circles grid 1](./data/pattern/asymmetric_circles_grid_4X11.png)
    * Width * Height: 4X11
    * Circles separation: 0.035
  * [Asymmetric circles grid 2](./data/pattern/asymmetric_circles_grid_3X5.png)
    * Width * Height: 3X5
    * Circles separation: 0.035
  * [Aruco board 1](./data/pattern/aruco_5X7_DICT_6X6_250.png)
    * Width * Height: 5X7
    * Dictionary: 6X6_250
    * Marker size: 0.035
    * Marker separation: 0.007
  * [Aruco board 2](./data/pattern/aruco_3X4_DICT_4X4_50.png)
    * Width * Height: 3X4
    * Dictionary: 4X4_50
    * Marker size: 0.0256
    * Marker separation: 0.0066
  * [Charuco board](./data/pattern/charuco_5X7_DICT_6X6_250.jpg)
    * Width * Height: 5X7
    * Dictionary: 6X6_250
    * Square size: 0.035
    * Marker size: 0.022
* RGB camera:
  * Video/Image file
  * Intel® RealSense™ (tested with D435)
  * Standard USB camera
* ROS Dashing (Ubuntu 18.04, 64 bits)

## 3. Environment Setup

* Install [ROS Dashing](https://index.ros.org/doc/ros2/Installation/Dashing/Linux-Install-Debians/)
* Install [Intel® RealSense™ SDK 2.0](https://github.com/IntelRealSense/librealsense)
* Install [Intel® RealSense™ ROS2 Wrapper](https://github.com/intel/ros2_intel_realsense)

## 4. Build and install

* Install dependencies

```shell
sudo apt install ros-dashing-cv-bridge \
  ros-dashing-image-transport
```

* Build with ros2_grasp_library

## 5. Run

Before running the code, a camera should be launched, and the RGB image and camera info topics must be published. If a RealSense D435 camera is used, run the following command to bring up the camera:

```shell
ros2 run realsense_node realsense_node __params:=`ros2 pkg prefix realsense_examples`/share/realsense_examples/config/d435.yaml
```

> Note: other cameras can be used, as long as they publish an RGB image topic and a camera_info topic.

Run the following command to bring up the pose estimation of a calibration pattern:

```shell
ros2 launch handeye_target_detection pose_estimation.launch.py
```

The launch file loads parameters from the `./launch/pose_estimation.yaml` file. For the meaning of these parameters, refer to the table below:

```shell
Launch options:
--pattern (string, default: ARUCO)
    The pattern of the calibration plate; it should be one of
    {CHESSBOARD, ASYMMETRIC_CIRCLES_GRID, CHARUCO, ARUCO}
--image_topic (string, default: /camera/color/image_raw)
    The RGB image topic, to which the pose estimation node subscribes
--camera_info_topic (string, default: /camera/color/camera_info)
    The camera info topic, to which the pose estimation node subscribes
--publish_image_topic (string, default: /image/detected)
    The image topic published by the pose estimation node
--width (int, default: 3)
    Usually the number of squares or markers along the X direction
--height (int, default: 4)
    Usually the number of squares or markers along the Y direction
--dictionary (string, default: DICT_6X6_250)
    If the ARUCO or CHARUCO pattern is used, this parameter indicates which marker
    dictionary the board markers belong to. For more information, refer to
    https://docs.opencv.org/3.4.0/d5/dae/tutorial_aruco_detection.html
--chessboard_square_size (double, default: 0.026)
    If a CHESSBOARD is used, this indicates the square length
--circle_grid_seperation (double, default: 0.035)
    If an ASYMMETRIC_CIRCLES_GRID is used, this indicates the separation distance between circles
--aruco_board_marker_size (double, default: 0.035)
    The size of an aruco marker
--aruco_board_marker_seperation (double, default: 0.007)
    The separation distance between aruco markers
--charuco_board_marker_size (double, default: 0.022)
    The length of a charuco marker
--charuco_board_square_size (double, default: 0.037)
    The length of a charuco square
```

To run properly, the user has to customize these parameters.
For example, to do the pose estimation of the calibration patterns listed in the `Prerequisite` section, the parameters should be:

* Chessboard

```yml
pose_estimation:
  ros__parameters:
    pattern: "CHESSBOARD"
    image_topic: "/camera/color/image_raw"
    camera_info_topic: "/camera/color/camera_info"
    publish_image_topic: "/image/detected"
    width: 9
    height: 6
    chessboard_square_size: 0.026
```

* Asymmetric circles grid

```yml
# 4X11 0.035
pose_estimation:
  ros__parameters:
    pattern: "ASYMMETRIC_CIRCLES_GRID"
    image_topic: "/camera/color/image_raw"
    camera_info_topic: "/camera/color/camera_info"
    publish_image_topic: "/image/detected"
    width: 4
    height: 11
    circle_grid_seperation: 0.035
```

```yml
# 3X5 0.035
pose_estimation:
  ros__parameters:
    pattern: "ASYMMETRIC_CIRCLES_GRID"
    image_topic: "/camera/color/image_raw"
    camera_info_topic: "/camera/color/camera_info"
    publish_image_topic: "/image/detected"
    width: 3
    height: 5
    circle_grid_seperation: 0.035
```

* Aruco board

```yml
# 5x7 DICT_6X6_250 0.035 0.007
pose_estimation:
  ros__parameters:
    pattern: "ARUCO"
    image_topic: "/camera/color/image_raw"
    camera_info_topic: "/camera/color/camera_info"
    publish_image_topic: "/image/detected"
    width: 5
    height: 7
    dictionary: "DICT_6X6_250"
    aruco_board_marker_size: 0.035
    aruco_board_marker_seperation: 0.007
```

```yml
# 3x4 DICT_4X4_50 0.0256 0.0066
pose_estimation:
  ros__parameters:
    pattern: "ARUCO"
    image_topic: "/camera/color/image_raw"
    camera_info_topic: "/camera/color/camera_info"
    publish_image_topic: "/image/detected"
    width: 3
    height: 4
    dictionary: "DICT_4X4_50"
    aruco_board_marker_size: 0.0256
    aruco_board_marker_seperation: 0.0066
```

* Charuco board

```yml
pose_estimation:
  ros__parameters:
    pattern: "CHARUCO"
    image_topic: "/camera/color/image_raw"
    camera_info_topic: "/camera/color/camera_info"
    publish_image_topic: "/image/detected"
    width: 5
    height: 7
    dictionary: "DICT_6X6_250"
    charuco_board_marker_size: 0.022
    charuco_board_square_size: 0.037
```

## 6. Results

If the detection works well, the (red, green, blue) arrows indicating the coordinate system origin at the corner of the calibration board should show up in the picture. For the patterns listed above, it should look like:

CHESSBOARD|ASYMMETRIC CIRCLES GRID 4X11|ASYMMETRIC CIRCLES GRID 3X5|CHARUCO|ARUCO 5X7|ARUCO 3X4
----------|----------------------------|----|-------|-----|----
![CHESSBOARD][image1]|![ASYMMETRIC CIRCLES GRID][image2_1]|![ASYMMETRIC CIRCLES GRID][image2_2]|![CHARUCO][image3]|![ARUCO][image4_1]|![ARUCO][image4_2]

[image1]:data/detected/chessboard/chessboard.png
[image2_1]:data/detected/circlegrid/4X11_circles_grid.png
[image2_2]:data/detected/circlegrid/3X5_circles_grid.png
[image3]:data/detected/charuco/charuco.png
[image4_1]:data/detected/aruco/5X7_aruco.png
[image4_2]:data/detected/aruco/3X4_aruco.png

###### *Any security issue should be reported using process at https://01.org/security*

================================================ FILE: grasp_utils/handeye_target_detection/include/PoseEstimator.h ================================================

/** Copyright (c) 2019 Intel Corporation
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
*/ #ifndef POSEESTIMATOR_H #define POSEESTIMATOR_H // ROS include #include #include #include #include #include #include #include #include #include #include // OpenCV include #include #include #include #include #include // System include #include #include namespace tf2 { void transformTF2ToMsg(const tf2::Transform& tf2, geometry_msgs::msg::TransformStamped& msg, builtin_interfaces::msg::Time stamp, const std::string& frame_id, const std::string& child_frame_id); } class PoseEstimator { public: PoseEstimator(std::shared_ptr& node, std::string pattern, std::string image_topic, std::string camera_info_topic, std::string publish_image_topic, int width, int height, std::string dictionary, double chessboard_square_size, double circle_grid_seperation, double aruco_board_marker_size, double aruco_board_marker_seperation, double charuco_board_marker_size, double charuco_board_square_size); ~PoseEstimator() { } void imageCB_CHESSBOARD(const sensor_msgs::msg::Image::ConstSharedPtr& msg); void imageCB_ASYMMETRIC_CIRCLES_GRID(const sensor_msgs::msg::Image::ConstSharedPtr& msg); void imageCB_ARUCO(const sensor_msgs::msg::Image::ConstSharedPtr& msg); void imageCB_CHARUCO(const sensor_msgs::msg::Image::ConstSharedPtr& msg); void caminfoCB(const sensor_msgs::msg::CameraInfo::SharedPtr msg); void draw(cv::Mat img, std::vector corners, cv::Mat imgpts); void rotationVectorToTF2Quaternion(tf2::Quaternion&, cv::Vec3d&); private: // ROS variables std::shared_ptr node_; image_transport::ImageTransport it_; image_transport::Subscriber image_sub_; image_transport::Publisher image_pub_; rclcpp::Subscription::SharedPtr camerainfo_sub_; tf2_ros::TransformBroadcaster broadcaster_; // Native variables cv::Mat camera_matrix_; cv::Mat dist_coeffs_; bool run_; std::string image_topic_; std::string camera_info_topic_; std::string publish_image_topic_; int width_; int height_; double chessboard_square_size_; double circle_grid_seperation_; double aruco_board_marker_size_; double 
aruco_board_marker_seperation_; double charuco_board_marker_size_; double charuco_board_square_size_; enum Patterns { NOT_EXISTING, CHESSBOARD, ASYMMETRIC_CIRCLES_GRID, CHARUCO, ARUCO }; Patterns calibration_pattern_; std::map pattern_map_; cv::aruco::PREDEFINED_DICTIONARY_NAME dictionary_; std::map disctionary_map_; std::string path_; }; #endif ================================================ FILE: grasp_utils/handeye_target_detection/launch/pose_estimation.launch.py ================================================ # Copyright (c) 2019 Intel Corporation. All Rights Reserved # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
import os import launch import launch.actions import launch.substitutions import launch_ros.actions from ament_index_python.packages import get_package_share_directory def generate_launch_description(): # .yaml file for configuring the parameters yaml = os.path.join( get_package_share_directory('handeye_target_detection'), 'launch', 'pose_estimation.yaml' ) rviz = os.path.join( get_package_share_directory('handeye_target_detection'), 'cfg', 'handeye.rviz' ) return launch.LaunchDescription([ launch_ros.actions.Node( package='handeye_target_detection', node_executable='pose_estimation', output='screen', arguments=['__params:='+yaml]), launch_ros.actions.Node( package='rviz2', node_executable='rviz2', output='screen', arguments=['-d', rviz]), ]) ================================================ FILE: grasp_utils/handeye_target_detection/launch/pose_estimation.yaml ================================================ pose_estimation: ros__parameters: pattern: "ARUCO" image_topic: "/camera/color/image_raw" camera_info_topic: "/camera/color/camera_info" publish_image_topic: "/image/detected" width: 3 height: 4 dictionary: "DICT_4X4_50" chessboard_square_size: 0.026 circle_grid_seperation: 0.035 aruco_board_marker_size: 0.0256 aruco_board_marker_seperation: 0.0066 charuco_board_marker_size: 0.022 charuco_board_square_size: 0.03 ================================================ FILE: grasp_utils/handeye_target_detection/package.xml ================================================ handeye_target_detection 0.1.0 Recognize the calibration pattern pose with RGBD camera. 
yanyu yanyu Apache License 2.0 ament_cmake rclcpp std_msgs sensor_msgs cv_bridge image_transport rospy tf rclcpp ament_cmake_clang_format ament_cmake ================================================ FILE: grasp_utils/handeye_target_detection/src/pose_estimation_node.cpp ================================================ /** Copyright (c) 2019 Intel Corporation * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. */ #include #include #include "PoseEstimator.h" using namespace std::chrono_literals; int main(int argc, char** argv) { // Start the ros node rclcpp::init(argc, argv); auto node = std::make_shared( "pose_estimation", rclcpp::NodeOptions().allow_undeclared_parameters(true).automatically_declare_parameters_from_overrides(true)); // Initialize parameter client auto parameters_client = std::make_shared(node); while (!parameters_client->wait_for_service(1s)) { if (!rclcpp::ok()) { RCLCPP_ERROR(node->get_logger(), "Interrupted while waiting for the service. 
Exiting."); rclcpp::shutdown(); } RCLCPP_INFO(node->get_logger(), "service not available, waiting again..."); } // Get parameters std::string pattern = parameters_client->get_parameter("pattern", "ARUCO"); std::string image_topic = parameters_client->get_parameter("image_topic", "/camera/color/image_raw"); std::string camera_info_topic = parameters_client->get_parameter("camera_info_topic", "/camera/color/camera_info"); std::string publish_image_topic = parameters_client->get_parameter("publish_image_topic", "/image/detected"); int width = parameters_client->get_parameter("width", 5); int height = parameters_client->get_parameter("height", 7); std::string dictionary = parameters_client->get_parameter("dictionary", "DICT_4X4_50"); double chessboard_square_size = parameters_client->get_parameter("chessboard_square_size", 0.026); double circle_grid_seperation = parameters_client->get_parameter("circle_grid_seperation", 0.035); double aruco_board_marker_size = parameters_client->get_parameter("aruco_board_marker_size", 0.035); double aruco_board_marker_seperation = parameters_client->get_parameter("aruco_board_marker_seperation", 0.007); double charuco_board_marker_size = parameters_client->get_parameter("charuco_board_marker_size", 0.022); double charuco_board_square_size = parameters_client->get_parameter("charuco_board_square_size", 0.037); PoseEstimator pe(node, pattern, image_topic, camera_info_topic, publish_image_topic, width, height, dictionary, chessboard_square_size, circle_grid_seperation, aruco_board_marker_size, aruco_board_marker_seperation, charuco_board_marker_size, charuco_board_square_size); rclcpp::spin(node); RCLCPP_INFO(node->get_logger(), "Node calibration_pattern_pose_estimation exited."); return 0; } ================================================ FILE: grasp_utils/handeye_target_detection/src/pose_estimator.cpp ================================================ /** Copyright (c) 2019 Intel Corporation * * Licensed under the Apache License, 
Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. */ #include "PoseEstimator.h" using std::placeholders::_1; PoseEstimator::PoseEstimator(std::shared_ptr& node, std::string pattern, std::string image_topic, std::string camera_info_topic, std::string publish_image_topic, int width, int height, std::string dictionary, double chessboard_square_size, double circle_grid_seperation, double aruco_board_marker_size, double aruco_board_marker_seperation, double charuco_board_marker_size, double charuco_board_square_size) : node_(node) , it_(node_) , broadcaster_(node_) , run_(false) , width_(width) , height_(height) , chessboard_square_size_(chessboard_square_size) , circle_grid_seperation_(circle_grid_seperation) , aruco_board_marker_size_(aruco_board_marker_size) , aruco_board_marker_seperation_(aruco_board_marker_seperation) , charuco_board_marker_size_(charuco_board_marker_size) , charuco_board_square_size_(charuco_board_square_size) { // Parse dictionary parameter disctionary_map_["DICT_4X4_50"] = cv::aruco::DICT_4X4_50; disctionary_map_["DICT_4X4_100"] = cv::aruco::DICT_4X4_100; disctionary_map_["DICT_4X4_250"] = cv::aruco::DICT_4X4_250; disctionary_map_["DICT_4X4_1000"] = cv::aruco::DICT_4X4_1000; disctionary_map_["DICT_5X5_50"] = cv::aruco::DICT_5X5_50; disctionary_map_["DICT_5X5_100"] = cv::aruco::DICT_5X5_100; disctionary_map_["DICT_5X5_250"] = cv::aruco::DICT_5X5_250; disctionary_map_["DICT_5X5_1000"] = cv::aruco::DICT_5X5_1000; disctionary_map_["DICT_6X6_50"] = cv::aruco::DICT_6X6_50; disctionary_map_["DICT_6X6_100"] 
= cv::aruco::DICT_6X6_100; disctionary_map_["DICT_6X6_250"] = cv::aruco::DICT_6X6_250; disctionary_map_["DICT_6X6_1000"] = cv::aruco::DICT_6X6_1000; disctionary_map_["DICT_7X7_50"] = cv::aruco::DICT_7X7_50; disctionary_map_["DICT_7X7_100"] = cv::aruco::DICT_7X7_100; disctionary_map_["DICT_7X7_250"] = cv::aruco::DICT_7X7_250; disctionary_map_["DICT_7X7_1000"] = cv::aruco::DICT_7X7_1000; std::map::iterator it = disctionary_map_.find(dictionary); if (it != disctionary_map_.end()) dictionary_ = disctionary_map_[dictionary]; else { RCLCPP_ERROR(node_->get_logger(), "Invalid dictionary input: %s, default dictionary DICT_6X6_250 used.", dictionary.c_str()); dictionary_ = cv::aruco::DICT_6X6_250; } // Initialize subscribers and publishers image_pub_ = it_.advertise(publish_image_topic, 1); camerainfo_sub_ = node_->create_subscription( camera_info_topic, 1, std::bind(&PoseEstimator::caminfoCB, this, _1)); calibration_pattern_ = NOT_EXISTING; pattern_map_ = { { "CHESSBOARD", CHESSBOARD }, { "ASYMMETRIC_CIRCLES_GRID", ASYMMETRIC_CIRCLES_GRID }, { "ARUCO", ARUCO }, { "CHARUCO", CHARUCO } }; std::map::iterator it_pattern = pattern_map_.find(pattern); if (it_pattern != pattern_map_.end()) calibration_pattern_ = pattern_map_[pattern]; else { RCLCPP_ERROR(node_->get_logger(), "Invalid pattern input: %s.", pattern.c_str()); calibration_pattern_ = NOT_EXISTING; } switch (calibration_pattern_) { case CHESSBOARD: image_sub_ = it_.subscribe(image_topic, 1, &PoseEstimator::imageCB_CHESSBOARD, this); break; case ASYMMETRIC_CIRCLES_GRID: image_sub_ = it_.subscribe(image_topic, 1, &PoseEstimator::imageCB_ASYMMETRIC_CIRCLES_GRID, this); break; case ARUCO: image_sub_ = it_.subscribe(image_topic, 1, &PoseEstimator::imageCB_ARUCO, this); break; case CHARUCO: image_sub_ = it_.subscribe(image_topic, 1, &PoseEstimator::imageCB_CHARUCO, this); break; default: break; } // Initialize camera intrinsic parameters camera_matrix_ = cv::Mat::eye(3, 3, CV_64F); dist_coeffs_ = cv::Mat::zeros(5, 1, CV_64F); } void
PoseEstimator::imageCB_CHESSBOARD(const sensor_msgs::msg::Image::ConstSharedPtr& msg) { if (run_) { if (!msg) { RCLCPP_ERROR(node_->get_logger(), "The pointer to image message is NULL."); return; } try { cv_bridge::CvImagePtr cv_ptr = cv_bridge::toCvCopy(msg, sensor_msgs::image_encodings::MONO8); // Find the chessboard pattern std::vector pointBuf; // corners cv::Size patternsize; patternsize.width = width_; patternsize.height = height_; int chessBoardFlags; chessBoardFlags = cv::CALIB_CB_ADAPTIVE_THRESH | cv::CALIB_CB_NORMALIZE_IMAGE; bool found = cv::findChessboardCorners(cv_ptr->image, patternsize, pointBuf, chessBoardFlags); if (found) { // Correct corner points cv::cornerSubPix(cv_ptr->image, pointBuf, cv::Size(11, 11), cv::Size(-1, -1), cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 30, 0.1)); // Find the parameters of transform between the calibration plate and // camera plane std::vector sizeOjbPnts = { static_cast(pointBuf.size()), 3 }; cv::Mat objectPoints(sizeOjbPnts, CV_64F); for (int i = 0; i < sizeOjbPnts[0]; i++) { objectPoints.at(i, 0) = i % patternsize.width * chessboard_square_size_; objectPoints.at(i, 1) = i / patternsize.width * chessboard_square_size_; objectPoints.at(i, 2) = 0; } cv::Vec3d tvect, rvect; bool solved = cv::solvePnPRansac(objectPoints, pointBuf, camera_matrix_, dist_coeffs_, rvect, tvect); if (solved) { tf2::Quaternion q; rotationVectorToTF2Quaternion(q, rvect); geometry_msgs::msg::TransformStamped transform_stamped; tf2::transformTF2ToMsg(tf2::Transform(q, tf2::Vector3(tvect[0], tvect[1], tvect[2])), transform_stamped, node_->now(), msg->header.frame_id, "calib_board"); broadcaster_.sendTransform(transform_stamped); } // Project the axis points cv::Mat axis = cv::Mat::zeros(3, 3, CV_64F); axis.at(0, 0) = 3 * chessboard_square_size_; axis.at(1, 1) = 3 * chessboard_square_size_; axis.at(2, 2) = -3 * chessboard_square_size_; cv::Mat imageAxisPoints; cv::projectPoints(axis, rvect, tvect, camera_matrix_, 
dist_coeffs_, imageAxisPoints); // Draw axis to image cv_bridge::CvImage cv_image_color(msg->header, sensor_msgs::image_encodings::RGB8); cv::cvtColor(cv_ptr->image, cv_image_color.image, cv::COLOR_GRAY2RGB); cv::drawChessboardCorners(cv_image_color.image, patternsize, cv::Mat(pointBuf), found); draw(cv_image_color.image, pointBuf, imageAxisPoints); // Output stream image_pub_.publish(cv_image_color.toImageMsg()); } else // Output stream image_pub_.publish(cv_ptr->toImageMsg()); } catch (cv_bridge::Exception& e) { RCLCPP_ERROR(node_->get_logger(), "cv_bridge exeption: %s", e.what()); } } } void PoseEstimator::imageCB_ASYMMETRIC_CIRCLES_GRID(const sensor_msgs::msg::Image::ConstSharedPtr& msg) { if (run_) { if (!msg) { RCLCPP_INFO(node_->get_logger(), "The pointer to image message is NULL."); return; } try { cv_bridge::CvImagePtr cv_ptr = cv_bridge::toCvCopy(msg, sensor_msgs::image_encodings::MONO8); // Find the circlesgrid pattern std::vector pointBuf; // corners cv::Size patternsize; patternsize.width = width_; patternsize.height = height_; int chessBoardFlags; chessBoardFlags = cv::CALIB_CB_ASYMMETRIC_GRID; bool found = cv::findCirclesGrid(cv_ptr->image, patternsize, pointBuf, chessBoardFlags); if (found) { // Correct corner points std::vector corners2; patternsize.height = (patternsize.height + 1) / 2; for (int i = 0; i < patternsize.height; i++) { for (int j = 0; j < patternsize.width; j++) corners2.push_back(pointBuf[i * patternsize.width * 2 + j]); } pointBuf.clear(); for (size_t i = 0; i < corners2.size(); i++) pointBuf.push_back(corners2[i]); // Find the parameters of transform between the calibration plate and // camera plane std::vector sizeOjbPnts = { static_cast(pointBuf.size()), 3 }; cv::Mat objectPoints(sizeOjbPnts, CV_64F); for (int i = 0; i < sizeOjbPnts[0]; i++) { objectPoints.at(i, 0) = i % patternsize.width * circle_grid_seperation_; objectPoints.at(i, 1) = i / patternsize.width * circle_grid_seperation_; objectPoints.at(i, 2) = 0; } cv::Vec3d 
tvect, rvect; bool solved = cv::solvePnPRansac(objectPoints, pointBuf, camera_matrix_, dist_coeffs_, rvect, tvect); if (solved) { tf2::Quaternion q; rotationVectorToTF2Quaternion(q, rvect); geometry_msgs::msg::TransformStamped transform_stamped; tf2::transformTF2ToMsg(tf2::Transform(q, tf2::Vector3(tvect[0], tvect[1], tvect[2])), transform_stamped, node_->now(), msg->header.frame_id, "calib_board"); broadcaster_.sendTransform(transform_stamped); } // Project the axis points cv::Mat axis = cv::Mat::zeros(3, 3, CV_64F); axis.at(0, 0) = 3 * circle_grid_seperation_; axis.at(1, 1) = 3 * circle_grid_seperation_; axis.at(2, 2) = -3 * circle_grid_seperation_; cv::Mat imageAxisPoints; cv::projectPoints(axis, rvect, tvect, camera_matrix_, dist_coeffs_, imageAxisPoints); // Draw axis to image cv_bridge::CvImage cv_image_color(msg->header, sensor_msgs::image_encodings::RGB8); cv::cvtColor(cv_ptr->image, cv_image_color.image, cv::COLOR_GRAY2RGB); cv::drawChessboardCorners(cv_image_color.image, patternsize, cv::Mat(pointBuf), found); draw(cv_image_color.image, pointBuf, imageAxisPoints); // Output stream image_pub_.publish(cv_image_color.toImageMsg()); } else // Output stream image_pub_.publish(cv_ptr->toImageMsg()); } catch (cv_bridge::Exception& e) { RCLCPP_ERROR(node_->get_logger(), "cv_bridge exeption: %s", e.what()); } } } void PoseEstimator::imageCB_ARUCO(const sensor_msgs::msg::Image::ConstSharedPtr& msg) { if (run_) { if (!msg) { RCLCPP_INFO(node_->get_logger(), "The pointer to image message is NULL."); return; } try { cv_bridge::CvImagePtr cv_ptr = cv_bridge::toCvCopy(msg, sensor_msgs::image_encodings::MONO8); // Detect aruco board cv::Ptr dictionary = cv::aruco::getPredefinedDictionary(dictionary_); cv::Ptr board = cv::aruco::GridBoard::create(width_, height_, aruco_board_marker_size_, aruco_board_marker_seperation_, dictionary); std::vector ids; std::vector> corners; cv::aruco::detectMarkers(cv_ptr->image, dictionary, corners, ids); if (ids.size() > 0) { cv::Mat 
imageColor; cv::cvtColor(cv_ptr->image, imageColor, cv::COLOR_GRAY2RGB); std::vector> rejectedCorners; cv::aruco::refineDetectedMarkers(cv_ptr->image, board, corners, ids, rejectedCorners, camera_matrix_, dist_coeffs_); cv::aruco::drawDetectedMarkers(imageColor, corners, ids); // Estimate the pose of aruco board cv::Vec3d rvect, tvect; int valid = cv::aruco::estimatePoseBoard(corners, ids, board, camera_matrix_, dist_coeffs_, rvect, tvect); // If at least one board marker detected if (valid > 0) { cv::aruco::drawAxis(imageColor, camera_matrix_, dist_coeffs_, rvect, tvect, 0.1); tf2::Quaternion q; rotationVectorToTF2Quaternion(q, rvect); geometry_msgs::msg::TransformStamped transform_stamped; tf2::transformTF2ToMsg(tf2::Transform(q, tf2::Vector3(tvect[0], tvect[1], tvect[2])), transform_stamped, node_->now(), msg->header.frame_id, "calib_board"); broadcaster_.sendTransform(transform_stamped); } // Output stream cv_bridge::CvImage cv_image_color(msg->header, sensor_msgs::image_encodings::RGB8, imageColor); image_pub_.publish(cv_image_color.toImageMsg()); } else // Output stream image_pub_.publish(cv_ptr->toImageMsg()); } catch (cv_bridge::Exception& e) { RCLCPP_ERROR(node_->get_logger(), "cv_bridge exeption: %s", e.what()); } } } void PoseEstimator::imageCB_CHARUCO(const sensor_msgs::msg::Image::ConstSharedPtr& msg) { if (run_) { if (!msg) { RCLCPP_ERROR(node_->get_logger(), "The pointer to image message is NULL."); return; } try { cv_bridge::CvImagePtr cv_ptr = cv_bridge::toCvCopy(msg, sensor_msgs::image_encodings::MONO8); // Detect ChArUco cv::Ptr dictionary = cv::aruco::getPredefinedDictionary(dictionary_); cv::Ptr board = cv::aruco::CharucoBoard::create( width_, height_, charuco_board_square_size_, charuco_board_marker_size_, dictionary); cv::Ptr params_ptr(new cv::aruco::DetectorParameters()); #if CV_MINOR_VERSION == 2 params_ptr->doCornerRefinement = true; #else params_ptr->cornerRefinementMethod = cv::aruco::CORNER_REFINE_SUBPIX; #endif std::vector ids; 
std::vector> corners; cv::aruco::detectMarkers(cv_ptr->image, dictionary, corners, ids, params_ptr); if (ids.size() > 0) { cv::Mat imageColor; cv::cvtColor(cv_ptr->image, imageColor, cv::COLOR_GRAY2RGB); std::vector charucoCorners; std::vector charucoIds; cv::aruco::interpolateCornersCharuco(corners, ids, cv_ptr->image, board, charucoCorners, charucoIds); if (charucoIds.size() > 0) { cv::aruco::drawDetectedCornersCharuco(imageColor, charucoCorners, charucoIds, cv::Scalar(255, 0, 0)); // Estimate charuco pose cv::Vec3d rvect, tvect; bool valid = cv::aruco::estimatePoseCharucoBoard(charucoCorners, charucoIds, board, camera_matrix_, dist_coeffs_, rvect, tvect); if (valid) { cv::aruco::drawAxis(imageColor, camera_matrix_, dist_coeffs_, rvect, tvect, 0.1); tf2::Quaternion q; rotationVectorToTF2Quaternion(q, rvect); geometry_msgs::msg::TransformStamped transform_stamped; tf2::transformTF2ToMsg(tf2::Transform(q, tf2::Vector3(tvect[0], tvect[1], tvect[2])), transform_stamped, node_->now(), msg->header.frame_id, "calib_board"); broadcaster_.sendTransform(transform_stamped); } } // Output stream cv_bridge::CvImage cv_image_color(msg->header, sensor_msgs::image_encodings::RGB8, imageColor); image_pub_.publish(cv_image_color.toImageMsg()); } else // Output stream image_pub_.publish(cv_ptr->toImageMsg()); } catch (cv_bridge::Exception& e) { RCLCPP_ERROR(node_->get_logger(), "cv_bridge exeption: %s", e.what()); } } } void PoseEstimator::caminfoCB(const sensor_msgs::msg::CameraInfo::SharedPtr msg) { if (!run_) { if (msg->k.size() == 9 && msg->d.size() == 5) { // Store camera matrix info for (size_t i = 0; i < 3; i++) for (size_t j = 0; j < 3; j++) camera_matrix_.at(i, j) = msg->k[i * 3 + j]; // Store camera distortion info for (size_t i = 0; i < 5; i++) dist_coeffs_.at(i, 0) = msg->d[i]; // Set the flag to start processing the image run_ = true; } else { RCLCPP_ERROR(node_->get_logger(), "Got invalid camera info."); run_ = false; } } } void PoseEstimator::draw(cv::Mat img, 
std::vector corners, cv::Mat imgpts) { cv::Point corner(corners[0]); cv::Point axis_point_x(imgpts.ptr(0)[0], imgpts.ptr(0)[1]); cv::Point axis_point_y(imgpts.ptr(1)[0], imgpts.ptr(1)[1]); cv::Point axis_point_z(imgpts.ptr(2)[0], imgpts.ptr(2)[1]); cv::line(img, corner, axis_point_x, cv::Scalar(255, 0, 0), 6); cv::line(img, corner, axis_point_y, cv::Scalar(0, 255, 0), 6); cv::line(img, corner, axis_point_z, cv::Scalar(0, 0, 255), 6); } void PoseEstimator::rotationVectorToTF2Quaternion(tf2::Quaternion& q, cv::Vec3d& rvect) { q.setRPY(0.0, 0.0, 0.0); cv::Mat rm; cv::Rodrigues(rvect, rm); tf2::Matrix3x3 m(rm.ptr(0)[0], rm.ptr(0)[1], rm.ptr(0)[2], rm.ptr(1)[0], rm.ptr(1)[1], rm.ptr(1)[2], rm.ptr(2)[0], rm.ptr(2)[1], rm.ptr(2)[2]); m.getRotation(q); } ================================================ FILE: grasp_utils/handeye_tf_service/CMakeLists.txt ================================================ cmake_minimum_required(VERSION 3.5) project(handeye_tf_service) # Default to C++14 if(NOT CMAKE_CXX_STANDARD) set(CMAKE_CXX_STANDARD 14) endif() if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang") add_compile_options(-Wall -Wextra -Wpedantic) endif() find_package(rclcpp REQUIRED) find_package(tf2_ros REQUIRED) find_package(ament_cmake REQUIRED) find_package(geometry_msgs REQUIRED) find_package(builtin_interfaces REQUIRED) find_package(std_msgs REQUIRED) find_package(rosidl_default_generators REQUIRED) set(srv_files "srv/HandeyeTF.srv") rosidl_generate_interfaces(${PROJECT_NAME} ${srv_files} DEPENDENCIES geometry_msgs builtin_interfaces std_msgs ) ament_export_dependencies(rosidl_default_runtime) add_executable(handeye_tf_server src/handeye_tf_server.cpp ) ament_target_dependencies(handeye_tf_server rclcpp tf2_ros ) get_default_rmw_implementation(rmw_implementation) find_package("${rmw_implementation}" REQUIRED) get_rmw_typesupport(typesupport_impls "${rmw_implementation}" LANGUAGE "cpp") foreach(typesupport_impl ${typesupport_impls}) 
rosidl_target_interfaces(handeye_tf_server ${PROJECT_NAME} ${typesupport_impl} ) endforeach() install(TARGETS handeye_tf_server DESTINATION lib/${PROJECT_NAME}) ament_package() ================================================ FILE: grasp_utils/handeye_tf_service/README.md ================================================ # handeye_tf_service ================================================ FILE: grasp_utils/handeye_tf_service/package.xml ================================================ handeye_tf_service 0.1.0 Provide TF get function for handeye. Yu Yan Apache License 2.0 ament_cmake rosidl_default_generators rclcpp geometry_msgs std_msgs tf2_ros builtin_interfaces rosidl_default_runtime rosidl_interface_packages ament_cmake ================================================ FILE: grasp_utils/handeye_tf_service/src/handeye_tf_server.cpp ================================================ /** Copyright (c) 2019 Intel Corporation. All Rights Reserved * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. 
*/ #include #include #include #include #include #include #include using HandeyeTF = handeye_tf_service::srv::HandeyeTF; using namespace std::chrono_literals; class ServerNode : public rclcpp::Node { public: explicit ServerNode(const rclcpp::NodeOptions & options) : Node("handeye_tf_server", options), broadcaster_(this) { // Init tf message tf_msg_.header.frame_id = "base"; // Used to void TF_NO_FRAME_ID error, updated by user later tf_msg_.child_frame_id = "camera_link"; // Initialize rotation to avoid TF_DENORMALIZED_QUATERNION error tf_msg_.transform.rotation.x = 0.0; tf_msg_.transform.rotation.y = 0.0; tf_msg_.transform.rotation.z = 0.0; tf_msg_.transform.rotation.w = 1.0; // Init timer timer_ = this->create_wall_timer( 100ms, std::bind(&ServerNode::timer_callback, this)); // Init tf listener clock_ = this->get_clock(); rclcpp::Clock::SharedPtr clock = std::make_shared(RCL_SYSTEM_TIME); tf_buffer_ = std::make_shared(clock_); tf_listener_ = std::make_shared(*tf_buffer_); // Service handler auto handle_service = [this](const std::shared_ptr request_header, const std::shared_ptr request, std::shared_ptr response) -> void { if (request->publish.data) // Publish the camera-robot transform { RCLCPP_INFO(this->get_logger(), "Incoming publish request\nframe_id: %s child_frame_id: %s", request->transform.header.frame_id.data(), request->transform.child_frame_id.data()); tf_msg_ = request->transform; } else // Lookup the requested transform { (void)request_header; RCLCPP_INFO(this->get_logger(), "Incoming lookup request\nframe_id: %s child_frame_id: %s", request->transform.header.frame_id.data(), request->transform.child_frame_id.data()); try { response->tf_lookup_result = tf_buffer_->lookupTransform(request->transform.header.frame_id, request->transform.child_frame_id, tf2::TimePoint()); } catch (tf2::TransformException &ex) { std::string temp = ex.what(); RCLCPP_WARN(this->get_logger(), "%s", temp.c_str()); } } }; // Create a service that will use the callback function 
to handle requests. srv_ = create_service("handeye_tf_service", handle_service); RCLCPP_INFO(this->get_logger(), "Handeye TF service created."); } private: void timer_callback() { broadcaster_.sendTransform(tf_msg_); } // Handeye service rclcpp::Service::SharedPtr srv_; // Variables used for looking up tf transforms std::shared_ptr tf_buffer_; std::shared_ptr tf_listener_; // Timer used for static transform publish rclcpp::TimerBase::SharedPtr timer_; // TF message for camera w.r.t robot transform geometry_msgs::msg::TransformStamped tf_msg_; // TF broadcaster tf2_ros::StaticTransformBroadcaster broadcaster_; rclcpp::Clock::SharedPtr clock_; }; int main(int argc, char ** argv) { rclcpp::init(argc, argv); auto node = std::make_shared(rclcpp::NodeOptions()); rclcpp::spin(node); rclcpp::shutdown(); return 0; } ================================================ FILE: grasp_utils/handeye_tf_service/srv/HandeyeTF.srv ================================================ geometry_msgs/TransformStamped transform std_msgs/Bool publish --- geometry_msgs/TransformStamped tf_lookup_result ================================================ FILE: grasp_utils/robot_interface/CMakeLists.txt ================================================ # Copyright (c) 2018 Intel Corporation # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
cmake_minimum_required(VERSION 3.5) project(robot_interface) # Default to C99 if(NOT CMAKE_C_STANDARD) set(CMAKE_C_STANDARD 99) endif() # Default to C++14 if(NOT CMAKE_CXX_STANDARD) set(CMAKE_CXX_STANDARD 14) endif() if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang") add_compile_options(-Wall -Wextra -Wpedantic -Wno-unused-parameter) endif() # find dependencies find_package(ament_cmake REQUIRED) find_package(tf2 REQUIRED) find_package(tf2_eigen REQUIRED) find_package(tf2_ros REQUIRED) find_package(eigen3_cmake_module REQUIRED) find_package(Eigen3 REQUIRED) find_package(rclcpp REQUIRED) find_package(sensor_msgs REQUIRED) find_package(geometry_msgs REQUIRED) find_library( ur_modern_driver_LIBRARIES NAMES ur_driver_lib HINTS /usr/local/lib) find_path(ur_modern_driver_INCLUDE_DIRS ur_modern_driver/tcp_socket.h) # Eigen 3.2 (Wily) only provides EIGEN3_INCLUDE_DIR, not EIGEN3_INCLUDE_DIRS if(NOT EIGEN3_INCLUDE_DIRS) set(EIGEN3_INCLUDE_DIRS ${EIGEN3_INCLUDE_DIR}) endif() # Set include directory path include_directories( include ${rclcpp_INCLUDE_DIRS} ${tf2_INCLUDE_DIRS} ${tf2_eigen_INCLUDE_DIRS} ${tf2_ros_INCLUDE_DIRS} ${Eigen3_INCLUDE_DIRS} ${sensor_msgs_INCLUDE_DIRS} ${geometry_msgs_INCLUDE_DIRS} ${ur_modern_driver_INCLUDE_DIRS}) include_directories(SYSTEM ${EIGEN3_INCLUDE_DIRS}) # Add robot interface library set(${PROJECT_NAME}_SOURCES src/control_base.cpp src/control_ur.cpp ) add_library(${PROJECT_NAME} ${${PROJECT_NAME}_SOURCES}) ament_target_dependencies(${PROJECT_NAME} rclcpp sensor_msgs geometry_msgs tf2_ros) target_link_libraries(${PROJECT_NAME} ${ur_modern_driver_LIBRARIES}) # Add test of UR robot interface library set(TEST_SOURCE test/ur_test.cpp) add_executable(ur_test_move_command test/ur_test_move_command.cpp) ament_target_dependencies(ur_test_move_command rclcpp sensor_msgs geometry_msgs) target_link_libraries(ur_test_move_command ${PROJECT_NAME} ${ur_modern_driver_LIBRARIES}) add_executable(ur_test_state_publish 
test/ur_test_state_publish.cpp) ament_target_dependencies(ur_test_state_publish rclcpp sensor_msgs geometry_msgs) target_link_libraries(ur_test_state_publish ${PROJECT_NAME} ${ur_modern_driver_LIBRARIES}) ament_export_include_directories(include ${Eigen3_INCLUDE_DIRS}) ament_export_interfaces(${PROJECT_NAME} HAS_LIBRARY_TARGET) ament_export_libraries(${PROJECT_NAME} ${ur_modern_driver_LIBRARIES}) ament_export_dependencies(rclcpp) ament_export_dependencies(sensor_msgs) ament_export_dependencies(geometry_msgs) ament_export_dependencies(tf2_ros) ament_export_dependencies(eigen3_cmake_module) ament_export_dependencies(Eigen3) # Install library install( TARGETS ${PROJECT_NAME} EXPORT ${PROJECT_NAME} ARCHIVE DESTINATION lib LIBRARY DESTINATION lib RUNTIME DESTINATION bin INCLUDES DESTINATION include ) # Install executables install(TARGETS ur_test_move_command ur_test_state_publish DESTINATION lib/${PROJECT_NAME}) # Install header files install( DIRECTORY include/ DESTINATION include ) # Install launch files. install(DIRECTORY launch DESTINATION share/${PROJECT_NAME}/ ) if(BUILD_TESTING) find_package(ament_lint_auto REQUIRED) # the following line skips the linter which checks for copyrights # uncomment the line when a copyright and license is not present in all source files #set(ament_cmake_copyright_FOUND TRUE) # the following line skips cpplint (only works in a git repo) # uncomment the line when this package is not in a git repo #set(ament_cmake_cpplint_FOUND TRUE) ament_lint_auto_find_test_dependencies() endif() ament_package() ================================================ FILE: grasp_utils/robot_interface/Doxyfile ================================================ # Doxyfile 1.8.13 # This file describes the settings to be used by the documentation system # doxygen (www.doxygen.org) for a project. # # All text after a double hash (##) is considered a comment and is placed in # front of the TAG it is preceding. 
#
# All text after a single hash (#) is considered a comment and will be ignored.
# The format is:
# TAG = value [value, ...]
# For lists, items can also be appended using:
# TAG += value [value, ...]
# Values that contain spaces should be placed between quotes (\" \").

#---------------------------------------------------------------------------
# Project related configuration options
#---------------------------------------------------------------------------

# This tag specifies the encoding used for all characters in the config file
# that follow. The default is UTF-8 which is also the encoding used for all text
# before the first occurrence of this tag. Doxygen uses libiconv (or the iconv
# built into libc) for the transcoding. See http://www.gnu.org/software/libiconv
# for the list of possible encodings.
# The default value is: UTF-8.

DOXYFILE_ENCODING      = UTF-8

# The PROJECT_NAME tag is a single word (or a sequence of words surrounded by
# double-quotes, unless you are using Doxywizard) that should identify the
# project for which the documentation is generated. This name is used in the
# title of most generated pages and in a few other places.
# The default value is: My Project.

PROJECT_NAME           = "robot_interface"

# The PROJECT_NUMBER tag can be used to enter a project or revision number. This
# could be handy for archiving the generated documentation or if some version
# control system is used.

PROJECT_NUMBER         =

# Using the PROJECT_BRIEF tag one can provide an optional one line description
# for a project that appears at the top of each page and should give viewer a
# quick idea about the purpose of the project. Keep the description short.

PROJECT_BRIEF          = "Native robot interface for visual manipulation"

# With the PROJECT_LOGO tag one can specify a logo or an icon that is included
# in the documentation. The maximum height of the logo should not exceed 55
# pixels and the maximum width should not exceed 200 pixels.
# Doxygen will copy the logo to the output directory.

PROJECT_LOGO           =

# The OUTPUT_DIRECTORY tag is used to specify the (relative or absolute) path
# into which the generated documentation will be written. If a relative path is
# entered, it will be relative to the location where doxygen was started. If
# left blank the current directory will be used.

OUTPUT_DIRECTORY       = ./build

# If the CREATE_SUBDIRS tag is set to YES then doxygen will create 4096 sub-
# directories (in 2 levels) under the output directory of each output format and
# will distribute the generated files over these directories. Enabling this
# option can be useful when feeding doxygen a huge amount of source files, where
# putting all generated files in the same directory would otherwise causes
# performance problems for the file system.
# The default value is: NO.

CREATE_SUBDIRS         = NO

# If the ALLOW_UNICODE_NAMES tag is set to YES, doxygen will allow non-ASCII
# characters to appear in the names of generated files. If set to NO, non-ASCII
# characters will be escaped, for example _xE3_x81_x84 will be used for Unicode
# U+3044.
# The default value is: NO.

ALLOW_UNICODE_NAMES    = NO

# The OUTPUT_LANGUAGE tag is used to specify the language in which all
# documentation generated by doxygen is written. Doxygen will use this
# information to generate all constant output in the proper language.
# Possible values are: Afrikaans, Arabic, Armenian, Brazilian, Catalan, Chinese,
# Chinese-Traditional, Croatian, Czech, Danish, Dutch, English (United States),
# Esperanto, Farsi (Persian), Finnish, French, German, Greek, Hungarian,
# Indonesian, Italian, Japanese, Japanese-en (Japanese with English messages),
# Korean, Korean-en (Korean with English messages), Latvian, Lithuanian,
# Macedonian, Norwegian, Persian (Farsi), Polish, Portuguese, Romanian, Russian,
# Serbian, Serbian-Cyrillic, Slovak, Slovene, Spanish, Swedish, Turkish,
# Ukrainian and Vietnamese.
# The default value is: English.
OUTPUT_LANGUAGE        = English

# If the BRIEF_MEMBER_DESC tag is set to YES, doxygen will include brief member
# descriptions after the members that are listed in the file and class
# documentation (similar to Javadoc). Set to NO to disable this.
# The default value is: YES.

BRIEF_MEMBER_DESC      = YES

# If the REPEAT_BRIEF tag is set to YES, doxygen will prepend the brief
# description of a member or function before the detailed description
#
# Note: If both HIDE_UNDOC_MEMBERS and BRIEF_MEMBER_DESC are set to NO, the
# brief descriptions will be completely suppressed.
# The default value is: YES.

REPEAT_BRIEF           = YES

# This tag implements a quasi-intelligent brief description abbreviator that is
# used to form the text in various listings. Each string in this list, if found
# as the leading text of the brief description, will be stripped from the text
# and the result, after processing the whole list, is used as the annotated
# text. Otherwise, the brief description is used as-is. If left blank, the
# following values are used ($name is automatically replaced with the name of
# the entity):The $name class, The $name widget, The $name file, is, provides,
# specifies, contains, represents, a, an and the.

ABBREVIATE_BRIEF       = "The $name class" \
                         "The $name widget" \
                         "The $name file" \
                         is \
                         provides \
                         specifies \
                         contains \
                         represents \
                         a \
                         an \
                         the

# If the ALWAYS_DETAILED_SEC and REPEAT_BRIEF tags are both set to YES then
# doxygen will generate a detailed section even if there is only a brief
# description.
# The default value is: NO.

ALWAYS_DETAILED_SEC    = NO

# If the INLINE_INHERITED_MEMB tag is set to YES, doxygen will show all
# inherited members of a class in the documentation of that class as if those
# members were ordinary class members. Constructors, destructors and assignment
# operators of the base classes will not be shown.
# The default value is: NO.
INLINE_INHERITED_MEMB  = NO

# If the FULL_PATH_NAMES tag is set to YES, doxygen will prepend the full path
# before files name in the file list and in the header files. If set to NO the
# shortest path that makes the file name unique will be used
# The default value is: YES.

FULL_PATH_NAMES        = YES

# The STRIP_FROM_PATH tag can be used to strip a user-defined part of the path.
# Stripping is only done if one of the specified strings matches the left-hand
# part of the path. The tag can be used to show relative paths in the file list.
# If left blank the directory from which doxygen is run is used as the path to
# strip.
#
# Note that you can specify absolute paths here, but also relative paths, which
# will be relative from the directory where doxygen is started.
# This tag requires that the tag FULL_PATH_NAMES is set to YES.

STRIP_FROM_PATH        =

# The STRIP_FROM_INC_PATH tag can be used to strip a user-defined part of the
# path mentioned in the documentation of a class, which tells the reader which
# header file to include in order to use a class. If left blank only the name of
# the header file containing the class definition is used. Otherwise one should
# specify the list of include paths that are normally passed to the compiler
# using the -I flag.

STRIP_FROM_INC_PATH    =

# If the SHORT_NAMES tag is set to YES, doxygen will generate much shorter (but
# less readable) file names. This can be useful is your file systems doesn't
# support long names like on DOS, Mac, or CD-ROM.
# The default value is: NO.

SHORT_NAMES            = NO

# If the JAVADOC_AUTOBRIEF tag is set to YES then doxygen will interpret the
# first line (until the first dot) of a Javadoc-style comment as the brief
# description. If set to NO, the Javadoc-style will behave just like regular Qt-
# style comments (thus requiring an explicit @brief command for a brief
# description.)
# The default value is: NO.
JAVADOC_AUTOBRIEF      = NO

# If the QT_AUTOBRIEF tag is set to YES then doxygen will interpret the first
# line (until the first dot) of a Qt-style comment as the brief description. If
# set to NO, the Qt-style will behave just like regular Qt-style comments (thus
# requiring an explicit \brief command for a brief description.)
# The default value is: NO.

QT_AUTOBRIEF           = NO

# The MULTILINE_CPP_IS_BRIEF tag can be set to YES to make doxygen treat a
# multi-line C++ special comment block (i.e. a block of //! or /// comments) as
# a brief description. This used to be the default behavior. The new default is
# to treat a multi-line C++ comment block as a detailed description. Set this
# tag to YES if you prefer the old behavior instead.
#
# Note that setting this tag to YES also means that rational rose comments are
# not recognized any more.
# The default value is: NO.

MULTILINE_CPP_IS_BRIEF = NO

# If the INHERIT_DOCS tag is set to YES then an undocumented member inherits the
# documentation from any documented member that it re-implements.
# The default value is: YES.

INHERIT_DOCS           = YES

# If the SEPARATE_MEMBER_PAGES tag is set to YES then doxygen will produce a new
# page for each member. If set to NO, the documentation of a member will be part
# of the file/class/namespace that contains it.
# The default value is: NO.

SEPARATE_MEMBER_PAGES  = NO

# The TAB_SIZE tag can be used to set the number of spaces in a tab. Doxygen
# uses this value to replace tabs by spaces in code fragments.
# Minimum value: 1, maximum value: 16, default value: 4.

TAB_SIZE               = 4

# This tag can be used to specify a number of aliases that act as commands in
# the documentation. An alias has the form:
# name=value
# For example adding
# "sideeffect=@par Side Effects:\n"
# will allow you to put the command \sideeffect (or @sideeffect) in the
# documentation, which will result in a user-defined paragraph with heading
# "Side Effects:". You can put \n's in the value part of an alias to insert
# newlines.
ALIASES                =

# This tag can be used to specify a number of word-keyword mappings (TCL only).
# A mapping has the form "name=value". For example adding "class=itcl::class"
# will allow you to use the command class in the itcl::class meaning.

TCL_SUBST              =

# Set the OPTIMIZE_OUTPUT_FOR_C tag to YES if your project consists of C sources
# only. Doxygen will then generate output that is more tailored for C. For
# instance, some of the names that are used will be different. The list of all
# members will be omitted, etc.
# The default value is: NO.

OPTIMIZE_OUTPUT_FOR_C  = NO

# Set the OPTIMIZE_OUTPUT_JAVA tag to YES if your project consists of Java or
# Python sources only. Doxygen will then generate output that is more tailored
# for that language. For instance, namespaces will be presented as packages,
# qualified scopes will look different, etc.
# The default value is: NO.

OPTIMIZE_OUTPUT_JAVA   = NO

# Set the OPTIMIZE_FOR_FORTRAN tag to YES if your project consists of Fortran
# sources. Doxygen will then generate output that is tailored for Fortran.
# The default value is: NO.

OPTIMIZE_FOR_FORTRAN   = NO

# Set the OPTIMIZE_OUTPUT_VHDL tag to YES if your project consists of VHDL
# sources. Doxygen will then generate output that is tailored for VHDL.
# The default value is: NO.

OPTIMIZE_OUTPUT_VHDL   = NO

# Doxygen selects the parser to use depending on the extension of the files it
# parses. With this tag you can assign which parser to use for a given
# extension. Doxygen has a built-in mapping, but you can override or extend it
# using this tag. The format is ext=language, where ext is a file extension, and
# language is one of the parsers supported by doxygen: IDL, Java, Javascript,
# C#, C, C++, D, PHP, Objective-C, Python, Fortran (fixed format Fortran:
# FortranFixed, free formatted Fortran: FortranFree, unknown formatted Fortran:
# Fortran.
# In the later case the parser tries to guess whether the code is fixed or free
# formatted code, this is the default for Fortran type files), VHDL. For
# instance to make doxygen treat .inc files as Fortran files (default is PHP),
# and .f files as C (default is Fortran), use: inc=Fortran f=C.
#
# Note: For files without extension you can use no_extension as a placeholder.
#
# Note that for custom extensions you also need to set FILE_PATTERNS otherwise
# the files are not read by doxygen.

EXTENSION_MAPPING      =

# If the MARKDOWN_SUPPORT tag is enabled then doxygen pre-processes all comments
# according to the Markdown format, which allows for more readable
# documentation. See http://daringfireball.net/projects/markdown/ for details.
# The output of markdown processing is further processed by doxygen, so you can
# mix doxygen, HTML, and XML commands with Markdown formatting. Disable only in
# case of backward compatibilities issues.
# The default value is: YES.

MARKDOWN_SUPPORT       = YES

# When the TOC_INCLUDE_HEADINGS tag is set to a non-zero value, all headings up
# to that level are automatically included in the table of contents, even if
# they do not have an id attribute.
# Note: This feature currently applies only to Markdown headings.
# Minimum value: 0, maximum value: 99, default value: 0.
# This tag requires that the tag MARKDOWN_SUPPORT is set to YES.

TOC_INCLUDE_HEADINGS   = 0

# When enabled doxygen tries to link words that correspond to documented
# classes, or namespaces to their corresponding documentation. Such a link can
# be prevented in individual cases by putting a % sign in front of the word or
# globally by setting AUTOLINK_SUPPORT to NO.
# The default value is: YES.

AUTOLINK_SUPPORT       = YES

# If you use STL classes (i.e. std::string, std::vector, etc.)
# but do not want to include (a tag file for) the STL sources as input, then
# you should set this tag to YES in order to let doxygen match functions
# declarations and definitions whose arguments contain STL classes (e.g.
# func(std::string); versus func(std::string) {}). This also make the
# inheritance and collaboration diagrams that involve STL classes more complete
# and accurate.
# The default value is: NO.

BUILTIN_STL_SUPPORT    = NO

# If you use Microsoft's C++/CLI language, you should set this option to YES to
# enable parsing support.
# The default value is: NO.

CPP_CLI_SUPPORT        = NO

# Set the SIP_SUPPORT tag to YES if your project consists of sip (see:
# http://www.riverbankcomputing.co.uk/software/sip/intro) sources only. Doxygen
# will parse them like normal C++ but will assume all classes use public instead
# of private inheritance when no explicit protection keyword is present.
# The default value is: NO.

SIP_SUPPORT            = NO

# For Microsoft's IDL there are propget and propput attributes to indicate
# getter and setter methods for a property. Setting this option to YES will make
# doxygen to replace the get and set methods by a property in the documentation.
# This will only work if the methods are indeed getting or setting a simple
# type. If this is not the case, or you want to show the methods anyway, you
# should set this option to NO.
# The default value is: YES.

IDL_PROPERTY_SUPPORT   = YES

# If member grouping is used in the documentation and the DISTRIBUTE_GROUP_DOC
# tag is set to YES then doxygen will reuse the documentation of the first
# member in the group (if any) for the other members of the group. By default
# all members of a group must be documented explicitly.
# The default value is: NO.

DISTRIBUTE_GROUP_DOC   = NO

# If one adds a struct or class to a group and this option is enabled, then also
# any nested class or struct is added to the same group.
# By default this option is disabled and one has to add nested compounds
# explicitly via \ingroup.
# The default value is: NO.

GROUP_NESTED_COMPOUNDS = NO

# Set the SUBGROUPING tag to YES to allow class member groups of the same type
# (for instance a group of public functions) to be put as a subgroup of that
# type (e.g. under the Public Functions section). Set it to NO to prevent
# subgrouping. Alternatively, this can be done per class using the
# \nosubgrouping command.
# The default value is: YES.

SUBGROUPING            = YES

# When the INLINE_GROUPED_CLASSES tag is set to YES, classes, structs and unions
# are shown inside the group in which they are included (e.g. using \ingroup)
# instead of on a separate page (for HTML and Man pages) or section (for LaTeX
# and RTF).
#
# Note that this feature does not work in combination with
# SEPARATE_MEMBER_PAGES.
# The default value is: NO.

INLINE_GROUPED_CLASSES = NO

# When the INLINE_SIMPLE_STRUCTS tag is set to YES, structs, classes, and unions
# with only public data fields or simple typedef fields will be shown inline in
# the documentation of the scope in which they are defined (i.e. file,
# namespace, or group documentation), provided this scope is documented. If set
# to NO, structs, classes, and unions are shown on a separate page (for HTML and
# Man pages) or section (for LaTeX and RTF).
# The default value is: NO.

INLINE_SIMPLE_STRUCTS  = NO

# When TYPEDEF_HIDES_STRUCT tag is enabled, a typedef of a struct, union, or
# enum is documented as struct, union, or enum with the name of the typedef. So
# typedef struct TypeS {} TypeT, will appear in the documentation as a struct
# with name TypeT. When disabled the typedef will appear as a member of a file,
# namespace, or class. And the struct will be named TypeS. This can typically be
# useful for C code in case the coding convention dictates that all compound
# types are typedef'ed and only the typedef is referenced, never the tag name.
# The default value is: NO.
TYPEDEF_HIDES_STRUCT   = NO

# The size of the symbol lookup cache can be set using LOOKUP_CACHE_SIZE. This
# cache is used to resolve symbols given their name and scope. Since this can be
# an expensive process and often the same symbol appears multiple times in the
# code, doxygen keeps a cache of pre-resolved symbols. If the cache is too small
# doxygen will become slower. If the cache is too large, memory is wasted. The
# cache size is given by this formula: 2^(16+LOOKUP_CACHE_SIZE). The valid range
# is 0..9, the default is 0, corresponding to a cache size of 2^16=65536
# symbols. At the end of a run doxygen will report the cache usage and suggest
# the optimal cache size from a speed point of view.
# Minimum value: 0, maximum value: 9, default value: 0.

LOOKUP_CACHE_SIZE      = 0

#---------------------------------------------------------------------------
# Build related configuration options
#---------------------------------------------------------------------------

# If the EXTRACT_ALL tag is set to YES, doxygen will assume all entities in
# documentation are documented, even if no documentation was available. Private
# class members and static file members will be hidden unless the
# EXTRACT_PRIVATE respectively EXTRACT_STATIC tags are set to YES.
# Note: This will also disable the warnings about undocumented members that are
# normally produced when WARNINGS is set to YES.
# The default value is: NO.

EXTRACT_ALL            = NO

# If the EXTRACT_PRIVATE tag is set to YES, all private members of a class will
# be included in the documentation.
# The default value is: NO.

EXTRACT_PRIVATE        = NO

# If the EXTRACT_PACKAGE tag is set to YES, all members with package or internal
# scope will be included in the documentation.
# The default value is: NO.

EXTRACT_PACKAGE        = NO

# If the EXTRACT_STATIC tag is set to YES, all static members of a file will be
# included in the documentation.
# The default value is: NO.
EXTRACT_STATIC         = NO

# If the EXTRACT_LOCAL_CLASSES tag is set to YES, classes (and structs) defined
# locally in source files will be included in the documentation. If set to NO,
# only classes defined in header files are included. Does not have any effect
# for Java sources.
# The default value is: YES.

EXTRACT_LOCAL_CLASSES  = YES

# This flag is only useful for Objective-C code. If set to YES, local methods,
# which are defined in the implementation section but not in the interface are
# included in the documentation. If set to NO, only methods in the interface are
# included.
# The default value is: NO.

EXTRACT_LOCAL_METHODS  = NO

# If this flag is set to YES, the members of anonymous namespaces will be
# extracted and appear in the documentation as a namespace called
# 'anonymous_namespace{file}', where file will be replaced with the base name of
# the file that contains the anonymous namespace. By default anonymous namespace
# are hidden.
# The default value is: NO.

EXTRACT_ANON_NSPACES   = NO

# If the HIDE_UNDOC_MEMBERS tag is set to YES, doxygen will hide all
# undocumented members inside documented classes or files. If set to NO these
# members will be included in the various overviews, but no documentation
# section is generated. This option has no effect if EXTRACT_ALL is enabled.
# The default value is: NO.

HIDE_UNDOC_MEMBERS     = NO

# If the HIDE_UNDOC_CLASSES tag is set to YES, doxygen will hide all
# undocumented classes that are normally visible in the class hierarchy. If set
# to NO, these classes will be included in the various overviews. This option
# has no effect if EXTRACT_ALL is enabled.
# The default value is: NO.

HIDE_UNDOC_CLASSES     = NO

# If the HIDE_FRIEND_COMPOUNDS tag is set to YES, doxygen will hide all friend
# (class|struct|union) declarations. If set to NO, these declarations will be
# included in the documentation.
# The default value is: NO.
HIDE_FRIEND_COMPOUNDS  = NO

# If the HIDE_IN_BODY_DOCS tag is set to YES, doxygen will hide any
# documentation blocks found inside the body of a function. If set to NO, these
# blocks will be appended to the function's detailed documentation block.
# The default value is: NO.

HIDE_IN_BODY_DOCS      = NO

# The INTERNAL_DOCS tag determines if documentation that is typed after a
# \internal command is included. If the tag is set to NO then the documentation
# will be excluded. Set it to YES to include the internal documentation.
# The default value is: NO.

INTERNAL_DOCS          = NO

# If the CASE_SENSE_NAMES tag is set to NO then doxygen will only generate file
# names in lower-case letters. If set to YES, upper-case letters are also
# allowed. This is useful if you have classes or files whose names only differ
# in case and if your file system supports case sensitive file names. Windows
# and Mac users are advised to set this option to NO.
# The default value is: system dependent.

CASE_SENSE_NAMES       = YES

# If the HIDE_SCOPE_NAMES tag is set to NO then doxygen will show members with
# their full class and namespace scopes in the documentation. If set to YES, the
# scope will be hidden.
# The default value is: NO.

HIDE_SCOPE_NAMES       = NO

# If the HIDE_COMPOUND_REFERENCE tag is set to NO (default) then doxygen will
# append additional text to a page's title, such as Class Reference. If set to
# YES the compound reference will be hidden.
# The default value is: NO.

HIDE_COMPOUND_REFERENCE= NO

# If the SHOW_INCLUDE_FILES tag is set to YES then doxygen will put a list of
# the files that are included by a file in the documentation of that file.
# The default value is: YES.

SHOW_INCLUDE_FILES     = YES

# If the SHOW_GROUPED_MEMB_INC tag is set to YES then Doxygen will add for each
# grouped member an include statement to the documentation, telling the reader
# which file to include in order to use the member.
# The default value is: NO.
SHOW_GROUPED_MEMB_INC  = NO

# If the FORCE_LOCAL_INCLUDES tag is set to YES then doxygen will list include
# files with double quotes in the documentation rather than with sharp brackets.
# The default value is: NO.

FORCE_LOCAL_INCLUDES   = NO

# If the INLINE_INFO tag is set to YES then a tag [inline] is inserted in the
# documentation for inline members.
# The default value is: YES.

INLINE_INFO            = YES

# If the SORT_MEMBER_DOCS tag is set to YES then doxygen will sort the
# (detailed) documentation of file and class members alphabetically by member
# name. If set to NO, the members will appear in declaration order.
# The default value is: YES.

SORT_MEMBER_DOCS       = YES

# If the SORT_BRIEF_DOCS tag is set to YES then doxygen will sort the brief
# descriptions of file, namespace and class members alphabetically by member
# name. If set to NO, the members will appear in declaration order. Note that
# this will also influence the order of the classes in the class list.
# The default value is: NO.

SORT_BRIEF_DOCS        = NO

# If the SORT_MEMBERS_CTORS_1ST tag is set to YES then doxygen will sort the
# (brief and detailed) documentation of class members so that constructors and
# destructors are listed first. If set to NO the constructors will appear in the
# respective orders defined by SORT_BRIEF_DOCS and SORT_MEMBER_DOCS.
# Note: If SORT_BRIEF_DOCS is set to NO this option is ignored for sorting brief
# member documentation.
# Note: If SORT_MEMBER_DOCS is set to NO this option is ignored for sorting
# detailed member documentation.
# The default value is: NO.

SORT_MEMBERS_CTORS_1ST = NO

# If the SORT_GROUP_NAMES tag is set to YES then doxygen will sort the hierarchy
# of group names into alphabetical order. If set to NO the group names will
# appear in their defined order.
# The default value is: NO.

SORT_GROUP_NAMES       = NO

# If the SORT_BY_SCOPE_NAME tag is set to YES, the class list will be sorted by
# fully-qualified names, including namespaces.
# If set to NO, the class list will be sorted only by class name, not including
# the namespace part.
# Note: This option is not very useful if HIDE_SCOPE_NAMES is set to YES.
# Note: This option applies only to the class list, not to the alphabetical
# list.
# The default value is: NO.

SORT_BY_SCOPE_NAME     = NO

# If the STRICT_PROTO_MATCHING option is enabled and doxygen fails to do proper
# type resolution of all parameters of a function it will reject a match between
# the prototype and the implementation of a member function even if there is
# only one candidate or it is obvious which candidate to choose by doing a
# simple string match. By disabling STRICT_PROTO_MATCHING doxygen will still
# accept a match between prototype and implementation in such cases.
# The default value is: NO.

STRICT_PROTO_MATCHING  = NO

# The GENERATE_TODOLIST tag can be used to enable (YES) or disable (NO) the todo
# list. This list is created by putting \todo commands in the documentation.
# The default value is: YES.

GENERATE_TODOLIST      = YES

# The GENERATE_TESTLIST tag can be used to enable (YES) or disable (NO) the test
# list. This list is created by putting \test commands in the documentation.
# The default value is: YES.

GENERATE_TESTLIST      = YES

# The GENERATE_BUGLIST tag can be used to enable (YES) or disable (NO) the bug
# list. This list is created by putting \bug commands in the documentation.
# The default value is: YES.

GENERATE_BUGLIST       = YES

# The GENERATE_DEPRECATEDLIST tag can be used to enable (YES) or disable (NO)
# the deprecated list. This list is created by putting \deprecated commands in
# the documentation.
# The default value is: YES.

GENERATE_DEPRECATEDLIST= YES

# The ENABLED_SECTIONS tag can be used to enable conditional documentation
# sections, marked by \if ... \endif and \cond
# ... \endcond blocks.
ENABLED_SECTIONS       =

# The MAX_INITIALIZER_LINES tag determines the maximum number of lines that the
# initial value of a variable or macro / define can have for it to appear in the
# documentation. If the initializer consists of more lines than specified here
# it will be hidden. Use a value of 0 to hide initializers completely. The
# appearance of the value of individual variables and macros / defines can be
# controlled using \showinitializer or \hideinitializer command in the
# documentation regardless of this setting.
# Minimum value: 0, maximum value: 10000, default value: 30.

MAX_INITIALIZER_LINES  = 30

# Set the SHOW_USED_FILES tag to NO to disable the list of files generated at
# the bottom of the documentation of classes and structs. If set to YES, the
# list will mention the files that were used to generate the documentation.
# The default value is: YES.

SHOW_USED_FILES        = YES

# Set the SHOW_FILES tag to NO to disable the generation of the Files page. This
# will remove the Files entry from the Quick Index and from the Folder Tree View
# (if specified).
# The default value is: YES.

SHOW_FILES             = YES

# Set the SHOW_NAMESPACES tag to NO to disable the generation of the Namespaces
# page. This will remove the Namespaces entry from the Quick Index and from the
# Folder Tree View (if specified).
# The default value is: YES.

SHOW_NAMESPACES        = YES

# The FILE_VERSION_FILTER tag can be used to specify a program or script that
# doxygen should invoke to get the current version for each file (typically from
# the version control system). Doxygen will invoke the program by executing (via
# popen()) the command command input-file, where command is the value of the
# FILE_VERSION_FILTER tag, and input-file is the name of an input file provided
# by doxygen. Whatever the program writes to standard output is used as the file
# version. For an example see the documentation.
FILE_VERSION_FILTER    =

# The LAYOUT_FILE tag can be used to specify a layout file which will be parsed
# by doxygen. The layout file controls the global structure of the generated
# output files in an output format independent way. To create the layout file
# that represents doxygen's defaults, run doxygen with the -l option. You can
# optionally specify a file name after the option, if omitted DoxygenLayout.xml
# will be used as the name of the layout file.
#
# Note that if you run doxygen from a directory containing a file called
# DoxygenLayout.xml, doxygen will parse it automatically even if the LAYOUT_FILE
# tag is left empty.

LAYOUT_FILE            =

# The CITE_BIB_FILES tag can be used to specify one or more bib files containing
# the reference definitions. This must be a list of .bib files. The .bib
# extension is automatically appended if omitted. This requires the bibtex tool
# to be installed. See also http://en.wikipedia.org/wiki/BibTeX for more info.
# For LaTeX the style of the bibliography can be controlled using
# LATEX_BIB_STYLE. To use this feature you need bibtex and perl available in the
# search path. See also \cite for info how to create references.

CITE_BIB_FILES         =

#---------------------------------------------------------------------------
# Configuration options related to warning and progress messages
#---------------------------------------------------------------------------

# The QUIET tag can be used to turn on/off the messages that are generated to
# standard output by doxygen. If QUIET is set to YES this implies that the
# messages are off.
# The default value is: NO.

QUIET                  = NO

# The WARNINGS tag can be used to turn on/off the warning messages that are
# generated to standard error (stderr) by doxygen. If WARNINGS is set to YES
# this implies that the warnings are on.
#
# Tip: Turn warnings on while writing the documentation.
# The default value is: YES.
WARNINGS               = YES

# If the WARN_IF_UNDOCUMENTED tag is set to YES then doxygen will generate
# warnings for undocumented members. If EXTRACT_ALL is set to YES then this flag
# will automatically be disabled.
# The default value is: YES.

WARN_IF_UNDOCUMENTED   = YES

# If the WARN_IF_DOC_ERROR tag is set to YES, doxygen will generate warnings for
# potential errors in the documentation, such as not documenting some parameters
# in a documented function, or documenting parameters that don't exist or using
# markup commands wrongly.
# The default value is: YES.

WARN_IF_DOC_ERROR      = YES

# This WARN_NO_PARAMDOC option can be enabled to get warnings for functions that
# are documented, but have no documentation for their parameters or return
# value. If set to NO, doxygen will only warn about wrong or incomplete
# parameter documentation, but not about the absence of documentation.
# The default value is: NO.

WARN_NO_PARAMDOC       = NO

# If the WARN_AS_ERROR tag is set to YES then doxygen will immediately stop when
# a warning is encountered.
# The default value is: NO.

WARN_AS_ERROR          = NO

# The WARN_FORMAT tag determines the format of the warning messages that doxygen
# can produce. The string should contain the $file, $line, and $text tags, which
# will be replaced by the file and line number from which the warning originated
# and the warning text. Optionally the format may contain $version, which will
# be replaced by the version of the file (if it could be obtained via
# FILE_VERSION_FILTER)
# The default value is: $file:$line: $text.

WARN_FORMAT            = "$file:$line: $text"

# The WARN_LOGFILE tag can be used to specify a file to which warning and error
# messages should be written. If left blank the output is written to standard
# error (stderr).
WARN_LOGFILE           =

#---------------------------------------------------------------------------
# Configuration options related to the input files
#---------------------------------------------------------------------------

# The INPUT tag is used to specify the files and/or directories that contain
# documented source files. You may enter file names like myfile.cpp or
# directories like /usr/src/myproject. Separate the files or directories with
# spaces. See also FILE_PATTERNS and EXTENSION_MAPPING
# Note: If this tag is empty the current directory is searched.

INPUT                  = ./README.md ./include/robot_interface/control_base.hpp ./src/control_base.cpp

# This tag can be used to specify the character encoding of the source files
# that doxygen parses. Internally doxygen uses the UTF-8 encoding. Doxygen uses
# libiconv (or the iconv built into libc) for the transcoding. See the libiconv
# documentation (see: http://www.gnu.org/software/libiconv) for the list of
# possible encodings.
# The default value is: UTF-8.

INPUT_ENCODING         = UTF-8

# If the value of the INPUT tag contains directories, you can use the
# FILE_PATTERNS tag to specify one or more wildcard patterns (like *.cpp and
# *.h) to filter out the source-files in the directories.
#
# Note that for custom extensions or not directly supported extensions you also
# need to set EXTENSION_MAPPING for the extension otherwise the files are not
# read by doxygen.
#
# If left blank the following patterns are tested:*.c, *.cc, *.cxx, *.cpp,
# *.c++, *.java, *.ii, *.ixx, *.ipp, *.i++, *.inl, *.idl, *.ddl, *.odl, *.h,
# *.hh, *.hxx, *.hpp, *.h++, *.cs, *.d, *.php, *.php4, *.php5, *.phtml, *.inc,
# *.m, *.markdown, *.md, *.mm, *.dox, *.py, *.pyw, *.f90, *.f95, *.f03, *.f08,
# *.f, *.for, *.tcl, *.vhd, *.vhdl, *.ucf and *.qsf.
FILE_PATTERNS          = *.c \
                         *.cc \
                         *.cxx \
                         *.cpp \
                         *.c++ \
                         *.java \
                         *.ii \
                         *.ixx \
                         *.ipp \
                         *.i++ \
                         *.inl \
                         *.idl \
                         *.ddl \
                         *.odl \
                         *.h \
                         *.hh \
                         *.hxx \
                         *.hpp \
                         *.h++ \
                         *.cs \
                         *.d \
                         *.php \
                         *.php4 \
                         *.php5 \
                         *.phtml \
                         *.inc \
                         *.m \
                         *.markdown \
                         *.md \
                         *.mm \
                         *.dox \
                         *.py \
                         *.pyw \
                         *.f90 \
                         *.f95 \
                         *.f03 \
                         *.f08 \
                         *.f \
                         *.for \
                         *.tcl \
                         *.vhd \
                         *.vhdl \
                         *.ucf \
                         *.qsf

# The RECURSIVE tag can be used to specify whether or not subdirectories should
# be searched for input files as well.
# The default value is: NO.

RECURSIVE              = NO

# The EXCLUDE tag can be used to specify files and/or directories that should be
# excluded from the INPUT source files. This way you can easily exclude a
# subdirectory from a directory tree whose root is specified with the INPUT tag.
#
# Note that relative paths are relative to the directory from which doxygen is
# run.

EXCLUDE                =

# The EXCLUDE_SYMLINKS tag can be used to select whether or not files or
# directories that are symbolic links (a Unix file system feature) are excluded
# from the input.
# The default value is: NO.

EXCLUDE_SYMLINKS       = NO

# If the value of the INPUT tag contains directories, you can use the
# EXCLUDE_PATTERNS tag to specify one or more wildcard patterns to exclude
# certain files from those directories.
#
# Note that the wildcards are matched against the file with absolute path, so to
# exclude all test directories for example use the pattern */test/*

EXCLUDE_PATTERNS       =

# The EXCLUDE_SYMBOLS tag can be used to specify one or more symbol names
# (namespaces, classes, functions, etc.) that should be excluded from the
# output. The symbol name can be a fully qualified name, a word, or if the
# wildcard * is used, a substring.
# Examples: ANamespace, AClass,
# AClass::ANamespace, ANamespace::*Test
#
# Note that the wildcards are matched against the file with absolute path, so to
# exclude all test directories use the pattern */test/*

EXCLUDE_SYMBOLS        =

# The EXAMPLE_PATH tag can be used to specify one or more files or directories
# that contain example code fragments that are included (see the \include
# command).

EXAMPLE_PATH           =

# If the value of the EXAMPLE_PATH tag contains directories, you can use the
# EXAMPLE_PATTERNS tag to specify one or more wildcard pattern (like *.cpp and
# *.h) to filter out the source-files in the directories. If left blank all
# files are included.

EXAMPLE_PATTERNS       = *

# If the EXAMPLE_RECURSIVE tag is set to YES then subdirectories will be
# searched for input files to be used with the \include or \dontinclude commands
# irrespective of the value of the RECURSIVE tag.
# The default value is: NO.

EXAMPLE_RECURSIVE      = NO

# The IMAGE_PATH tag can be used to specify one or more files or directories
# that contain images that are to be included in the documentation (see the
# \image command).

IMAGE_PATH             =

# The INPUT_FILTER tag can be used to specify a program that doxygen should
# invoke to filter for each input file. Doxygen will invoke the filter program
# by executing (via popen()) the command:
#
#   <filter> <input-file>
#
# where <filter> is the value of the INPUT_FILTER tag, and <input-file> is the
# name of an input file. Doxygen will then use the output that the filter
# program writes to standard output. If FILTER_PATTERNS is specified, this tag
# will be ignored.
#
# Note that the filter must not add or remove lines; it is applied before the
# code is scanned, but not when the output code is generated. If lines are added
# or removed, the anchors will not be placed correctly.
#
# Note that for custom extensions or not directly supported extensions you also
# need to set EXTENSION_MAPPING for the extension otherwise the files are not
# properly processed by doxygen.
INPUT_FILTER           =

# The FILTER_PATTERNS tag can be used to specify filters on a per file pattern
# basis. Doxygen will compare the file name with each pattern and apply the
# filter if there is a match. The filters are a list of the form: pattern=filter
# (like *.cpp=my_cpp_filter). See INPUT_FILTER for further information on how
# filters are used. If the FILTER_PATTERNS tag is empty or if none of the
# patterns match the file name, INPUT_FILTER is applied.
#
# Note that for custom extensions or not directly supported extensions you also
# need to set EXTENSION_MAPPING for the extension otherwise the files are not
# properly processed by doxygen.

FILTER_PATTERNS        =

# If the FILTER_SOURCE_FILES tag is set to YES, the input filter (if set using
# INPUT_FILTER) will also be used to filter the input files that are used for
# producing the source files to browse (i.e. when SOURCE_BROWSER is set to YES).
# The default value is: NO.

FILTER_SOURCE_FILES    = NO

# The FILTER_SOURCE_PATTERNS tag can be used to specify source filters per file
# pattern. A pattern will override the setting for FILTER_PATTERN (if any) and
# it is also possible to disable source filtering for a specific pattern using
# *.ext= (so without naming a filter).
# This tag requires that the tag FILTER_SOURCE_FILES is set to YES.

FILTER_SOURCE_PATTERNS =

# If the USE_MDFILE_AS_MAINPAGE tag refers to the name of a markdown file that
# is part of the input, its contents will be placed on the main page
# (index.html). This can be useful if you have a project on for instance GitHub
# and want to reuse the introduction page also for the doxygen output.

USE_MDFILE_AS_MAINPAGE = ./README.md

#---------------------------------------------------------------------------
# Configuration options related to source browsing
#---------------------------------------------------------------------------

# If the SOURCE_BROWSER tag is set to YES then a list of source files will be
# generated.
# Documented entities will be cross-referenced with these sources.
#
# Note: To get rid of all source code in the generated output, make sure that
# also VERBATIM_HEADERS is set to NO.
# The default value is: NO.

SOURCE_BROWSER         = NO

# Setting the INLINE_SOURCES tag to YES will include the body of functions,
# classes and enums directly into the documentation.
# The default value is: NO.

INLINE_SOURCES         = NO

# Setting the STRIP_CODE_COMMENTS tag to YES will instruct doxygen to hide any
# special comment blocks from generated source code fragments. Normal C, C++ and
# Fortran comments will always remain visible.
# The default value is: YES.

STRIP_CODE_COMMENTS    = YES

# If the REFERENCED_BY_RELATION tag is set to YES then for each documented
# function all documented functions referencing it will be listed.
# The default value is: NO.

REFERENCED_BY_RELATION = NO

# If the REFERENCES_RELATION tag is set to YES then for each documented function
# all documented entities called/used by that function will be listed.
# The default value is: NO.

REFERENCES_RELATION    = NO

# If the REFERENCES_LINK_SOURCE tag is set to YES and SOURCE_BROWSER tag is set
# to YES then the hyperlinks from functions in REFERENCES_RELATION and
# REFERENCED_BY_RELATION lists will link to the source code. Otherwise they will
# link to the documentation.
# The default value is: YES.

REFERENCES_LINK_SOURCE = YES

# If SOURCE_TOOLTIPS is enabled (the default) then hovering a hyperlink in the
# source code will show a tooltip with additional information such as prototype,
# brief description and links to the definition and documentation. Since this
# will make the HTML file larger and loading of large files a bit slower, you
# can opt to disable this feature.
# The default value is: YES.
# This tag requires that the tag SOURCE_BROWSER is set to YES.
SOURCE_TOOLTIPS        = YES

# If the USE_HTAGS tag is set to YES then the references to source code will
# point to the HTML generated by the htags(1) tool instead of doxygen built-in
# source browser. The htags tool is part of GNU's global source tagging system
# (see http://www.gnu.org/software/global/global.html). You will need version
# 4.8.6 or higher.
#
# To use it do the following:
# - Install the latest version of global
# - Enable SOURCE_BROWSER and USE_HTAGS in the config file
# - Make sure the INPUT points to the root of the source tree
# - Run doxygen as normal
#
# Doxygen will invoke htags (and that will in turn invoke gtags), so these
# tools must be available from the command line (i.e. in the search path).
#
# The result: instead of the source browser generated by doxygen, the links to
# source code will now point to the output of htags.
# The default value is: NO.
# This tag requires that the tag SOURCE_BROWSER is set to YES.

USE_HTAGS              = NO

# If the VERBATIM_HEADERS tag is set to YES then doxygen will generate a
# verbatim copy of the header file for each class for which an include is
# specified. Set to NO to disable this.
# See also: Section \class.
# The default value is: YES.

VERBATIM_HEADERS       = YES

# If the CLANG_ASSISTED_PARSING tag is set to YES then doxygen will use the
# clang parser (see: http://clang.llvm.org/) for more accurate parsing at the
# cost of reduced performance. This can be particularly helpful with template
# rich C++ code for which doxygen's built-in parser lacks the necessary type
# information.
# Note: The availability of this option depends on whether or not doxygen was
# generated with the -Duse-libclang=ON option for CMake.
# The default value is: NO.

CLANG_ASSISTED_PARSING = NO

# If clang assisted parsing is enabled you can provide the compiler with command
# line options that you would normally use when invoking the compiler.
# Note that the include paths will already be set by doxygen for the files and
# directories specified with INPUT and INCLUDE_PATH.
# This tag requires that the tag CLANG_ASSISTED_PARSING is set to YES.

CLANG_OPTIONS          =

#---------------------------------------------------------------------------
# Configuration options related to the alphabetical class index
#---------------------------------------------------------------------------

# If the ALPHABETICAL_INDEX tag is set to YES, an alphabetical index of all
# compounds will be generated. Enable this if the project contains a lot of
# classes, structs, unions or interfaces.
# The default value is: YES.

ALPHABETICAL_INDEX     = YES

# The COLS_IN_ALPHA_INDEX tag can be used to specify the number of columns in
# which the alphabetical index list will be split.
# Minimum value: 1, maximum value: 20, default value: 5.
# This tag requires that the tag ALPHABETICAL_INDEX is set to YES.

COLS_IN_ALPHA_INDEX    = 5

# In case all classes in a project start with a common prefix, all classes will
# be put under the same header in the alphabetical index. The IGNORE_PREFIX tag
# can be used to specify a prefix (or a list of prefixes) that should be ignored
# while generating the index headers.
# This tag requires that the tag ALPHABETICAL_INDEX is set to YES.

IGNORE_PREFIX          =

#---------------------------------------------------------------------------
# Configuration options related to the HTML output
#---------------------------------------------------------------------------

# If the GENERATE_HTML tag is set to YES, doxygen will generate HTML output
# The default value is: YES.

GENERATE_HTML          = YES

# The HTML_OUTPUT tag is used to specify where the HTML docs will be put. If a
# relative path is entered the value of OUTPUT_DIRECTORY will be put in front of
# it.
# The default directory is: html.
# This tag requires that the tag GENERATE_HTML is set to YES.
HTML_OUTPUT            = html

# The HTML_FILE_EXTENSION tag can be used to specify the file extension for each
# generated HTML page (for example: .htm, .php, .asp).
# The default value is: .html.
# This tag requires that the tag GENERATE_HTML is set to YES.

HTML_FILE_EXTENSION    = .html

# The HTML_HEADER tag can be used to specify a user-defined HTML header file for
# each generated HTML page. If the tag is left blank doxygen will generate a
# standard header.
#
# To get valid HTML the header file that includes any scripts and style sheets
# that doxygen needs, which is dependent on the configuration options used (e.g.
# the setting GENERATE_TREEVIEW). It is highly recommended to start with a
# default header using
# doxygen -w html new_header.html new_footer.html new_stylesheet.css
# YourConfigFile
# and then modify the file new_header.html. See also section "Doxygen usage"
# for information on how to generate the default header that doxygen normally
# uses.
# Note: The header is subject to change so you typically have to regenerate the
# default header when upgrading to a newer version of doxygen. For a description
# of the possible markers and block names see the documentation.
# This tag requires that the tag GENERATE_HTML is set to YES.

HTML_HEADER            =

# The HTML_FOOTER tag can be used to specify a user-defined HTML footer for each
# generated HTML page. If the tag is left blank doxygen will generate a standard
# footer. See HTML_HEADER for more information on how to generate a default
# footer and what special commands can be used inside the footer. See also
# section "Doxygen usage" for information on how to generate the default footer
# that doxygen normally uses.
# This tag requires that the tag GENERATE_HTML is set to YES.

HTML_FOOTER            =

# The HTML_STYLESHEET tag can be used to specify a user-defined cascading style
# sheet that is used by each HTML page. It can be used to fine-tune the look of
# the HTML output.
# If left blank doxygen will generate a default style sheet.
# See also section "Doxygen usage" for information on how to generate the style
# sheet that doxygen normally uses.
# Note: It is recommended to use HTML_EXTRA_STYLESHEET instead of this tag, as
# it is more robust and this tag (HTML_STYLESHEET) will in the future become
# obsolete.
# This tag requires that the tag GENERATE_HTML is set to YES.

HTML_STYLESHEET        =

# The HTML_EXTRA_STYLESHEET tag can be used to specify additional user-defined
# cascading style sheets that are included after the standard style sheets
# created by doxygen. Using this option one can overrule certain style aspects.
# This is preferred over using HTML_STYLESHEET since it does not replace the
# standard style sheet and is therefore more robust against future updates.
# Doxygen will copy the style sheet files to the output directory.
# Note: The order of the extra style sheet files is of importance (e.g. the last
# style sheet in the list overrules the setting of the previous ones in the
# list). For an example see the documentation.
# This tag requires that the tag GENERATE_HTML is set to YES.

HTML_EXTRA_STYLESHEET  =

# The HTML_EXTRA_FILES tag can be used to specify one or more extra images or
# other source files which should be copied to the HTML output directory. Note
# that these files will be copied to the base HTML output directory. Use the
# $relpath^ marker in the HTML_HEADER and/or HTML_FOOTER files to load these
# files. In the HTML_STYLESHEET file, use the file name only. Also note that the
# files will be copied as-is; there are no commands or markers available.
# This tag requires that the tag GENERATE_HTML is set to YES.

HTML_EXTRA_FILES       =

# The HTML_COLORSTYLE_HUE tag controls the color of the HTML output. Doxygen
# will adjust the colors in the style sheet and background images according to
# this color. Hue is specified as an angle on a colorwheel, see
# http://en.wikipedia.org/wiki/Hue for more information.
# For instance the value 0 represents red, 60 is yellow, 120 is green, 180 is
# cyan, 240 is blue, 300 purple, and 360 is red again.
# Minimum value: 0, maximum value: 359, default value: 220.
# This tag requires that the tag GENERATE_HTML is set to YES.

HTML_COLORSTYLE_HUE    = 220

# The HTML_COLORSTYLE_SAT tag controls the purity (or saturation) of the colors
# in the HTML output. For a value of 0 the output will use grayscales only. A
# value of 255 will produce the most vivid colors.
# Minimum value: 0, maximum value: 255, default value: 100.
# This tag requires that the tag GENERATE_HTML is set to YES.

HTML_COLORSTYLE_SAT    = 100

# The HTML_COLORSTYLE_GAMMA tag controls the gamma correction applied to the
# luminance component of the colors in the HTML output. Values below 100
# gradually make the output lighter, whereas values above 100 make the output
# darker. The value divided by 100 is the actual gamma applied, so 80 represents
# a gamma of 0.8, The value 220 represents a gamma of 2.2, and 100 does not
# change the gamma.
# Minimum value: 40, maximum value: 240, default value: 80.
# This tag requires that the tag GENERATE_HTML is set to YES.

HTML_COLORSTYLE_GAMMA  = 80

# If the HTML_TIMESTAMP tag is set to YES then the footer of each generated HTML
# page will contain the date and time when the page was generated. Setting this
# to YES can help to show when doxygen was last run and thus if the
# documentation is up to date.
# The default value is: NO.
# This tag requires that the tag GENERATE_HTML is set to YES.

HTML_TIMESTAMP         = NO

# If the HTML_DYNAMIC_SECTIONS tag is set to YES then the generated HTML
# documentation will contain sections that can be hidden and shown after the
# page has loaded.
# The default value is: NO.
# This tag requires that the tag GENERATE_HTML is set to YES.
HTML_DYNAMIC_SECTIONS  = NO

# With HTML_INDEX_NUM_ENTRIES one can control the preferred number of entries
# shown in the various tree structured indices initially; the user can expand
# and collapse entries dynamically later on. Doxygen will expand the tree to
# such a level that at most the specified number of entries are visible (unless
# a fully collapsed tree already exceeds this amount). So setting the number of
# entries 1 will produce a full collapsed tree by default. 0 is a special value
# representing an infinite number of entries and will result in a full expanded
# tree by default.
# Minimum value: 0, maximum value: 9999, default value: 100.
# This tag requires that the tag GENERATE_HTML is set to YES.

HTML_INDEX_NUM_ENTRIES = 100

# If the GENERATE_DOCSET tag is set to YES, additional index files will be
# generated that can be used as input for Apple's Xcode 3 integrated development
# environment (see: http://developer.apple.com/tools/xcode/), introduced with
# OSX 10.5 (Leopard). To create a documentation set, doxygen will generate a
# Makefile in the HTML output directory. Running make will produce the docset in
# that directory and running make install will install the docset in
# ~/Library/Developer/Shared/Documentation/DocSets so that Xcode will find it at
# startup. See http://developer.apple.com/tools/creatingdocsetswithdoxygen.html
# for more information.
# The default value is: NO.
# This tag requires that the tag GENERATE_HTML is set to YES.

GENERATE_DOCSET        = NO

# This tag determines the name of the docset feed. A documentation feed provides
# an umbrella under which multiple documentation sets from a single provider
# (such as a company or product suite) can be grouped.
# The default value is: Doxygen generated docs.
# This tag requires that the tag GENERATE_DOCSET is set to YES.

DOCSET_FEEDNAME        = "Doxygen generated docs"

# This tag specifies a string that should uniquely identify the documentation
# set bundle.
# This should be a reverse domain-name style string, e.g.
# com.mycompany.MyDocSet. Doxygen will append .docset to the name.
# The default value is: org.doxygen.Project.
# This tag requires that the tag GENERATE_DOCSET is set to YES.

DOCSET_BUNDLE_ID       = org.doxygen.Project

# The DOCSET_PUBLISHER_ID tag specifies a string that should uniquely identify
# the documentation publisher. This should be a reverse domain-name style
# string, e.g. com.mycompany.MyDocSet.documentation.
# The default value is: org.doxygen.Publisher.
# This tag requires that the tag GENERATE_DOCSET is set to YES.

DOCSET_PUBLISHER_ID    = org.doxygen.Publisher

# The DOCSET_PUBLISHER_NAME tag identifies the documentation publisher.
# The default value is: Publisher.
# This tag requires that the tag GENERATE_DOCSET is set to YES.

DOCSET_PUBLISHER_NAME  = Publisher

# If the GENERATE_HTMLHELP tag is set to YES then doxygen generates three
# additional HTML index files: index.hhp, index.hhc, and index.hhk. The
# index.hhp is a project file that can be read by Microsoft's HTML Help Workshop
# (see: http://www.microsoft.com/en-us/download/details.aspx?id=21138) on
# Windows.
#
# The HTML Help Workshop contains a compiler that can convert all HTML output
# generated by doxygen into a single compiled HTML file (.chm). Compiled HTML
# files are now used as the Windows 98 help format, and will replace the old
# Windows help format (.hlp) on all Windows platforms in the future. Compressed
# HTML files also contain an index, a table of contents, and you can search for
# words in the documentation. The HTML workshop also contains a viewer for
# compressed HTML files.
# The default value is: NO.
# This tag requires that the tag GENERATE_HTML is set to YES.

GENERATE_HTMLHELP      = NO

# The CHM_FILE tag can be used to specify the file name of the resulting .chm
# file. You can add a path in front of the file if the result should not be
# written to the html output directory.
# This tag requires that the tag GENERATE_HTMLHELP is set to YES.

CHM_FILE               =

# The HHC_LOCATION tag can be used to specify the location (absolute path
# including file name) of the HTML help compiler (hhc.exe). If non-empty,
# doxygen will try to run the HTML help compiler on the generated index.hhp.
# The file has to be specified with full path.
# This tag requires that the tag GENERATE_HTMLHELP is set to YES.

HHC_LOCATION           =

# The GENERATE_CHI flag controls if a separate .chi index file is generated
# (YES) or that it should be included in the master .chm file (NO).
# The default value is: NO.
# This tag requires that the tag GENERATE_HTMLHELP is set to YES.

GENERATE_CHI           = NO

# The CHM_INDEX_ENCODING is used to encode HtmlHelp index (hhk), content (hhc)
# and project file content.
# This tag requires that the tag GENERATE_HTMLHELP is set to YES.

CHM_INDEX_ENCODING     =

# The BINARY_TOC flag controls whether a binary table of contents is generated
# (YES) or a normal table of contents (NO) in the .chm file. Furthermore it
# enables the Previous and Next buttons.
# The default value is: NO.
# This tag requires that the tag GENERATE_HTMLHELP is set to YES.

BINARY_TOC             = NO

# The TOC_EXPAND flag can be set to YES to add extra items for group members to
# the table of contents of the HTML help documentation and to the tree view.
# The default value is: NO.
# This tag requires that the tag GENERATE_HTMLHELP is set to YES.

TOC_EXPAND             = NO

# If the GENERATE_QHP tag is set to YES and both QHP_NAMESPACE and
# QHP_VIRTUAL_FOLDER are set, an additional index file will be generated that
# can be used as input for Qt's qhelpgenerator to generate a Qt Compressed Help
# (.qch) of the generated HTML documentation.
# The default value is: NO.
# This tag requires that the tag GENERATE_HTML is set to YES.

GENERATE_QHP           = NO

# If the QHG_LOCATION tag is specified, the QCH_FILE tag can be used to specify
# the file name of the resulting .qch file.
# The path specified is relative to the HTML output folder.
# This tag requires that the tag GENERATE_QHP is set to YES.

QCH_FILE               =

# The QHP_NAMESPACE tag specifies the namespace to use when generating Qt Help
# Project output. For more information please see Qt Help Project / Namespace
# (see: http://qt-project.org/doc/qt-4.8/qthelpproject.html#namespace).
# The default value is: org.doxygen.Project.
# This tag requires that the tag GENERATE_QHP is set to YES.

QHP_NAMESPACE          = org.doxygen.Project

# The QHP_VIRTUAL_FOLDER tag specifies the namespace to use when generating Qt
# Help Project output. For more information please see Qt Help Project / Virtual
# Folders (see: http://qt-project.org/doc/qt-4.8/qthelpproject.html#virtual-
# folders).
# The default value is: doc.
# This tag requires that the tag GENERATE_QHP is set to YES.

QHP_VIRTUAL_FOLDER     = doc

# If the QHP_CUST_FILTER_NAME tag is set, it specifies the name of a custom
# filter to add. For more information please see Qt Help Project / Custom
# Filters (see: http://qt-project.org/doc/qt-4.8/qthelpproject.html#custom-
# filters).
# This tag requires that the tag GENERATE_QHP is set to YES.

QHP_CUST_FILTER_NAME   =

# The QHP_CUST_FILTER_ATTRS tag specifies the list of the attributes of the
# custom filter to add. For more information please see Qt Help Project / Custom
# Filters (see: http://qt-project.org/doc/qt-4.8/qthelpproject.html#custom-
# filters).
# This tag requires that the tag GENERATE_QHP is set to YES.

QHP_CUST_FILTER_ATTRS  =

# The QHP_SECT_FILTER_ATTRS tag specifies the list of the attributes this
# project's filter section matches. Qt Help Project / Filter Attributes (see:
# http://qt-project.org/doc/qt-4.8/qthelpproject.html#filter-attributes).
# This tag requires that the tag GENERATE_QHP is set to YES.

QHP_SECT_FILTER_ATTRS  =

# The QHG_LOCATION tag can be used to specify the location of Qt's
# qhelpgenerator. If non-empty doxygen will try to run qhelpgenerator on the
# generated .qhp file.
# This tag requires that the tag GENERATE_QHP is set to YES.

QHG_LOCATION           =

# If the GENERATE_ECLIPSEHELP tag is set to YES, additional index files will be
# generated, together with the HTML files, they form an Eclipse help plugin. To
# install this plugin and make it available under the help contents menu in
# Eclipse, the contents of the directory containing the HTML and XML files needs
# to be copied into the plugins directory of eclipse. The name of the directory
# within the plugins directory should be the same as the ECLIPSE_DOC_ID value.
# After copying Eclipse needs to be restarted before the help appears.
# The default value is: NO.
# This tag requires that the tag GENERATE_HTML is set to YES.

GENERATE_ECLIPSEHELP   = NO

# A unique identifier for the Eclipse help plugin. When installing the plugin
# the directory name containing the HTML and XML files should also have this
# name. Each documentation set should have its own identifier.
# The default value is: org.doxygen.Project.
# This tag requires that the tag GENERATE_ECLIPSEHELP is set to YES.

ECLIPSE_DOC_ID         = org.doxygen.Project

# If you want full control over the layout of the generated HTML pages it might
# be necessary to disable the index and replace it with your own. The
# DISABLE_INDEX tag can be used to turn on/off the condensed index (tabs) at top
# of each HTML page. A value of NO enables the index and the value YES disables
# it. Since the tabs in the index contain the same information as the navigation
# tree, you can set this option to YES if you also set GENERATE_TREEVIEW to YES.
# The default value is: NO.
# This tag requires that the tag GENERATE_HTML is set to YES.

DISABLE_INDEX          = NO

# The GENERATE_TREEVIEW tag is used to specify whether a tree-like index
# structure should be generated to display hierarchical information. If the tag
# value is set to YES, a side panel will be generated containing a tree-like
# index structure (just like the one that is generated for HTML Help).
# For this to work a browser that supports JavaScript, DHTML, CSS and frames is
# required (i.e. any modern browser). Windows users are probably better off
# using the HTML help feature. Via custom style sheets (see
# HTML_EXTRA_STYLESHEET) one can further fine-tune the look of the index. As an
# example, the default style sheet generated by doxygen has an example that
# shows how to put an image at the root of the tree instead of the PROJECT_NAME.
# Since the tree basically has the same information as the tab index, you could
# consider setting DISABLE_INDEX to YES when enabling this option.
# The default value is: NO.
# This tag requires that the tag GENERATE_HTML is set to YES.

GENERATE_TREEVIEW      = NO

# The ENUM_VALUES_PER_LINE tag can be used to set the number of enum values that
# doxygen will group on one line in the generated HTML documentation.
#
# Note that a value of 0 will completely suppress the enum values from appearing
# in the overview section.
# Minimum value: 0, maximum value: 20, default value: 4.
# This tag requires that the tag GENERATE_HTML is set to YES.

ENUM_VALUES_PER_LINE   = 4

# If the treeview is enabled (see GENERATE_TREEVIEW) then this tag can be used
# to set the initial width (in pixels) of the frame in which the tree is shown.
# Minimum value: 0, maximum value: 1500, default value: 250.
# This tag requires that the tag GENERATE_HTML is set to YES.

TREEVIEW_WIDTH         = 250

# If the EXT_LINKS_IN_WINDOW option is set to YES, doxygen will open links to
# external symbols imported via tag files in a separate window.
# The default value is: NO.
# This tag requires that the tag GENERATE_HTML is set to YES.

EXT_LINKS_IN_WINDOW    = NO

# Use this tag to change the font size of LaTeX formulas included as images in
# the HTML documentation. When you change the font size after a successful
# doxygen run you need to manually remove any form_*.png images from the HTML
# output directory to force them to be regenerated.
# Minimum value: 8, maximum value: 50, default value: 10.
# This tag requires that the tag GENERATE_HTML is set to YES.

FORMULA_FONTSIZE       = 10

# Use the FORMULA_TRANSPARENT tag to determine whether or not the images
# generated for formulas are transparent PNGs. Transparent PNGs are not
# supported properly for IE 6.0, but are supported on all modern browsers.
#
# Note that when changing this option you need to delete any form_*.png files in
# the HTML output directory before the changes have effect.
# The default value is: YES.
# This tag requires that the tag GENERATE_HTML is set to YES.

FORMULA_TRANSPARENT    = YES

# Enable the USE_MATHJAX option to render LaTeX formulas using MathJax (see
# http://www.mathjax.org) which uses client side Javascript for the rendering
# instead of using pre-rendered bitmaps. Use this if you do not have LaTeX
# installed or if you want the formulas to look prettier in the HTML output.
# When enabled you may also need to install MathJax separately and configure
# the path to it using the MATHJAX_RELPATH option.
# The default value is: NO.
# This tag requires that the tag GENERATE_HTML is set to YES.

USE_MATHJAX            = NO

# When MathJax is enabled you can set the default output format to be used for
# the MathJax output. See the MathJax site (see:
# http://docs.mathjax.org/en/latest/output.html) for more details.
# Possible values are: HTML-CSS (which is slower, but has the best
# compatibility), NativeMML (i.e. MathML) and SVG.
# The default value is: HTML-CSS.
# This tag requires that the tag USE_MATHJAX is set to YES.

MATHJAX_FORMAT         = HTML-CSS

# When MathJax is enabled you need to specify the location relative to the HTML
# output directory using the MATHJAX_RELPATH option. The destination directory
# should contain the MathJax.js script. For instance, if the mathjax directory
# is located at the same level as the HTML output directory, then
# MATHJAX_RELPATH should be ../mathjax.
# The default value points to the MathJax
# Content Delivery Network so you can quickly see the result without installing
# MathJax. However, it is strongly recommended to install a local copy of
# MathJax from http://www.mathjax.org before deployment.
# The default value is: http://cdn.mathjax.org/mathjax/latest.
# This tag requires that the tag USE_MATHJAX is set to YES.

MATHJAX_RELPATH        = http://cdn.mathjax.org/mathjax/latest

# The MATHJAX_EXTENSIONS tag can be used to specify one or more MathJax
# extension names that should be enabled during MathJax rendering. For example
# MATHJAX_EXTENSIONS = TeX/AMSmath TeX/AMSsymbols
# This tag requires that the tag USE_MATHJAX is set to YES.

MATHJAX_EXTENSIONS     =

# The MATHJAX_CODEFILE tag can be used to specify a file with javascript pieces
# of code that will be used on startup of the MathJax code. See the MathJax site
# (see: http://docs.mathjax.org/en/latest/output.html) for more details. For an
# example see the documentation.
# This tag requires that the tag USE_MATHJAX is set to YES.

MATHJAX_CODEFILE       =

# When the SEARCHENGINE tag is enabled doxygen will generate a search box for
# the HTML output. The underlying search engine uses javascript and DHTML and
# should work on any modern browser. Note that when using HTML help
# (GENERATE_HTMLHELP), Qt help (GENERATE_QHP), or docsets (GENERATE_DOCSET)
# there is already a search function so this one should typically be disabled.
# For large projects the javascript based search engine can be slow, then
# enabling SERVER_BASED_SEARCH may provide a better solution. It is possible to
# search using the keyboard; to jump to the search box use <access key> + S
# (what the <access key> is depends on the OS and browser, but it is typically
# <CTRL>, <ALT>/