Repository: intel/ros2_grasp_library
Branch: master
Commit: 980b7ddd4348
Files: 139
Total size: 13.7 MB
Directory structure:
gitextract_b6simm2f/
├── .gitignore
├── CHANGELOG.rst
├── LICENSE
├── README.md
├── docker/
│ ├── Dockerfile
│ ├── README.md
│ ├── script/
│ │ ├── 00_ros2_install.sh
│ │ ├── 10_eigen_install.sh
│ │ ├── 11_libpcl_install.sh
│ │ ├── 12_opencv_install.sh
│ │ ├── 13_openvino_install.sh
│ │ ├── 20_librealsense_install.sh
│ │ ├── 30_gpg_install.sh
│ │ ├── 31_gpd_install.sh
│ │ ├── 32_ur_modern_driver_install.sh
│ │ ├── 50_ros2_deps.sh
│ │ ├── install_ros2_grasp_library.sh
│ │ ├── ros_entrypoint.sh
│ │ └── ros_env.sh
│ └── setup_docker_display.sh
├── grasp_apps/
│ ├── draw_x/
│ │ ├── CMakeLists.txt
│ │ ├── launch/
│ │ │ ├── draw_x.launch.py
│ │ │ └── draw_x.yaml
│ │ ├── package.xml
│ │ └── src/
│ │ └── draw_x.cpp
│ ├── fixed_position_pick/
│ │ ├── CMakeLists.txt
│ │ ├── launch/
│ │ │ ├── fixed_position_pick.launch.py
│ │ │ └── fixed_position_pick.yaml
│ │ ├── package.xml
│ │ └── src/
│ │ └── fixed_position_pick.cpp
│ ├── random_pick/
│ │ ├── CMakeLists.txt
│ │ ├── cfg/
│ │ │ └── random_pick.yaml
│ │ ├── package.xml
│ │ └── src/
│ │ └── random_pick.cpp
│ └── recognize_pick/
│ ├── CMakeLists.txt
│ ├── cfg/
│ │ └── recognize_pick.yaml
│ ├── package.xml
│ └── src/
│ ├── place_publisher.cpp
│ └── recognize_pick.cpp
├── grasp_msgs/
│ ├── CMakeLists.txt
│ ├── msg/
│ │ ├── CloudIndexed.msg
│ │ ├── CloudSamples.msg
│ │ ├── CloudSources.msg
│ │ ├── GraspConfig.msg
│ │ ├── GraspConfigList.msg
│ │ └── SamplesMsg.msg
│ └── package.xml
├── grasp_ros2/
│ ├── CMakeLists.txt
│ ├── cfg/
│ │ ├── grasp_ros2_params.yaml
│ │ ├── random_pick.yaml
│ │ ├── recognize_pick.yaml
│ │ └── test_grasp_ros2.yaml
│ ├── include/
│ │ └── grasp_library/
│ │ └── ros2/
│ │ ├── consts.hpp
│ │ ├── grasp_detector_base.hpp
│ │ ├── grasp_detector_gpd.hpp
│ │ ├── grasp_planner.hpp
│ │ └── ros_params.hpp
│ ├── package.xml
│ ├── src/
│ │ ├── consts.cpp
│ │ ├── grasp_composition.cpp
│ │ ├── grasp_detector_gpd.cpp
│ │ ├── grasp_planner.cpp
│ │ └── ros_params.cpp
│ └── tests/
│ ├── CMakeLists.txt
│ ├── resource/
│ │ └── table_top.pcd
│ ├── tgrasp_ros2.cpp
│ └── tgrasp_ros2.h.in
├── grasp_tutorials/
│ ├── CMakeLists.txt
│ ├── README.md
│ ├── _static/
│ │ └── images/
│ │ └── workflow.vsdx
│ ├── conf.py
│ ├── doc/
│ │ ├── bringup_robot.rst
│ │ ├── draw_x.rst
│ │ ├── fixed_position_pick.rst
│ │ ├── getting_start.rst
│ │ ├── grasp_api.rst
│ │ ├── grasp_planner.rst
│ │ ├── grasp_ros2/
│ │ │ ├── install_gpd.md
│ │ │ ├── install_openvino.md
│ │ │ ├── tutorials_1_grasp_ros2_with_camera.md
│ │ │ ├── tutorials_2_grasp_ros2_test.md
│ │ │ └── tutorials_3_grasp_ros2_launch_options.md
│ │ ├── handeye_calibration.rst
│ │ ├── overview.rst
│ │ ├── random_pick.rst
│ │ ├── recognize_pick.rst
│ │ ├── robot_interface.rst
│ │ └── template.rst
│ ├── index.rst
│ └── package.xml
├── grasp_utils/
│ ├── handeye_dashboard/
│ │ ├── README.md
│ │ ├── config/
│ │ │ └── Default.perspective
│ │ ├── data/
│ │ │ ├── camera-robot.json
│ │ │ └── dataset.json
│ │ ├── launch/
│ │ │ └── handeye_dashboard.launch.py
│ │ ├── package.xml
│ │ ├── plugin.xml
│ │ ├── resource/
│ │ │ └── handeye_dashboard
│ │ ├── setup.py
│ │ └── src/
│ │ └── handeye_dashboard/
│ │ ├── __init__.py
│ │ ├── handeye_calibration.py
│ │ ├── main.py
│ │ └── plugin.py
│ ├── handeye_target_detection/
│ │ ├── .clang-format
│ │ ├── CMakeLists.txt
│ │ ├── README.md
│ │ ├── include/
│ │ │ └── PoseEstimator.h
│ │ ├── launch/
│ │ │ ├── pose_estimation.launch.py
│ │ │ └── pose_estimation.yaml
│ │ ├── package.xml
│ │ └── src/
│ │ ├── pose_estimation_node.cpp
│ │ └── pose_estimator.cpp
│ ├── handeye_tf_service/
│ │ ├── CMakeLists.txt
│ │ ├── README.md
│ │ ├── package.xml
│ │ ├── src/
│ │ │ └── handeye_tf_server.cpp
│ │ └── srv/
│ │ └── HandeyeTF.srv
│ └── robot_interface/
│ ├── CMakeLists.txt
│ ├── Doxyfile
│ ├── README.md
│ ├── include/
│ │ └── robot_interface/
│ │ ├── control_base.hpp
│ │ └── control_ur.hpp
│ ├── launch/
│ │ ├── ur_test.launch.py
│ │ └── ur_test.yaml
│ ├── package.xml
│ ├── src/
│ │ ├── control_base.cpp
│ │ └── control_ur.cpp
│ └── test/
│ ├── ur_test_move_command.cpp
│ └── ur_test_state_publish.cpp
└── moveit_msgs_light/
├── CMakeLists.txt
├── README.md
├── msg/
│ ├── CollisionObject.msg
│ ├── Grasp.msg
│ ├── GripperTranslation.msg
│ ├── MoveItErrorCodes.msg
│ ├── ObjectType.msg
│ └── PlaceLocation.msg
├── package.xml
└── srv/
└── GraspPlanning.srv
================================================
FILE CONTENTS
================================================
================================================
FILE: .gitignore
================================================
# rviz file
*.rviz
# document build file
build
_build
# vscode file
.vscode
================================================
FILE: CHANGELOG.rst
================================================
changelog for ros2_grasp_library
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
0.5.0 (2019-11-06)
------------------
* Added examples for advanced industrial robot applications
* draw X
* fixed position pick and place
* random picking with OpenVINO grasp planning
* recognition picking with OpenVINO grasp planning and OpenVINO Mask R-CNN object segmentation
* Support ROS2 hand-eye calibration
* Support robot interface for manipulation
* Added tutorials on how to
* Build and launch example applications
* Operate hand-eye calibration and publish the transformation
* Quickly enable robot interface on a new industrial robot
0.4.0 (2019-03-13)
------------------
* Support a "service-driven" grasp detection mechanism (via the auto_mode configuration) to optimize CPU load for real-time processing.
* Support grasp transformation from camera frame to a specified target frame expected in the visual manipulation.
* Support launch option "grasp_approach" to specify the expected approach direction in the target frame specified by 'grasp_frame_id'. Grasp Planner will return grasp poses whose approach direction is close to this parameter.
* Support launch option "device" to configure the device on which grasp pose inference executes: 0 for CPU, 1 for GPU, 2 for VPU, 3 for FPGA. When OpenVINO plug-ins are installed (tutorial), this option deploys the CNN based deep learning inference onto the target device.
* Add tutorials introducing the Intel DLDT toolkit and the Intel OpenVINO toolkit.
* Add tutorials for launch options and customization notes.
0.3.0 (2018-12-28)
------------------
* Support grasp pose detection from RGBD point cloud.
* Support MoveIt! grasp planning service.
================================================
FILE: LICENSE
================================================
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright 2018 Intel Corporation
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
================================================
FILE: README.md
================================================
# DISCONTINUATION OF PROJECT #
This project will no longer be maintained by Intel.
Intel has ceased development and contributions including, but not limited to, maintenance, bug fixes, new releases, or updates, to this project.
Intel no longer accepts patches to this project.
If you have an ongoing need to use this project, are interested in independently developing it, or would like to maintain patches for the open source software community, please create your own fork of this project.
Contact: webadmin@linux.intel.com
# ROS2 Grasp Library
A ROS2 intelligent visual grasp solution for advanced industrial usages, with OpenVINO™ grasp detection and MoveIt Grasp Planning.
## Overview
ROS2 Grasp Library enables state-of-the-art CNN based deep learning grasp detection algorithms on ROS2 for intelligent visual grasp in industrial robot usage scenarios. This package provides ROS2 interfaces compliant with the open source [MoveIt](http://moveit.ros.org/) motion planning framework supported by most of the [robot models](https://moveit.ros.org/robots) in ROS industrial. This package delivers
* A ROS2 Grasp Planner providing grasp planning service, as an extensible capability of MoveIt ([moveit_msgs::srv::GraspPlanning](http://docs.ros.org/api/moveit_msgs/html/srv/GraspPlanning.html)), translating grasp detection results into MoveIt Interfaces ([moveit_msgs::msg::Grasp](http://docs.ros.org/api/moveit_msgs/html/msg/Grasp.html))
* A ROS2 Grasp Detector abstracting interfaces for grasp detection results
* A ROS2 hand-eye calibration module generating transformation from camera frame to robot frame
* ROS2 example applications demonstrating how to use this ROS2 Grasp Library in advanced industrial usages for intelligent visual grasp
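The Grasp Planner's translation step above can be sketched in miniature: a GPD-style grasp is described by three orthonormal frame vectors (approach, binormal, axis, as in grasp_msgs' `GraspConfig`), and the pose orientation a `moveit_msgs::msg::Grasp` carries is the quaternion of the rotation matrix those vectors form. The helper below is a hypothetical illustration of that conversion, not the library's actual code:

```python
import math

def grasp_orientation(approach, binormal, axis):
    """Build a unit quaternion (x, y, z, w) from the three orthonormal
    column vectors of a grasp frame (GPD-style approach/binormal/axis)."""
    # Rotation matrix with the grasp vectors as columns.
    r = [[approach[i], binormal[i], axis[i]] for i in range(3)]
    trace = r[0][0] + r[1][1] + r[2][2]
    if trace > 0:
        s = 0.5 / math.sqrt(trace + 1.0)
        w = 0.25 / s
        x = (r[2][1] - r[1][2]) * s
        y = (r[0][2] - r[2][0]) * s
        z = (r[1][0] - r[0][1]) * s
    else:
        # A full implementation handles the remaining three branches of the
        # standard matrix-to-quaternion conversion; omitted in this sketch.
        raise ValueError("non-positive trace not handled in this sketch")
    return (x, y, z, w)

# Identity grasp frame -> identity quaternion (0, 0, 0, 1).
print(grasp_orientation((1, 0, 0), (0, 1, 0), (0, 0, 1)))
```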
## Grasp Detection Algorithms
Grasp detection back-end algorithms enabled by this ROS2 Grasp Library:
- [Grasp Pose Detection](https://github.com/atenpas/gpd) detects 6-DOF grasp poses for a 2-finger grasp (e.g. a parallel jaw gripper) in 3D point clouds from RGBD sensor or PCD file. The grasp detection was enabled with Intel® [DLDT](https://github.com/opencv/dldt) toolkit and Intel® [OpenVINO™](https://software.intel.com/en-us/openvino-toolkit) toolkit.
<img src="grasp_tutorials/doc/grasp_ros2/img/ros2_grasp_library.png" width = 50% height = 50% alt="ROS2 Grasp Library" align=center />
## Tutorials
Refer to ROS2 Grasp Library [Tutorials](http://intel.github.io/ros2_grasp_library) for how to
* Install, build, and launch the ROS2 Grasp Planner and Detector
* Use launch options to customize in a new workspace
* Bring up the intelligent visual grasp solution on a new robot
* Do hand-eye calibration for a new camera setup
* Launch the example applications
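As background for the hand-eye calibration step above: the calibration result is a camera-to-robot-base transform, and applying it to a grasp point detected in the camera frame yields that point in robot coordinates. A minimal sketch with made-up transform values (illustrative only, not taken from the library's camera-robot.json):

```python
# Hypothetical hand-eye result: camera frame rotated 90 degrees about Z and
# translated 0.5 m along X relative to the robot base.
T_base_camera = [
    [0.0, -1.0, 0.0, 0.5],
    [1.0,  0.0, 0.0, 0.0],
    [0.0,  0.0, 1.0, 0.0],
    [0.0,  0.0, 0.0, 1.0],
]

def transform_point(T, p):
    """Apply a 4x4 homogeneous transform to a 3-D point."""
    x, y, z = p
    return tuple(T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3]
                 for i in range(3))

# A grasp detected at (0.2, 0.1, 0.4) in the camera frame, expressed in
# robot-base coordinates:
print(transform_point(T_base_camera, (0.2, 0.1, 0.4)))
```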
## Example Applications
### Random Picking (OpenVINO Grasp Detection)
[<img src="grasp_tutorials/_static/images/random_pick.png" width = 50% height = 50% alt="Random Pick with OpenVINO Grasp Detection - Link to Youtube video demo" align=center>](https://www.youtube.com/embed/b4EPvHdidOA)
### Recognition Picking (OpenVINO Grasp Detection + OpenVINO Mask R-CNN Object Segmentation)
[<img src="grasp_tutorials/_static/images/recognize_pick.png" width = 50% height = 50% alt="Recognition Pick with OpenVINO Grasp Detection - Link to Youtube video demo" align=center>](https://www.youtube.com/embed/trIt0uKRXBs)
## Known Issues
* Cloud camera fails with "Invalid sizes when resizing a matrix or array" when dealing with XYZRGBA point clouds from ROS2 RealSense, tracked as [#6](https://github.com/atenpas/gpg/issues/6) of gpg, [patch](https://github.com/atenpas/gpg/pull/7) under review.
* 'colcon test' sometimes fails in the test suite "tgrasp_ros2" due to a ROS2 service request failure (reported as ros2 examples issue [#228](https://github.com/ros2/examples/issues/228) and discussed in detail in ros2 demos issue [#304](https://github.com/ros2/demos/issues/304))
* Rviz2 fails to receive static TF from the camera due to transient_local QoS (expected to be fixed in the coming ROS2 Eloquent, discussed in geometry2 issue [#183](https://github.com/ros2/geometry2/issues/183)); a workaround [patch](https://github.com/intel/ros2_intel_realsense/pull/88) is available until the adaptation to Eloquent
## Contribute to This Project
Contributions to this project are welcome. Here are some recommended practices:
* When adding a new feature, add tests covering the new functionality
```bash
colcon test --packages-select <names_of_affected_packages>
```
* Before submitting a patch, make sure all existing tests pass to avoid regressions
```bash
colcon test --packages-select <names_of_existing_packages>
```
###### *Any security issue should be reported using process at https://01.org/security*
================================================
FILE: docker/Dockerfile
================================================
########################################################
# Based on Ubuntu 18.04
########################################################
# Set the base image to ubuntu 18.04
FROM ubuntu:bionic
LABEL maintainer="Liu Cong <congx.liu@intel.com>"
ARG DEPS_DIR=/root/deps
WORKDIR $DEPS_DIR
# install ros2 grasp library deps
COPY ./script/ $DEPS_DIR/script/
RUN bash script/install_ros2_grasp_library_deps.sh /root/deps
WORKDIR /root
ENTRYPOINT ["/root/script/ros_entrypoint.sh"]
CMD ["bash"]
================================================
FILE: docker/README.md
================================================
# Precondition
## Add docker group
```
sudo groupadd docker
sudo usermod -aG docker $USER
```
## Build docker image
```
cd ros2_grasp_library/docker
docker build -t intel/ros2:ros2_grasp_library_deps .
```
If you use a proxy:
```
docker build -t intel/ros2:ros2_grasp_library_deps --build-arg http_proxy=<proxy>:<port> --build-arg https_proxy=<proxy>:<port> .
```
## Optional: verify the image was created successfully with the command below
```
docker images
REPOSITORY TAG IMAGE ID CREATED SIZE
intel/ros2 ros2_grasp_library_deps b6d619a01f33 1 hours ago 6.92GB
```
# Run OpenVINO Grasp Library with RGBD Camera
## Terminal 1: build ros2_grasp_library and launch Rviz2 to visualize detection results
An X window will pop up once the application runs, so set up the display environment first:
```
./setup_docker_display.sh
docker run -it --rm --privileged -v /tmp/.X11-unix:/tmp/.X11-unix:rw -v /tmp/.docker.xauth:/tmp/.docker.xauth:rw -v /dev/bus/usb:/dev/bus/usb \
-v /dev:/dev:rw -e XAUTHORITY=/tmp/.docker.xauth -e DISPLAY --name ros2_grasp_library intel/ros2:ros2_grasp_library_deps bash
# cd /root/
# mkdir -p ros2_ws/src
# cd ros2_ws/src
# git clone https://github.com/intel/ros2_grasp_library.git
# git clone https://github.com/intel/ros2_intel_realsense.git -b refactor
# git clone https://github.com/intel/ros2_openvino_toolkit.git
# cd ..
# colcon build --symlink-install --packages-select grasp_msgs moveit_msgs people_msgs grasp_ros2 realsense_msgs realsense_ros realsense_node
# source ./install/local_setup.bash
# ros2 run rviz2 rviz2 -d src/ros2_grasp_library/grasp_ros2/rviz2/grasp.rviz
```
## Terminal 2: launch RGBD camera
```
docker exec -t -i ros2_grasp_library bash
# source /root/ros2_ws/install/setup.bash
# ros2 run realsense_node realsense_node
```
## Terminal 3: launch Grasp Library
```
docker exec -t -i ros2_grasp_library bash
# source /root/ros2_ws/install/setup.bash
# ros2 run grasp_ros2 grasp_ros2 __params:=src/ros2_grasp_library/grasp_ros2/cfg/grasp_ros2_params.yaml
```
Note: If you haven't installed Docker yet, or want more information on how to use it, see:
https://docs.docker.com/install/
================================================
FILE: docker/script/00_ros2_install.sh
================================================
#!/bin/bash
DEPS_DIR=${DEPS_PATH}
ros2_version=dashing
SUDO=$1
if [ "$SUDO" == "sudo" ];then
SUDO="sudo"
else
SUDO=""
fi
# fix popup caused by libssl
$SUDO apt-get install -y debconf-utils && \
echo 'libssl1.0.0:amd64 libraries/restart-without-asking boolean true' | $SUDO debconf-set-selections
# Authorize gpg key with apt
$SUDO apt-get update && $SUDO apt-get install -y curl gnupg2 lsb-release &&\
curl http://repo.ros2.org/repos.key | $SUDO apt-key add -
# Add the repository to sources list
$SUDO sh -c 'echo "deb [arch=amd64,arm64] http://packages.ros.org/ros2/ubuntu `lsb_release -cs` main" > /etc/apt/sources.list.d/ros2-latest.list'
# Install development tools and ROS tools
$SUDO apt-get update && $SUDO apt-get install -y \
python-rosdep \
python3-vcstool \
python3-colcon-common-extensions
# Install ROS 2 packages
echo "install $ros2_version"
$SUDO apt-get update && $SUDO apt-get install -y ros-${ros2_version}-desktop
================================================
FILE: docker/script/10_eigen_install.sh
================================================
#!/bin/bash
set -e
DEPS_DIR=${DEPS_PATH}
eigen_version=https://gitlab.com/libeigen/eigen/-/archive/3.2/eigen-3.2.tar.gz
SUDO=$1
if [ "$SUDO" == "sudo" ];then
SUDO="sudo"
else
SUDO=""
fi
cd $DEPS_DIR
$SUDO apt-get install -y gfortran
wget -t 3 -c $eigen_version
tar -xvf eigen-3.2.tar.gz
cd eigen-3.2 && mkdir -p build && cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
make -j4
$SUDO make install
$SUDO rm -rf /usr/include/eigen3/
$SUDO ln -sf /usr/local/include/eigen3 /usr/include/
================================================
FILE: docker/script/11_libpcl_install.sh
================================================
#!/bin/bash
set -e
DEPS_DIR=${DEPS_PATH}
pcl_version=https://github.com/PointCloudLibrary/pcl/archive/pcl-1.8.1.tar.gz
SUDO=$1
if [ "$SUDO" == "sudo" ];then
SUDO="sudo"
else
SUDO=""
fi
cd $DEPS_DIR
$SUDO apt-get install -y libhdf5-dev python3-h5py python3-pip
wget -t 3 -c $pcl_version
tar -xvf pcl-1.8.1.tar.gz
cd pcl-pcl-1.8.1 && mkdir -p build && cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
make -j4
$SUDO make install
================================================
FILE: docker/script/12_opencv_install.sh
================================================
#!/bin/bash
DEPS_DIR=${DEPS_PATH}
opencv_version=4.1.2
SUDO=$1
if [ "$SUDO" == "sudo" ];then
SUDO="sudo"
else
SUDO=""
fi
#install opencv
cd $DEPS_DIR
$SUDO apt-get update && $SUDO apt-get install -y build-essential \
libgtk2.0-dev \
pkg-config \
libavcodec-dev \
libavformat-dev \
libswscale-dev \
python-dev \
python-numpy \
libtbb2 \
libtbb-dev \
libjpeg-dev \
libpng-dev \
libtiff-dev \
libdc1394-22-dev
git clone --depth 1 https://github.com/opencv/opencv.git -b $opencv_version
git clone --depth 1 https://github.com/opencv/opencv_contrib.git -b $opencv_version
cd $DEPS_DIR/opencv
mkdir -p build && cd build
cd $DEPS_DIR/opencv/build
cmake -D OPENCV_EXTRA_MODULES_PATH=$DEPS_DIR/opencv_contrib/modules \
-D CMAKE_BUILD_TYPE=Release \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D BUILD_EXAMPLES=ON \
-D BUILD_opencv_xfeatures2d=OFF \
..
make -j4
$SUDO make install
echo "/usr/local/lib" | $SUDO tee /etc/ld.so.conf.d/opencv.conf
$SUDO ldconfig
================================================
FILE: docker/script/13_openvino_install.sh
================================================
#!/bin/bash
DEPS_DIR=${DEPS_PATH}
MKL_URL=https://github.com/intel/mkl-dnn/releases/download/v0.19/mklml_lnx_2019.0.5.20190502.tgz
MKL_VERSION=mklml_lnx_2019.0.5.20190502
OPENVINO_VERSION=2019_R3.1
SUDO=$1
if [ "$SUDO" == "" ];then
SUDO="sudo"
fi
# install mkl 2019.0.5.20190502
$SUDO apt-get update && $SUDO apt-get install -y wget
cd $DEPS_DIR
wget -t 3 -c ${MKL_URL} &&\
tar -xvf ${MKL_VERSION}.tgz &&\
cd ${MKL_VERSION} &&\
$SUDO mkdir -p /usr/local/lib/mklml &&\
$SUDO cp -rf ./lib /usr/local/lib/mklml &&\
$SUDO cp -rf ./include /usr/local/lib/mklml &&\
$SUDO touch /usr/local/lib/mklml/version.info
#install opencl 19.41.14441
cd $DEPS_DIR
mkdir -p opencl && cd opencl &&\
wget -t 3 -c https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-gmmlib_19.3.2_amd64.deb &&\
wget -t 3 -c https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-igc-core_1.0.2597_amd64.deb &&\
wget -t 3 -c https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-igc-opencl_1.0.2597_amd64.deb &&\
wget -t 3 -c https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-opencl_19.41.14441_amd64.deb &&\
wget -t 3 -c https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-ocloc_19.41.14441_amd64.deb &&\
$SUDO dpkg -i *.deb
#install cmake 3.14.3 if not already present
if [ "$(cmake --version|grep "version"|awk '{print $3}')" != "3.14.3" ];then
cd $DEPS_DIR
wget -t 3 -c https://www.cmake.org/files/v3.14/cmake-3.14.3.tar.gz && \
tar xf cmake-3.14.3.tar.gz && \
(cd cmake-3.14.3 && ./bootstrap --parallel=$(nproc --all) && make --jobs=$(nproc --all) && $SUDO make install) && \
rm -rf cmake-3.14.3 cmake-3.14.3.tar.gz
fi
#install openvino 2019_R3.1
cd $DEPS_DIR
$SUDO apt-get update && $SUDO apt-get install -y git
git clone --depth 1 https://github.com/openvinotoolkit/openvino -b ${OPENVINO_VERSION}
cd $DEPS_DIR/openvino/inference-engine
git submodule update --init --recursive &&\
chmod +x install_dependencies.sh &&\
$SUDO ./install_dependencies.sh
mkdir -p build && cd build &&\
cmake -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_INSTALL_PREFIX=/usr/local \
-DGEMM=MKL -DMKLROOT=/usr/local/lib/mklml \
-DTHREADING=OMP \
-DENABLE_MKL_DNN=ON \
-DENABLE_CLDNN=ON \
-DENABLE_OPENCV=OFF \
..
cd $DEPS_DIR/openvino/inference-engine/build
make -j8
cd $DEPS_DIR/openvino/inference-engine/build
$SUDO mkdir -p /usr/share/InferenceEngine &&\
$SUDO cp InferenceEngineConfig*.cmake /usr/share/InferenceEngine &&\
$SUDO cp targets.cmake /usr/share/InferenceEngine &&\
echo `pwd`/../bin/intel64/Release/lib | $SUDO tee -a /etc/ld.so.conf.d/openvino.conf &&\
$SUDO ldconfig
$SUDO ln -sf $DEPS_DIR/openvino /opt/openvino_toolkit/openvino
================================================
FILE: docker/script/20_librealsense_install.sh
================================================
#!/bin/bash
DEPS_DIR=${DEPS_PATH}
librealsense_version=2.31.0-0~realsense0.1791
SUDO=$1
if [ "$SUDO" == "sudo" ];then
SUDO="sudo"
else
SUDO=""
fi
# install librealsense
echo "install librealsense ${librealsense_version}"
cd $DEPS_DIR
if [ "$http_proxy" == "" ];then
$SUDO apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-key F6E65AC044F831AC80A06380C8B3A55A6F3EFCDE
else
$SUDO apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --keyserver-options http-proxy=$http_proxy --recv-key F6E65AC044F831AC80A06380C8B3A55A6F3EFCDE
fi
$SUDO sh -c 'echo "deb http://realsense-hw-public.s3.amazonaws.com/Debian/apt-repo `lsb_release -cs` main" > /etc/apt/sources.list.d/librealsense.list'
$SUDO apt-get update && $SUDO apt-get install -y librealsense2=${librealsense_version} \
librealsense2-dev=${librealsense_version} \
librealsense2-udev-rules=${librealsense_version} \
librealsense2-gl=${librealsense_version} \
librealsense2-utils=${librealsense_version} \
librealsense2-dbg=${librealsense_version} \
librealsense2-dkms
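# Optional sanity check (not part of the original script): librealsense2-utils
# ships command-line tools that can confirm the install, e.g.:
#   rs-enumerate-devices             # lists connected RealSense cameras, if any
#   dpkg -l | grep librealsense2     # shows the pinned package versions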
================================================
FILE: docker/script/30_gpg_install.sh
================================================
#!/bin/bash
DEPS_DIR=${DEPS_PATH}
SUDO=$1
if [ "$SUDO" == "sudo" ];then
SUDO="sudo"
else
SUDO=""
fi
# install gpg
cd $DEPS_DIR
wget -t 3 -c https://github.com/atenpas/gpg/archive/3dcd656d70f095ad1bda3d2fb597a994198466ab.zip
unzip 3dcd656d70f095ad1bda3d2fb597a994198466ab.zip
cd gpg-3dcd656d70f095ad1bda3d2fb597a994198466ab
mkdir -p build && cd build
cmake .. && make
$SUDO make install
ls /usr/local/lib/libgrasp_candidates_generator.so
================================================
FILE: docker/script/31_gpd_install.sh
================================================
#!/bin/bash
DEPS_DIR=${DEPS_PATH}
SUDO=$1
if [ "$SUDO" == "sudo" ];then
SUDO="sudo"
else
SUDO=""
fi
# install gpd
cd $DEPS_DIR
git clone --depth 1 https://github.com/sharronliu/gpd.git -b libgpd
cd gpd/src/gpd
mkdir -p build && cd build
cmake -DUSE_OPENVINO=ON .. && make
$SUDO make install
================================================
FILE: docker/script/32_ur_modern_driver_install.sh
================================================
#!/bin/bash
DEPS_DIR=${DEPS_PATH}
SUDO=$1
if [ "$SUDO" == "sudo" ];then
SUDO="sudo"
else
SUDO=""
fi
# install ur_modern_driver
cd $DEPS_DIR
git clone --depth 1 https://github.com/RoboticsYY/ur_modern_driver.git -b libur_modern_driver
cd ur_modern_driver/libur_modern_driver
mkdir -p build && cd build
cmake -DCMAKE_BUILD_TYPE=Release .. && make
$SUDO make install
================================================
FILE: docker/script/50_ros2_deps.sh
================================================
#!/bin/bash
SUDO=$1
if [ "$SUDO" == "sudo" ];then
SUDO="sudo"
else
SUDO=""
fi
$SUDO apt-get install -y ros-dashing-object-msgs \
python3-scipy \
ros-dashing-eigen3-cmake-module
WORK_DIR=${DEPS_PATH}/../ros2_ws
mkdir -p $WORK_DIR/src && cd $WORK_DIR/src
git clone --depth 1 https://github.com/RoboticsYY/ros2_ur_description.git
git clone --depth 1 https://github.com/RoboticsYY/handeye
git clone --depth 1 https://github.com/RoboticsYY/criutils.git
git clone --depth 1 https://github.com/RoboticsYY/baldor.git
git clone --depth 1 https://github.com/intel/ros2_intel_realsense.git -b refactor
git clone --depth 1 https://github.com/intel/ros2_grasp_library.git
cd $WORK_DIR
source /opt/ros/dashing/setup.sh
export InferenceEngine_DIR=/opt/openvino_toolkit/openvino/inference-engine/build/
export CPU_EXTENSION_LIB=/opt/openvino_toolkit/openvino/inference-engine/bin/intel64/Release/lib/libcpu_extension.so
export GFLAGS_LIB=/opt/openvino_toolkit/openvino/inference-engine/bin/intel64/Release/lib/libgflags_nothreads.a
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$InferenceEngine_DIR/../bin/intel64/Release/lib:/usr/local/lib/mklml/lib
colcon build --symlink-install
================================================
FILE: docker/script/install_ros2_grasp_library.sh
================================================
#!/bin/bash
set -e
deps_path=$1
if [ -z "$deps_path" ]; then
echo -e "Usage:\n install_ros2_grasp_library.sh <your-install-deps-path>"
echo -e "To install with 'sudo': install_ros2_grasp_library.sh <your-install-deps-path> sudo"
exit 1
fi
shift
# mkdir deps-path
echo "DEPS_PATH = $deps_path"
mkdir -p $deps_path
export DEPS_PATH=$deps_path
CURRENT_DIR=$(dirname "$(readlink -f "${BASH_SOURCE[0]}")")
echo "CURRENT_DIR = ${CURRENT_DIR}"
# install ros2 dashing
bash ${CURRENT_DIR}/00_ros2_install.sh $@
# install eigen 3.2
bash ${CURRENT_DIR}/10_eigen_install.sh $@
# install libpcl 1.8.1
bash ${CURRENT_DIR}/11_libpcl_install.sh $@
# install opencv 4.1.2
bash ${CURRENT_DIR}/12_opencv_install.sh $@
# install openvino 2019_R3.1
bash ${CURRENT_DIR}/13_openvino_install.sh $@
# install librealsense 2.31
bash ${CURRENT_DIR}/20_librealsense_install.sh $@
# install gpg
bash ${CURRENT_DIR}/30_gpg_install.sh $@
# install gpd
bash ${CURRENT_DIR}/31_gpd_install.sh $@
# install ur_modern_driver
bash ${CURRENT_DIR}/32_ur_modern_driver_install.sh $@
# build ros2 other deps
bash ${CURRENT_DIR}/50_ros2_deps.sh $@
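# Example invocation (the deps path is a placeholder, adjust to your setup):
#   bash install_ros2_grasp_library.sh ~/ros2_deps        # build as current user
#   bash install_ros2_grasp_library.sh ~/ros2_deps sudo   # use sudo for system installs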
================================================
FILE: docker/script/ros_entrypoint.sh
================================================
#!/bin/bash
set -e
# setup ros2 environment
source /opt/ros/dashing/setup.bash
source /root/ros2_ws/install/setup.bash
exec "$@"
================================================
FILE: docker/script/ros_env.sh
================================================
#!/bin/bash
ROS_PATH=$(pwd)
# setup ros2 environment
source /opt/ros/dashing/setup.bash
source ${ROS_PATH}/install/setup.bash
export ROS_DOMAIN_ID=100 # robot_group_id
export InferenceEngine_DIR=/opt/openvino_toolkit/openvino/inference-engine/build/
export CPU_EXTENSION_LIB=/opt/openvino_toolkit/openvino/inference-engine/bin/intel64/Release/lib/libcpu_extension.so
export GFLAGS_LIB=/opt/openvino_toolkit/openvino/inference-engine/bin/intel64/Release/lib/libgflags_nothreads.a
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$InferenceEngine_DIR/../bin/intel64/Release/lib:/usr/local/lib/mklml/lib
================================================
FILE: docker/setup_docker_display.sh
================================================
#!/bin/bash
set -e
# setup docker display
XSOCK=/tmp/.X11-unix
XAUTH=/tmp/.docker.xauth
touch $XAUTH
xauth nlist $DISPLAY | sed -e 's/^..../ffff/' | xauth -f $XAUTH nmerge -
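# Typical usage (image name is a placeholder): mount the X socket and the
# generated auth file into the container so GUI tools can reach the host display:
#   docker run -it --rm -e DISPLAY=$DISPLAY -e XAUTHORITY=$XAUTH \
#     -v $XSOCK:$XSOCK -v $XAUTH:$XAUTH <your-image>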
================================================
FILE: grasp_apps/draw_x/CMakeLists.txt
================================================
# Copyright (c) 2019 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
cmake_minimum_required(VERSION 3.5)
project(draw_x)
if(NOT CMAKE_CXX_STANDARD)
set(CMAKE_CXX_STANDARD 14)
endif()
if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
add_compile_options(-Wall -Wextra -Wpedantic)
endif()
if(CMAKE_BUILD_TYPE STREQUAL "Release")
message(STATUS "Create Release Build.")
set(CMAKE_CXX_FLAGS "-O2 ${CMAKE_CXX_FLAGS}")
else()
message(STATUS "Create Debug Build.")
endif()
find_package(ament_cmake REQUIRED)
find_package(rclcpp REQUIRED)
find_package(robot_interface REQUIRED)
include_directories(
include
${rclcpp_INCLUDE_DIRS}
${robot_interface_INCLUDE_DIRS}
)
# draw_x app
add_executable(${PROJECT_NAME}
src/draw_x.cpp
)
ament_target_dependencies(${PROJECT_NAME}
"rclcpp"
"robot_interface"
)
target_link_libraries(${PROJECT_NAME}
${ament_LIBRARIES}
${robot_interface_LIBRARIES}
)
# Install binaries
install(TARGETS ${PROJECT_NAME}
RUNTIME DESTINATION bin
)
install(TARGETS ${PROJECT_NAME}
DESTINATION lib/${PROJECT_NAME}
)
# Install launch files.
install(DIRECTORY
launch
DESTINATION share/${PROJECT_NAME}/
)
# Flags
if(UNIX OR APPLE)
# Linker flags.
if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU" OR ${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
# GCC specific flags. ICC is compatible with them.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} -z noexecstack -z relro -z now")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -z noexecstack -z relro -z now")
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
# Clang does not accept bare -z flags; they need to be passed to the linker via -Wl.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} \
-Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} \
-Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
endif()
# Compiler flags.
if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU")
# GCC specific flags.
if(CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.9 OR
CMAKE_CXX_COMPILER_VERSION VERSION_EQUAL 4.9)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector-strong")
else()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
endif()
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
# Clang is compatible with some of the flags.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
# Same as above, with the exception that ICC compilation crashes with the -fPIE option,
# even though it uses the -pie linker option, which requires -fPIE during compilation.
# Checksec shows that it generates a correct PIE anyway if only -pie is provided.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fstack-protector")
endif()
# Generic flags.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIC -fno-operator-names -Wformat -Wformat-security \
-Wall -fopenmp")
set( CUDA_PROPAGATE_HOST_FLAGS OFF )
set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} -D_FORTIFY_SOURCE=2")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -pie")
endif()
ament_package()
================================================
FILE: grasp_apps/draw_x/launch/draw_x.launch.py
================================================
# Copyright (c) 2019 Intel Corporation. All Rights Reserved
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import launch
import launch.actions
import launch.substitutions
import launch_ros.actions
from ament_index_python.packages import get_package_share_directory
def generate_launch_description():
# .yaml file for configuring the parameters
yaml = os.path.join(
get_package_share_directory('draw_x'),
'launch', 'draw_x.yaml'
)
return launch.LaunchDescription([
launch_ros.actions.Node(
package='draw_x',
node_executable='draw_x',
output='screen', arguments=['__params:='+yaml]),
])
================================================
FILE: grasp_apps/draw_x/launch/draw_x.yaml
================================================
robot_control:
ros__parameters:
host: "192.168.0.5"
shutdown_on_disconnect: true
joint_names: ["shoulder_pan_joint", "shoulder_lift_joint", "elbow_joint", "wrist_1_joint", "wrist_2_joint", "wrist_3_joint"]
================================================
FILE: grasp_apps/draw_x/package.xml
================================================
<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format2.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="2">
<name>draw_x</name>
<version>0.5.0</version>
<description>A demo app that draws an X pattern with a UR robot arm</description>
<author email="yu.yan@intel.com">Yu Yan</author>
<maintainer email="yu.yan@intel.com">Yu Yan</maintainer>
<license>Apache License 2.0</license>
<buildtool_depend>ament_cmake</buildtool_depend>
<build_depend>rclcpp</build_depend>
<build_depend>robot_interface</build_depend>
<exec_depend>rclcpp</exec_depend>
<exec_depend>robot_interface</exec_depend>
<test_depend>ament_lint_auto</test_depend>
<test_depend>ament_lint_common</test_depend>
<export>
<build_type>ament_cmake</build_type>
</export>
</package>
================================================
FILE: grasp_apps/draw_x/src/draw_x.cpp
================================================
#include <geometry_msgs/msg/pose_stamped.hpp>
#include <rclcpp/rclcpp.hpp>
#include <robot_interface/control_ur.hpp>
using namespace std::chrono_literals;  // for the 2s literal used below
/* pose in joint values*/
static const std::vector<double> HOME = {0.87, -1.44, 1.68, -1.81, -1.56, 0};
/* pose in [x, y, z, R, P, Y]*/
static const std::vector<double> CORNER1_POSE = { 0.1, -0.65, 0.15, 3.14, 0, -3.14};
static const std::vector<double> CORNER2_POSE = {-0.1, -0.45, 0.15, 3.14, 0, -3.14};
static const std::vector<double> CORNER3_POSE = {-0.1, -0.65, 0.15, 3.14, 0, -3.14};
static const std::vector<double> CORNER4_POSE = { 0.1, -0.45, 0.15, 3.14, 0, -3.14};
int main(int argc, char **argv)
{
rclcpp::init(argc, argv);
// init robot control
auto robot = std::make_shared<URControl>("robot_control",
rclcpp::NodeOptions().automatically_declare_parameters_from_overrides(true));
robot->parseArgs();
robot->startLoop();
rclcpp::sleep_for(2s);
// Move to home
robot->moveToJointValues(HOME, 1.05, 1.4);
// Move to the first corner
robot->moveToTcpPose(CORNER1_POSE[0], CORNER1_POSE[1], CORNER1_POSE[2],
CORNER1_POSE[3], CORNER1_POSE[4], CORNER1_POSE[5], 1.05, 1.4);
robot->moveToTcpPose(CORNER1_POSE[0], CORNER1_POSE[1], CORNER1_POSE[2] - 0.05,
CORNER1_POSE[3], CORNER1_POSE[4], CORNER1_POSE[5], 1.05, 1.4);
// Move to the second corner
robot->moveToTcpPose(CORNER2_POSE[0], CORNER2_POSE[1], CORNER2_POSE[2] - 0.05,
CORNER2_POSE[3], CORNER2_POSE[4], CORNER2_POSE[5], 1.05, 1.4);
robot->moveToTcpPose(CORNER2_POSE[0], CORNER2_POSE[1], CORNER2_POSE[2],
CORNER2_POSE[3], CORNER2_POSE[4], CORNER2_POSE[5], 1.05, 1.4);
// Move to the third corner
robot->moveToTcpPose(CORNER3_POSE[0], CORNER3_POSE[1], CORNER3_POSE[2],
CORNER3_POSE[3], CORNER3_POSE[4], CORNER3_POSE[5], 1.05, 1.4);
robot->moveToTcpPose(CORNER3_POSE[0], CORNER3_POSE[1], CORNER3_POSE[2] - 0.05,
CORNER3_POSE[3], CORNER3_POSE[4], CORNER3_POSE[5], 1.05, 1.4);
// Move to the fourth corner
robot->moveToTcpPose(CORNER4_POSE[0], CORNER4_POSE[1], CORNER4_POSE[2] - 0.05,
CORNER4_POSE[3], CORNER4_POSE[4], CORNER4_POSE[5], 1.05, 1.4);
robot->moveToTcpPose(CORNER4_POSE[0], CORNER4_POSE[1], CORNER4_POSE[2],
CORNER4_POSE[3], CORNER4_POSE[4], CORNER4_POSE[5], 1.05, 1.4);
// Move back to home
robot->moveToJointValues(HOME, 1.05, 1.4);
rclcpp::shutdown();
return 0;
}
================================================
FILE: grasp_apps/fixed_position_pick/CMakeLists.txt
================================================
# Copyright (c) 2019 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
cmake_minimum_required(VERSION 3.5)
project(fixed_position_pick)
if(NOT CMAKE_CXX_STANDARD)
set(CMAKE_CXX_STANDARD 14)
endif()
if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
add_compile_options(-Wall -Wextra -Wpedantic)
endif()
if(CMAKE_BUILD_TYPE STREQUAL "Release")
message(STATUS "Create Release Build.")
set(CMAKE_CXX_FLAGS "-O2 ${CMAKE_CXX_FLAGS}")
else()
message(STATUS "Create Debug Build.")
endif()
find_package(ament_cmake REQUIRED)
find_package(rclcpp REQUIRED)
find_package(robot_interface REQUIRED)
include_directories(
include
${rclcpp_INCLUDE_DIRS}
${robot_interface_INCLUDE_DIRS}
)
# fixed_position_pick app
add_executable(${PROJECT_NAME}
src/fixed_position_pick.cpp
)
ament_target_dependencies(${PROJECT_NAME}
"rclcpp"
"robot_interface"
)
target_link_libraries(${PROJECT_NAME}
${ament_LIBRARIES}
${robot_interface_LIBRARIES}
)
# Install binaries
install(TARGETS ${PROJECT_NAME}
RUNTIME DESTINATION bin
)
install(TARGETS ${PROJECT_NAME}
DESTINATION lib/${PROJECT_NAME}
)
# Install launch files.
install(DIRECTORY
launch
DESTINATION share/${PROJECT_NAME}/
)
# Flags
if(UNIX OR APPLE)
# Linker flags.
if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU" OR ${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
# GCC specific flags. ICC is compatible with them.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} -z noexecstack -z relro -z now")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -z noexecstack -z relro -z now")
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
# Clang does not accept bare -z flags; they need to be passed to the linker via -Wl.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} \
-Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} \
-Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
endif()
# Compiler flags.
if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU")
# GCC specific flags.
if(CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.9 OR
CMAKE_CXX_COMPILER_VERSION VERSION_EQUAL 4.9)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector-strong")
else()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
endif()
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
# Clang is compatible with some of the flags.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
# Same as above, with the exception that ICC compilation crashes with the -fPIE option,
# even though it uses the -pie linker option, which requires -fPIE during compilation.
# Checksec shows that it generates a correct PIE anyway if only -pie is provided.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fstack-protector")
endif()
# Generic flags.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIC -fno-operator-names -Wformat -Wformat-security \
-Wall -fopenmp")
set( CUDA_PROPAGATE_HOST_FLAGS OFF )
set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} -D_FORTIFY_SOURCE=2")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -pie")
endif()
ament_package()
================================================
FILE: grasp_apps/fixed_position_pick/launch/fixed_position_pick.launch.py
================================================
# Copyright (c) 2019 Intel Corporation. All Rights Reserved
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import launch
import launch.actions
import launch.substitutions
import launch_ros.actions
from ament_index_python.packages import get_package_share_directory
def generate_launch_description():
# .yaml file for configuring the parameters
yaml = os.path.join(
get_package_share_directory('fixed_position_pick'),
'launch', 'fixed_position_pick.yaml'
)
return launch.LaunchDescription([
launch_ros.actions.Node(
package='fixed_position_pick',
node_executable='fixed_position_pick',
output='screen', arguments=['__params:='+yaml]),
])
================================================
FILE: grasp_apps/fixed_position_pick/launch/fixed_position_pick.yaml
================================================
robot_control:
ros__parameters:
host: "192.168.0.5"
shutdown_on_disconnect: true
joint_names: ["shoulder_pan_joint", "shoulder_lift_joint", "elbow_joint", "wrist_1_joint", "wrist_2_joint", "wrist_3_joint"]
================================================
FILE: grasp_apps/fixed_position_pick/package.xml
================================================
<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format2.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="2">
<name>fixed_position_pick</name>
<version>0.5.0</version>
<description>A demo app for fixed-position pick and place</description>
<author email="yu.yan@intel.com">Yu Yan</author>
<maintainer email="yu.yan@intel.com">Yu Yan</maintainer>
<license>Apache License 2.0</license>
<buildtool_depend>ament_cmake</buildtool_depend>
<build_depend>rclcpp</build_depend>
<build_depend>robot_interface</build_depend>
<exec_depend>rclcpp</exec_depend>
<exec_depend>robot_interface</exec_depend>
<test_depend>ament_lint_auto</test_depend>
<test_depend>ament_lint_common</test_depend>
<export>
<build_type>ament_cmake</build_type>
</export>
</package>
================================================
FILE: grasp_apps/fixed_position_pick/src/fixed_position_pick.cpp
================================================
#include <geometry_msgs/msg/pose_stamped.hpp>
#include <rclcpp/rclcpp.hpp>
#include <robot_interface/control_ur.hpp>
using namespace std::chrono_literals;  // for the 2s literal used below
/* pose in joint values*/
static const std::vector<double> HOME = {0.87, -1.44, 1.68, -1.81, -1.56, 0};
/* pose in [x, y, z, qx, qy, qz, qw]*/
static const std::vector<double> PICK_POSE = { -0.157402, -0.679509, 0.094437, 0.190600, 0.948295, 0.239947, 0.082662};
static const std::vector<double> PLACE_POSE = {-0.350, -0.296, 0.145, -0.311507, 0.950216, -0.004305, 0.005879};
int main(int argc, char **argv)
{
rclcpp::init(argc, argv);
// init robot control
auto robot = std::make_shared<URControl>("robot_control",
rclcpp::NodeOptions().automatically_declare_parameters_from_overrides(true));
robot->parseArgs();
robot->startLoop();
rclcpp::sleep_for(2s);
// Move to home
robot->moveToJointValues(HOME, 1.05, 1.4);
// Pick
geometry_msgs::msg::PoseStamped pose_pick;
pose_pick.header.frame_id = "base";
pose_pick.header.stamp = robot->now();
pose_pick.pose.position.x = PICK_POSE[0];
pose_pick.pose.position.y = PICK_POSE[1];
pose_pick.pose.position.z = PICK_POSE[2];
pose_pick.pose.orientation.x = PICK_POSE[3];
pose_pick.pose.orientation.y = PICK_POSE[4];
pose_pick.pose.orientation.z = PICK_POSE[5];
pose_pick.pose.orientation.w = PICK_POSE[6];
robot->pick(pose_pick, 1.05, 1.4, 0.5, 0.1);
// Place
geometry_msgs::msg::PoseStamped pose_place;
pose_place.header.frame_id = "base";
pose_place.header.stamp = robot->now();
pose_place.pose.position.x = PLACE_POSE[0];
pose_place.pose.position.y = PLACE_POSE[1];
pose_place.pose.position.z = PLACE_POSE[2];
pose_place.pose.orientation.x = PLACE_POSE[3];
pose_place.pose.orientation.y = PLACE_POSE[4];
pose_place.pose.orientation.z = PLACE_POSE[5];
pose_place.pose.orientation.w = PLACE_POSE[6];
robot->place(pose_place, 1.05, 1.4, 0.5, 0.1);
// Move back to home
robot->moveToJointValues(HOME, 1.05, 1.4);
rclcpp::shutdown();
return 0;
}
================================================
FILE: grasp_apps/random_pick/CMakeLists.txt
================================================
# Copyright (c) 2019 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
cmake_minimum_required(VERSION 3.5)
option(BUILD_RANDOM_PICK "build random_pick app" ON)
if(NOT BUILD_RANDOM_PICK)
return()
endif()
project(random_pick)
if(NOT CMAKE_CXX_STANDARD)
set(CMAKE_CXX_STANDARD 14)
endif()
if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
add_compile_options(-Wall -Wextra -Wpedantic)
endif()
if(CMAKE_BUILD_TYPE STREQUAL "Release")
message(STATUS "Create Release Build.")
set(CMAKE_CXX_FLAGS "-O2 ${CMAKE_CXX_FLAGS}")
else()
message(STATUS "Create Debug Build.")
endif()
find_package(ament_cmake REQUIRED)
find_package(rclcpp REQUIRED)
find_package(moveit_msgs REQUIRED)
find_package(robot_interface REQUIRED)
find_package(tf2_ros REQUIRED)
include_directories(
include
${rclcpp_INCLUDE_DIRS}
${moveit_msgs_INCLUDE_DIRS}
${robot_interface_INCLUDE_DIRS}
${tf2_ros_INCLUDE_DIRS}
)
# random_pick app
add_executable(${PROJECT_NAME}
src/random_pick.cpp
)
ament_target_dependencies(${PROJECT_NAME}
"rclcpp"
"moveit_msgs"
"robot_interface"
"tf2_ros"
)
target_link_libraries(${PROJECT_NAME}
${ament_LIBRARIES}
${robot_interface_LIBRARIES}
)
# Install binaries
install(TARGETS ${PROJECT_NAME}
RUNTIME DESTINATION bin
)
install(TARGETS ${PROJECT_NAME}
DESTINATION lib/${PROJECT_NAME}
)
# Flags
if(UNIX OR APPLE)
# Linker flags.
if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU" OR ${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
# GCC specific flags. ICC is compatible with them.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} -z noexecstack -z relro -z now")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -z noexecstack -z relro -z now")
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
# Clang does not accept bare -z flags; they need to be passed to the linker via -Wl.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} \
-Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} \
-Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
endif()
# Compiler flags.
if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU")
# GCC specific flags.
if(CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.9 OR
CMAKE_CXX_COMPILER_VERSION VERSION_EQUAL 4.9)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector-strong")
else()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
endif()
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
# Clang is compatible with some of the flags.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
# Same as above, with the exception that ICC compilation crashes with the -fPIE option,
# even though it uses the -pie linker option, which requires -fPIE during compilation.
# Checksec shows that it generates a correct PIE anyway if only -pie is provided.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fstack-protector")
endif()
# Generic flags.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIC -fno-operator-names -Wformat -Wformat-security \
-Wall -fopenmp")
set( CUDA_PROPAGATE_HOST_FLAGS OFF )
set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} -D_FORTIFY_SOURCE=2")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -pie")
endif()
ament_package()
================================================
FILE: grasp_apps/random_pick/cfg/random_pick.yaml
================================================
robot_control:
ros__parameters:
host: "192.168.1.5"
================================================
FILE: grasp_apps/random_pick/package.xml
================================================
<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format2.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="2">
<name>random_pick</name>
<version>0.5.0</version>
<description>A demo app for grasp detection, and random picking</description>
<author email="sharron.liu@intel.com">Sharron LIU</author>
<maintainer email="sharron.liu@intel.com">Sharron LIU</maintainer>
<license>Apache License 2.0</license>
<buildtool_depend>ament_cmake</buildtool_depend>
<build_depend>rclcpp</build_depend>
<build_depend>moveit_msgs</build_depend>
<build_depend>people_msgs</build_depend>
<build_depend>robot_interface</build_depend>
<build_depend>tf2_ros</build_depend>
<exec_depend>rclcpp</exec_depend>
<exec_depend>moveit_msgs</exec_depend>
<exec_depend>people_msgs</exec_depend>
<exec_depend>robot_interface</exec_depend>
<exec_depend>tf2_ros</exec_depend>
<test_depend>ament_lint_auto</test_depend>
<test_depend>ament_lint_common</test_depend>
<export>
<build_type>ament_cmake</build_type>
</export>
</package>
================================================
FILE: grasp_apps/random_pick/src/random_pick.cpp
================================================
#include <geometry_msgs/msg/pose_stamped.hpp>
#include <moveit_msgs/msg/move_it_error_codes.hpp>
#include <moveit_msgs/msg/place_location.hpp>
#include <moveit_msgs/srv/grasp_planning.hpp>
#include <rclcpp/logger.hpp>
#include <rclcpp/rclcpp.hpp>
#include <robot_interface/control_ur.hpp>
#include <tf2/LinearMath/Matrix3x3.h>
#include <tf2/LinearMath/Quaternion.h>
#include <tf2_ros/static_transform_broadcaster.h>
using namespace std::chrono_literals;  // for the 2s / 5s literals used below
#define robot_enable
using GraspPlanning = moveit_msgs::srv::GraspPlanning;
/* pick position in [x, y, z, R, P, Y]*/
static std::vector<double> pick_ = {0.0, -0.54, 0.145, 3.14, 0.0, 1.956};
/* place position in [x, y, z, R, P, Y]*/
static std::vector<double> place_ = {-0.50, -0.30, 0.20, 3.14, 0.0, 1.956};
/* pre-pick position in joint values*/
static std::vector<double> joint_values_pick = {1.065, -1.470, 1.477, -1.577, -1.556, 0};
/* place position in joint values*/
static std::vector<double> joint_values_place = {0.385, -1.470, 1.477, -1.577, -1.556, 0};
static double vel_ = 0.9, acc_ = 0.9, vscale_ = 0.9, appr_ = 0.1;
static std::shared_ptr<URControl> robot_ = nullptr;
static rclcpp::Node::SharedPtr node_ = nullptr;
static std::shared_ptr<GraspPlanning::Response> result_ = nullptr;
int main(int argc, char **argv)
{
rclcpp::init(argc, argv);
// init robot control
robot_ = std::make_shared<URControl>("robot_control",
rclcpp::NodeOptions().automatically_declare_parameters_from_overrides(true));
robot_->parseArgs();
robot_->startLoop();
rclcpp::sleep_for(2s);
#ifdef robot_enable
// reset joint
robot_->moveToJointValues(joint_values_place, vel_, acc_);
#endif
// init random pick node
node_ = rclcpp::Node::make_shared("random_pick");
tf2_ros::StaticTransformBroadcaster tfb(node_);
// create client for grasp planning
auto client = node_->create_client<GraspPlanning>("plan_grasps");
// wait for service
while (!client->wait_for_service(5s)) {
RCLCPP_INFO(node_->get_logger(), "Wait for service");
}
RCLCPP_INFO(node_->get_logger(), "Got service");
while(rclcpp::ok())
{
// request grasp poses
auto request = std::make_shared<GraspPlanning::Request>();
auto result_future = client->async_send_request(request);
RCLCPP_INFO(node_->get_logger(), "Request sent");
// wait for response
if (rclcpp::spin_until_future_complete(node_, result_future) !=
rclcpp::executor::FutureReturnCode::SUCCESS)
{
continue;
}
// get response
if (moveit_msgs::msg::MoveItErrorCodes::SUCCESS == result_future.get()->error_code.val) {
result_ = result_future.get();
RCLCPP_INFO(node_->get_logger(), "Response received %d", result_->error_code.val);
} else continue;
geometry_msgs::msg::PoseStamped p = result_->grasps[0].grasp_pose;
// publish grasp pose
tf2::Quaternion q(p.pose.orientation.x, p.pose.orientation.y, p.pose.orientation.z, p.pose.orientation.w);
double roll, pitch, yaw;
tf2::Matrix3x3 r;
r.setRotation(q);
r.getRPY(roll, pitch, yaw);
RCLCPP_INFO(node_->get_logger(), "**********pick pose [position %f %f %f, quat %f %f %f %f, RPY %f %f %f]",
p.pose.position.x, p.pose.position.y, p.pose.position.z,
p.pose.orientation.x, p.pose.orientation.y, p.pose.orientation.z, p.pose.orientation.w,
roll, pitch, yaw);
geometry_msgs::msg::TransformStamped tf_msg;
tf_msg.header = p.header;
tf_msg.child_frame_id = "grasp_pose";
tf_msg.transform.translation.x = p.pose.position.x;
tf_msg.transform.translation.y = p.pose.position.y;
tf_msg.transform.translation.z = p.pose.position.z;
tf_msg.transform.rotation = p.pose.orientation;
tfb.sendTransform(tf_msg);
#ifdef robot_enable
// pick
robot_->moveToJointValues(joint_values_pick, vel_, acc_);
robot_->pick(p, vel_, acc_, vscale_, appr_);
// place
robot_->moveToJointValues(joint_values_place, vel_, acc_);
robot_->place(place_[0], place_[1], place_[2], place_[3], place_[4], place_[5], vel_, acc_, vscale_, appr_);
#endif
}
rclcpp::shutdown();
return 0;
}
================================================
FILE: grasp_apps/recognize_pick/CMakeLists.txt
================================================
# Copyright (c) 2019 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
option(BUILD_RECOGNIZE_PICK "build recognize_pick app" OFF)
if(NOT BUILD_RECOGNIZE_PICK STREQUAL "ON")
return()
endif()
cmake_minimum_required(VERSION 3.5)
project(recognize_pick)
if(NOT CMAKE_CXX_STANDARD)
set(CMAKE_CXX_STANDARD 14)
endif()
if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
add_compile_options(-Wall -Wextra -Wpedantic)
endif()
if(CMAKE_BUILD_TYPE STREQUAL "Release")
message(STATUS "Create Release Build.")
set(CMAKE_CXX_FLAGS "-O2 ${CMAKE_CXX_FLAGS}")
else()
message(STATUS "Create Debug Build.")
endif()
find_package(ament_cmake REQUIRED)
find_package(rclcpp REQUIRED)
find_package(moveit_msgs REQUIRED)
find_package(people_msgs REQUIRED)
find_package(robot_interface REQUIRED)
find_package(tf2_ros REQUIRED)
include_directories(
include
${rclcpp_INCLUDE_DIRS}
${moveit_msgs_INCLUDE_DIRS}
${people_msgs_INCLUDE_DIRS}
${robot_interface_INCLUDE_DIRS}
${tf2_ros_INCLUDE_DIRS}
)
# recognize_pick app
add_executable(${PROJECT_NAME}
src/recognize_pick.cpp
)
ament_target_dependencies(${PROJECT_NAME}
"rclcpp"
"moveit_msgs"
"people_msgs"
"robot_interface"
"tf2_ros"
)
target_link_libraries(${PROJECT_NAME}
${ament_LIBRARIES}
${robot_interface_LIBRARIES}
)
# place publisher app
set(PLACE_PUBLISHER place_publisher)
add_executable(${PLACE_PUBLISHER}
src/place_publisher.cpp
)
ament_target_dependencies(${PLACE_PUBLISHER}
"rclcpp"
"moveit_msgs"
)
target_link_libraries(${PLACE_PUBLISHER}
${ament_LIBRARIES}
)
# Install binaries
install(TARGETS ${PROJECT_NAME} ${PLACE_PUBLISHER}
RUNTIME DESTINATION bin
)
install(TARGETS ${PROJECT_NAME} ${PLACE_PUBLISHER}
DESTINATION lib/${PROJECT_NAME}
)
# Flags
if(UNIX OR APPLE)
# Linker flags.
if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU" OR ${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
# GCC specific flags. ICC is compatible with them.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} -z noexecstack -z relro -z now")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -z noexecstack -z relro -z now")
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
# In Clang, -z flags are not compatible, they need to be passed to linker via -Wl.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} \
-Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} \
-Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
endif()
# Compiler flags.
if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU")
# GCC specific flags.
if(CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.9 OR
CMAKE_CXX_COMPILER_VERSION VERSION_EQUAL 4.9)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector-strong")
else()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
endif()
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
# Clang is compatible with some of the flags.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
# Same as above, with the exception that ICC compilation crashes with the -fPIE option, even
# though it uses the -pie linker option that requires -fPIE during compilation. Checksec
# shows that it generates a correct PIE anyway if only -pie is provided.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fstack-protector")
endif()
# Generic flags.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIC -fno-operator-names -Wformat -Wformat-security \
-Wall -fopenmp")
set( CUDA_PROPAGATE_HOST_FLAGS OFF )
set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} -D_FORTIFY_SOURCE=2")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -pie")
endif()
ament_package()
================================================
FILE: grasp_apps/recognize_pick/cfg/recognize_pick.yaml
================================================
robot_control:
ros__parameters:
host: "192.168.1.5"
================================================
FILE: grasp_apps/recognize_pick/package.xml
================================================
<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format2.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="2">
<name>recognize_pick</name>
<version>0.5.0</version>
<description>A demo app for object segmentation, grasp detection, and picking</description>
<author email="sharron.liu@intel.com">Sharron LIU</author>
<maintainer email="sharron.liu@intel.com">Sharron LIU</maintainer>
<license>Apache License 2.0</license>
<buildtool_depend>ament_cmake</buildtool_depend>
<build_depend>rclcpp</build_depend>
<build_depend>moveit_msgs</build_depend>
<build_depend>people_msgs</build_depend>
<build_depend>robot_interface</build_depend>
<build_depend>tf2_ros</build_depend>
<exec_depend>rclcpp</exec_depend>
<exec_depend>moveit_msgs</exec_depend>
<exec_depend>people_msgs</exec_depend>
<exec_depend>robot_interface</exec_depend>
<exec_depend>tf2_ros</exec_depend>
<test_depend>ament_lint_auto</test_depend>
<test_depend>ament_lint_common</test_depend>
<export>
<build_type>ament_cmake</build_type>
</export>
</package>
================================================
FILE: grasp_apps/recognize_pick/src/place_publisher.cpp
================================================
// Copyright (c) 2019 Intel Corporation
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <rclcpp/logger.hpp>
#include <rclcpp/rclcpp.hpp>
#include <moveit_msgs/msg/place_location.hpp>
int main(int argc, char ** argv) {
std::vector<std::string> args = rclcpp::init_and_remove_ros_arguments(argc, argv);
auto node = rclcpp::Node::make_shared("PlacePublisher");
auto pub = node->create_publisher<moveit_msgs::msg::PlaceLocation>("/recognize_pick/place", 10);
rclcpp::Clock clock(RCL_ROS_TIME);
moveit_msgs::msg::PlaceLocation p;
if (args.size() < 5) {
p.place_pose.pose.position.x = -0.45;
p.place_pose.pose.position.y = -0.30;
p.place_pose.pose.position.z = 0.125;
} else {
p.place_pose.pose.position.x = atof(args[2].c_str());
p.place_pose.pose.position.y = atof(args[3].c_str());
p.place_pose.pose.position.z = atof(args[4].c_str());
}
if (args.size() < 2) {
RCLCPP_INFO(node->get_logger(), "Place publisher: specify an object name and, optionally, a place position.");
RCLCPP_INFO(node->get_logger(), "Usage: place_publisher object_name [x y z]");
RCLCPP_INFO(node->get_logger(), "Example: place_publisher sports_ball");
RCLCPP_INFO(node->get_logger(), "Example: place_publisher sports_ball -0.45 -0.30 0.125");
rclcpp::shutdown();
return 0;
} else {
p.id = args[1];
}
RCLCPP_INFO(node->get_logger(), "place publisher %s [%f %f %f]",
p.id.c_str(), p.place_pose.pose.position.x, p.place_pose.pose.position.y, p.place_pose.pose.position.z);
while (rclcpp::ok()) {
p.place_pose.header.stamp = clock.now();
p.place_pose.header.frame_id = "base";
pub->publish(p);
rclcpp::Rate(0.5).sleep();
}
rclcpp::shutdown();
return 0;
}
================================================
FILE: grasp_apps/recognize_pick/src/recognize_pick.cpp
================================================
#include <geometry_msgs/msg/pose_stamped.hpp>
#include <moveit_msgs/msg/move_it_error_codes.hpp>
#include <moveit_msgs/msg/place_location.hpp>
#include <moveit_msgs/srv/grasp_planning.hpp>
#include <rclcpp/logger.hpp>
#include <rclcpp/rclcpp.hpp>
#include <robot_interface/control_ur.hpp>
#include <tf2/LinearMath/Matrix3x3.h>
#include <tf2/LinearMath/Quaternion.h>
#include <tf2_ros/static_transform_broadcaster.h>
#include <chrono>
using namespace std::chrono_literals;
#define robot_enable
using GraspPlanning = moveit_msgs::srv::GraspPlanning;
/* pick position in [x, y, z, R, P, Y]*/
static std::vector<double> pick_ = {0.0, -0.54, 0.145, 3.14, 0.0, 1.956};
/* place position in [x, y, z, R, P, Y]*/
static std::vector<double> place_ = {-0.45, -0.30, 0.125, 3.14, 0.0, 1.956};
/* pre-pick position in joint values*/
static std::vector<double> joint_values_pick = {1.065, -1.470, 1.477, -1.577, -1.556, 0};
/* place position in joint values*/
static std::vector<double> joint_values_place = {0.385, -1.470, 1.477, -1.577, -1.556, 0};
static double vel_ = 0.4, acc_ = 0.4, vscale_ = 0.5, appr_ = 0.1;
static std::shared_ptr<URControl> robot_ = nullptr;
static rclcpp::Node::SharedPtr node_ = nullptr;
static std::shared_ptr<GraspPlanning::Response> result_ = nullptr;
static moveit_msgs::msg::PlaceLocation::SharedPtr place_pose_ = nullptr;
static void place_callback(const moveit_msgs::msg::PlaceLocation::SharedPtr msg) {
place_pose_ = msg;
}
int main(int argc, char **argv)
{
rclcpp::init(argc, argv);
// init robot control
robot_ = std::make_shared<URControl>("robot_control",
rclcpp::NodeOptions().automatically_declare_parameters_from_overrides(true));
robot_->parseArgs();
robot_->startLoop();
rclcpp::sleep_for(2s);
#ifdef robot_enable
// reset joint
robot_->moveToJointValues(joint_values_place, vel_, acc_);
#endif
// init recognize pick node
node_ = rclcpp::Node::make_shared("recognize_pick");
tf2_ros::StaticTransformBroadcaster tfb(node_);
// subscribe place callback
auto sub = node_->create_subscription<moveit_msgs::msg::PlaceLocation>(
"/recognize_pick/place", rclcpp::QoS(rclcpp::KeepLast(0)), place_callback);
// create client
auto client = node_->create_client<GraspPlanning>("plan_grasps");
// wait for service
while (!client->wait_for_service(5s)) {
RCLCPP_INFO(node_->get_logger(), "Wait for service");
}
RCLCPP_INFO(node_->get_logger(), "Got service");
while(rclcpp::ok())
{
if (place_pose_ == nullptr) {
RCLCPP_INFO(node_->get_logger(), "Wait for place mission");
rclcpp::spin_some(node_);
rclcpp::sleep_for(std::chrono::seconds(2));
continue;
}
moveit_msgs::msg::PlaceLocation::SharedPtr place = place_pose_;
RCLCPP_INFO(node_->get_logger(), "Place %s", place->id.c_str());
// get grasp poses
auto request = std::make_shared<GraspPlanning::Request>();
request->target.id = place->id;
auto result_future = client->async_send_request(request);
RCLCPP_INFO(node_->get_logger(), "Request sent");
// wait for response
if (rclcpp::spin_until_future_complete(node_, result_future) !=
rclcpp::executor::FutureReturnCode::SUCCESS)
{
continue;
}
// get response
if (moveit_msgs::msg::MoveItErrorCodes::SUCCESS == result_future.get()->error_code.val) {
result_ = result_future.get();
RCLCPP_INFO(node_->get_logger(), "Response received %d", result_->error_code.val);
} else continue;
geometry_msgs::msg::PoseStamped p = result_->grasps[0].grasp_pose;
// publish grasp pose
tf2::Quaternion q(p.pose.orientation.x, p.pose.orientation.y, p.pose.orientation.z, p.pose.orientation.w);
double roll, pitch, yaw;
tf2::Matrix3x3 r;
r.setRotation(q);
r.getRPY(roll, pitch, yaw);
RCLCPP_INFO(node_->get_logger(), "**********pick pose [position %f %f %f, quat %f %f %f %f, RPY %f %f %f]",
p.pose.position.x, p.pose.position.y, p.pose.position.z,
p.pose.orientation.x, p.pose.orientation.y, p.pose.orientation.z, p.pose.orientation.w,
roll, pitch, yaw);
geometry_msgs::msg::TransformStamped tf_msg;
tf_msg.header = p.header;
tf_msg.child_frame_id = "grasp_pose";
tf_msg.transform.translation.x = p.pose.position.x;
tf_msg.transform.translation.y = p.pose.position.y;
tf_msg.transform.translation.z = p.pose.position.z;
tf_msg.transform.rotation = p.pose.orientation;
tfb.sendTransform(tf_msg);
#ifdef robot_enable
// pick
robot_->moveToJointValues(joint_values_pick, vel_, acc_);
robot_->pick(p, vel_, acc_, vscale_, appr_);
// place
robot_->moveToJointValues(joint_values_place, vel_, acc_);
robot_->place(place_[0], place_[1], place_[2], place_[3], place_[4], place_[5], vel_, acc_, vscale_, appr_);
#endif
rclcpp::spin_some(node_);
place_pose_ = nullptr;
}
rclcpp::shutdown();
return 0;
}
================================================
FILE: grasp_msgs/CMakeLists.txt
================================================
cmake_minimum_required(VERSION 3.5)
project(grasp_msgs)
if(NOT CMAKE_CXX_STANDARD)
set(CMAKE_CXX_STANDARD 14)
endif()
if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
add_compile_options(-Wall -Wextra -Wpedantic)
endif()
find_package(ament_cmake REQUIRED)
find_package(rosidl_default_generators REQUIRED)
find_package(builtin_interfaces REQUIRED)
find_package(std_msgs REQUIRED)
find_package(geometry_msgs REQUIRED)
set(msg_files
"msg/GraspConfig.msg"
"msg/GraspConfigList.msg"
"msg/SamplesMsg.msg"
)
rosidl_generate_interfaces(${PROJECT_NAME}
${msg_files}
DEPENDENCIES builtin_interfaces std_msgs geometry_msgs
ADD_LINTER_TESTS
)
ament_export_dependencies(rosidl_default_runtime)
ament_package()
================================================
FILE: grasp_msgs/msg/CloudIndexed.msg
================================================
# This message holds a point cloud and a list of indices into the point cloud
# at which to sample grasp candidates.
# The point cloud.
gpd/CloudSources cloud_sources
# The indices into the point cloud at which to sample grasp candidates.
std_msgs/Int64[] indices
================================================
FILE: grasp_msgs/msg/CloudSamples.msg
================================================
# This message holds a point cloud and a list of samples at which the grasp
# detector should search for grasp candidates.
# The point cloud.
gpd/CloudSources cloud_sources
# The samples, as (x,y,z) points, at which to search for grasp candidates.
geometry_msgs/Point[] samples
================================================
FILE: grasp_msgs/msg/CloudSources.msg
================================================
# This message holds a point cloud that can be a combination of point clouds
# from different camera sources (at least one). For each point in the cloud,
# this message also stores the index of the camera that produced the point.
# The point cloud.
sensor_msgs/PointCloud2 cloud
# For each point in the cloud, the index of the camera that acquired the point.
std_msgs/Int64[] camera_source
# A list of camera positions at which the point cloud was acquired.
geometry_msgs/Point[] view_points
================================================
FILE: grasp_msgs/msg/GraspConfig.msg
================================================
# This message describes a 2-finger grasp configuration by its 6-DOF pose,
# consisting of a 3-DOF position and 3-DOF orientation, and the opening
# width of the robot hand.
# Position
geometry_msgs/Point bottom # centered bottom/base of the robot hand
geometry_msgs/Point top # centered top/fingertip of the robot hand
geometry_msgs/Point surface # centered position on object surface
# Orientation represented as three axes (R = [approach binormal axis])
geometry_msgs/Vector3 approach # The grasp approach direction
geometry_msgs/Vector3 binormal # The binormal
geometry_msgs/Vector3 axis # The hand axis
geometry_msgs/Point sample # Point at which the grasp was found
std_msgs/Float32 width # Required aperture (opening width) of the robot hand
std_msgs/Float32 score # Score assigned to the grasp by the classifier
================================================
FILE: grasp_msgs/msg/GraspConfigList.msg
================================================
# This message stores a list of grasp configurations.
# The time of acquisition, and the coordinate frame ID.
std_msgs/Header header
# The list of grasp configurations.
grasp_msgs/GraspConfig[] grasps
# Name of the known object these grasps are associated with.
string object_name
================================================
FILE: grasp_msgs/msg/SamplesMsg.msg
================================================
# This message describes a set of point samples at which to detect grasps.
# Header
std_msgs/Header header
# The samples, as (x,y,z) points, at which to search for grasp candidates.
geometry_msgs/Point[] samples
================================================
FILE: grasp_msgs/package.xml
================================================
<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format2.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
<name>grasp_msgs</name>
<version>0.5.0</version>
<description>ROS2 messages definitions for grasp library</description>
<maintainer email="sharron.liu@intel.com">Sharron LIU</maintainer>
<license>Apache License 2.0</license>
<buildtool_depend>ament_cmake</buildtool_depend>
<buildtool_depend>rosidl_default_generators</buildtool_depend>
<build_depend>builtin_interfaces</build_depend>
<build_depend>std_msgs</build_depend>
<build_depend>geometry_msgs</build_depend>
<exec_depend>rosidl_default_runtime</exec_depend>
<exec_depend>builtin_interfaces</exec_depend>
<exec_depend>std_msgs</exec_depend>
<exec_depend>geometry_msgs</exec_depend>
<test_depend>ament_lint_common</test_depend>
<member_of_group>rosidl_interface_packages</member_of_group>
<export>
<build_type>ament_cmake</build_type>
</export>
</package>
================================================
FILE: grasp_ros2/CMakeLists.txt
================================================
# Copyright (c) 2018 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
cmake_minimum_required(VERSION 3.5)
project(grasp_ros2)
if(NOT CMAKE_CXX_STANDARD)
set(CMAKE_CXX_STANDARD 14)
endif()
if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
add_compile_options(-Wall -Wextra -Wpedantic)
endif()
if(CMAKE_BUILD_TYPE STREQUAL "Release")
message(STATUS "Create Release Build.")
set(CMAKE_CXX_FLAGS "-O2 ${CMAKE_CXX_FLAGS}")
else()
message(STATUS "Create Debug Build.")
endif()
if(BUILD_RECOGNIZE_PICK STREQUAL "ON")
add_definitions("-DRECOGNIZE_PICK")
endif()
find_package(ament_cmake REQUIRED)
find_package(builtin_interfaces REQUIRED)
find_package(rclcpp REQUIRED)
find_package(rclcpp_components REQUIRED)
find_package(grasp_msgs REQUIRED)
find_package(sensor_msgs REQUIRED)
find_package(moveit_msgs)
if(BUILD_RECOGNIZE_PICK STREQUAL "ON")
find_package(people_msgs)
endif()
find_package(visualization_msgs)
find_package(tf2)
find_package(tf2_ros REQUIRED)
find_package(tf2_geometry_msgs REQUIRED)
find_package(trajectory_msgs REQUIRED)
find_package(pcl_conversions REQUIRED)
find_package(Eigen3 REQUIRED)
# GPG
find_library(gpg_LIBRARIES grasp_candidates_generator)
find_path(gpg_INCLUDE_DIRS gpg/grasp.h)
# GPD
find_library(gpd_LIBRARIES grasp_pose_detection)
find_path(gpd_INCLUDE_DIRS gpd/grasp_detector.h)
# PCL
find_package(PCL 1.8.1 EXACT)
include_directories(${PCL_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS})
include_directories(
include
${rclcpp_INCLUDE_DIRS}
${builtin_interfaces_INCLUDE_DIRS}
${grasp_msgs_INCLUDE_DIRS}
${sensor_msgs_INCLUDE_DIRS}
${moveit_msgs_INCLUDE_DIRS}
${tf2_geometry_msgs_INCLUDE_DIRS}
${trajectory_msgs_INCLUDE_DIRS}
${visualization_msgs_INCLUDE_DIRS}
${pcl_conversions_INCLUDE_DIRS}
${tf2_INCLUDE_DIRS}
${gpg_INCLUDE_DIRS}
${gpd_INCLUDE_DIRS}
)
# if() may not appear inside a command's argument list, so the optional
# people_msgs include is added in a separate conditional call.
if(BUILD_RECOGNIZE_PICK STREQUAL "ON")
include_directories(${people_msgs_INCLUDE_DIRS})
endif()
# create ament index resource which references the libraries in the binary dir
set(node_plugins "")
# grasp detect
set(libgrasp_detect "grasp_detect")
add_library(${libgrasp_detect} SHARED
src/consts.cpp
src/ros_params.cpp
src/grasp_detector_gpd.cpp)
target_compile_definitions(${libgrasp_detect}
PRIVATE "GRASP_ROS2_BUILDING_DLL")
ament_target_dependencies(${libgrasp_detect}
"class_loader"
"rclcpp"
"rclcpp_components"
"grasp_msgs"
"sensor_msgs"
"moveit_msgs"
"visualization_msgs")
# if() may not appear inside a command's argument list, so the optional
# people_msgs dependency is added in a separate conditional call.
if(BUILD_RECOGNIZE_PICK STREQUAL "ON")
ament_target_dependencies(${libgrasp_detect} "people_msgs")
endif()
target_link_libraries(${libgrasp_detect}
${gpg_LIBRARIES}
${gpd_LIBRARIES}
${PCL_LIBRARIES}
)
rclcpp_components_register_nodes(${libgrasp_detect} "grasp_ros2::GraspDetectorGPD")
set(node_plugins "${node_plugins}grasp_ros2::GraspDetectorGPD;$<TARGET_FILE:${libgrasp_detect}>\n")
# grasp plan
set(libgrasp_plan "grasp_plan")
add_library(${libgrasp_plan} SHARED
src/consts.cpp
src/ros_params.cpp
src/grasp_planner.cpp)
target_compile_definitions(${libgrasp_plan}
PRIVATE "GRASP_ROS2_BUILDING_DLL")
ament_target_dependencies(${libgrasp_plan}
"class_loader"
"rclcpp"
"rclcpp_components"
"grasp_msgs"
"moveit_msgs"
"tf2"
"tf2_ros"
"tf2_geometry_msgs"
"trajectory_msgs")
target_link_libraries(${libgrasp_plan}
)
rclcpp_components_register_nodes(${libgrasp_plan} "grasp_ros2::GraspPlanner")
set(node_plugins "${node_plugins}grasp_ros2::GraspPlanner;$<TARGET_FILE:${libgrasp_plan}>\n")
add_executable(${PROJECT_NAME}
src/grasp_composition.cpp
)
ament_target_dependencies(${PROJECT_NAME}
"rclcpp"
"builtin_interfaces"
"grasp_msgs"
"sensor_msgs"
"moveit_msgs"
"visualization_msgs"
"tf2"
"tf2_ros"
"tf2_geometry_msgs"
"trajectory_msgs"
"pcl_conversions"
)
target_link_libraries(${PROJECT_NAME}
${ament_LIBRARIES}
${gpg_LIBRARIES}
${gpd_LIBRARIES}
${PCL_LIBRARIES}
${libgrasp_detect}
${libgrasp_plan}
)
# Install libs
install(TARGETS
${libgrasp_detect}
${libgrasp_plan}
ARCHIVE DESTINATION lib
LIBRARY DESTINATION lib
RUNTIME DESTINATION bin)
# Install binaries
install(TARGETS ${PROJECT_NAME}
RUNTIME DESTINATION bin
)
install(TARGETS ${PROJECT_NAME}
DESTINATION lib/${PROJECT_NAME}
)
# Install header files
install(
DIRECTORY include/
DESTINATION include
)
# Flags
if(UNIX OR APPLE)
# Linker flags.
if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU" OR ${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
# GCC specific flags. ICC is compatible with them.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} -z noexecstack -z relro -z now")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -z noexecstack -z relro -z now")
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
# In Clang, -z flags are not compatible, they need to be passed to linker via -Wl.
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} \
-Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} \
-Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now")
endif()
# Compiler flags.
if(${CMAKE_CXX_COMPILER_ID} STREQUAL "GNU")
# GCC specific flags.
if(CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.9 OR
CMAKE_CXX_COMPILER_VERSION VERSION_EQUAL 4.9)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector-strong")
else()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
endif()
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Clang")
# Clang is compatible with some of the flags.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIE -fstack-protector")
elseif(${CMAKE_CXX_COMPILER_ID} STREQUAL "Intel")
# Same as above, with the exception that ICC compilation crashes with the -fPIE option, even
# though it uses the -pie linker option that requires -fPIE during compilation. Checksec
# shows that it generates a correct PIE anyway if only -pie is provided.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fstack-protector")
endif()
# Generic flags.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIC -fno-operator-names -Wformat -Wformat-security \
-Wall -fopenmp")
set( CUDA_PROPAGATE_HOST_FLAGS OFF )
set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} -D_FORTIFY_SOURCE=2")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -pie")
endif()
if(BUILD_TESTING)
find_package(ament_lint_auto REQUIRED)
ament_lint_auto_find_test_dependencies()
add_subdirectory(tests)
endif()
ament_package()
================================================
FILE: grasp_ros2/cfg/grasp_ros2_params.yaml
================================================
GraspDetectorGPD:
ros__parameters:
cloud_topic: /camera/pointcloud
#cloud_topic: /mechmind/color_point_cloud
rviz: true
device: 0 # 0:CPU, 1:GPU, 2:VPU
auto_mode: true
plane_remove: true
# grasp workspace in camera frames
workspace: [-0.21, 0.29, -0.22, 0.15, 0.0, 1.0] # Realsense
#workspace: [-0.16, 0.34, -0.26, 0.14, 1.4, 1.8] # Mechmind
# gripper geometry parameters in metre
# finger_width: the finger thickness
# hand_outer_diameter: the maximum robot hand aperture
# hand_depth: the hand depth (the finger length)
# hand_height: the finger breadth
finger_width: 0.005
hand_outer_diameter: 0.100
hand_depth: 0.038
hand_height: 0.020
GraspPlanner:
ros__parameters:
grasp_score_threshold: 20
grasp_frame_id: "camera_color_optical_frame" # Realsense
#grasp_frame_id: "mechmind_camera" # Mechmind
grasp_offset: [0.000, 0.000, 0.0]
eef_offset: 0.174
eef_yaw_offset: 0.7854 # M_PI/4
finger_joint_names: ["panda_finger_joint1", "panda_finger_joint2"]
================================================
FILE: grasp_ros2/cfg/random_pick.yaml
================================================
GraspDetectorGPD:
ros__parameters:
#cloud_topic: /camera/pointcloud
cloud_topic: /mechmind/color_point_cloud
rviz: true
device: 0 # 0:CPU, 1:GPU, 2:VPU
auto_mode: true
plane_remove: true
# grasp workspace in camera frames
#workspace: [-0.21, 0.29, -0.22, 0.15, 0.0, 1.0] # Realsense
workspace: [-0.16, 0.28, -0.26, 0.14, 1.4, 1.65] # Mechmind
# gripper geometry parameters in metre
# finger_width: the finger thickness
# hand_outer_diameter: the maximum robot hand aperture
# hand_depth: the hand depth (the finger length)
# hand_height: the finger breadth
finger_width: 0.005
hand_outer_diameter: 0.100
hand_depth: 0.038
hand_height: 0.020
num_samples: 200
GraspPlanner:
ros__parameters:
grasp_score_threshold: 0
grasp_frame_id: "base"
grasp_approach: [0.0, 0.0, -1.0]
grasp_approach_angle: 0.523 # 1.047=PI/3 # 0.785=PI/4 # 0.523=PI/6 # 0.345=PI/9 # acceptable approaching angle
grasp_offset: [0.004, 0.000, 0.0]
# grasp boundary in grasp_frame_id
grasp_boundry: [-0.2, 0.2, -0.65, -0.30, -0.15, 0.15]
eef_offset: 0.162
eef_yaw_offset: -0.7854 # M_PI/4
finger_joint_names: ["panda_finger_joint1", "panda_finger_joint2"]
================================================
FILE: grasp_ros2/cfg/recognize_pick.yaml
================================================
GraspDetectorGPD:
ros__parameters:
cloud_topic: /camera/pointcloud
# cloud_topic: "/camera/aligned_depth_to_color/color/points"
object_topic: "/ros2_openvino_toolkit/segmented_obejcts"
rviz: true
device: 1 # 0:CPU, 1:GPU, 2:VPU
auto_mode: false
plane_remove: true
object_detect: true
# grasp workspace in camera frames
workspace: [-0.23, 0.23, -0.33, 0.05, 0.0, 1.0]
# gripper geometry parameters in metre
# finger_width: the finger thickness
# hand_outer_diameter: the maximum robot hand aperture
# hand_depth: the hand depth (the finger length)
# hand_height: the finger breadth
finger_width: 0.005
hand_outer_diameter: 0.100
hand_depth: 0.038
hand_height: 0.020
GraspPlanner:
ros__parameters:
grasp_score_threshold: 1
grasp_frame_id: "base"
grasp_approach: [0.0, 0.0, -1.0] # expect approaching in -z axis
grasp_approach_angle: 0.7 # 1.047=PI/3 # 0.785=PI/4 # 0.523=PI/6 # 0.345=PI/9 # acceptable approaching angle
grasp_offset: [0.006, -0.003, 0.000]
# grasp boundary in grasp_frame_id
grasp_boundry: [-0.2, 0.2, -0.65, -0.30, -0.15, 0.15]
eef_offset: 0.154
eef_yaw_offset: 0.7854 # M_PI/4
finger_joint_names: ["panda_finger_joint1", "panda_finger_joint2"]
================================================
FILE: grasp_ros2/cfg/test_grasp_ros2.yaml
================================================
GraspDetectorGPD:
ros__parameters:
cloud_topic: /camera/pointcloud
rviz: false
device: 0
auto_mode: false
GraspPlanner:
ros__parameters:
grasp_score_threshold: 0
grasp_frame_id: "camera_color_optical_frame"
================================================
FILE: grasp_ros2/include/grasp_library/ros2/consts.hpp
================================================
// Copyright (c) 2019 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#ifndef GRASP_LIBRARY__ROS2__CONSTS_HPP_
#define GRASP_LIBRARY__ROS2__CONSTS_HPP_
#include <string>
namespace grasp_ros2
{
/** Consts class
*
* \brief A class that contains global constant definitions for the grasp library.
*
*/
class Consts
{
public:
/** Topic name of "PointCloud2" message published by an RGBD sensor.*/
static const char kTopicPointCloud2[];
/** Topic name of "detected objects" message published by Object Detector.*/
static const char kTopicDetectedObjects[];
/** Topic name of "detected grasps" message published by this Grasp Detector.*/
static const char kTopicDetectedGrasps[];
/** Topic name of "rviz grasps" message published by this Grasp Detector.*/
static const char kTopicVisualGrasps[];
/** Topic name of "tabletop pointcloud" message published by this Grasp Detector.*/
static const char kTopicTabletop[];
};
} // namespace grasp_ros2
#endif // GRASP_LIBRARY__ROS2__CONSTS_HPP_
================================================
FILE: grasp_ros2/include/grasp_library/ros2/grasp_detector_base.hpp
================================================
// Copyright (c) 2019 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#ifndef GRASP_LIBRARY__ROS2__GRASP_DETECTOR_BASE_HPP_
#define GRASP_LIBRARY__ROS2__GRASP_DETECTOR_BASE_HPP_
#include <grasp_msgs/msg/grasp_config_list.hpp>
#include <string>
namespace grasp_ros2
{
/** GraspCallback class
*
* \brief Abstract base class for grasp callback.
*
* A grasp planner inherits from this class to get called back with grasp detection results.
*/
class GraspCallback
{
public:
/**
* \brief Callback for grasp detection results.
*
* \param msg Pointer to grasp detection results.
*/
virtual void grasp_callback(const grasp_msgs::msg::GraspConfigList::SharedPtr msg) = 0;
/**
* \brief Virtual destructor, so implementations can be destroyed through this interface.
*/
virtual ~GraspCallback() = default;
};
/** GraspDetectorBase class
*
* \brief A base class for detecting grasp poses from visual input.
*
* This class defines a uniform interface for the grasp library, regardless of which algorithm
* is used for grasp detection.
*/
class GraspDetectorBase
{
public:
/**
* \brief Constructor.
*/
GraspDetectorBase()
: object_name_(""), grasp_cb_(nullptr)
{
}
/**
* \brief Destructor.
*/
~GraspDetectorBase()
{
}
/**
* \brief Start grasp detection.
* When this function is called, GraspDetector starts processing visual input.
* \param name Name of the object for which to detect grasps
*/
void start(std::string name = "")
{
started_ = true;
object_name_ = name;
}
/**
* \brief Stop grasp detection.
* When this function is called, GraspDetector stops processing visual input.
*/
void stop()
{
started_ = false;
}
/**
* \brief Register grasp callback function.
*
* \param cb Callback function to be registered.
*/
void add_callback(GraspCallback * cb)
{
grasp_cb_ = cb;
}
protected:
bool started_ = false;
std::string object_name_;
GraspCallback * grasp_cb_;
};
} // namespace grasp_ros2
#endif // GRASP_LIBRARY__ROS2__GRASP_DETECTOR_BASE_HPP_
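The observer contract above (a GraspCallback registered into GraspDetectorBase, with delivery gated by start()/stop()) can be sketched without any ROS dependencies. Everything below is a hypothetical, self-contained stub for illustration only; the Stub* types are not part of grasp_msgs or this library:

```cpp
#include <cassert>
#include <string>

// Hypothetical stand-in for grasp_msgs::msg::GraspConfigList (illustration only).
struct StubGraspList
{
  std::string object_name;
};

// Mirrors the GraspCallback interface: a planner implements grasp_callback().
struct StubCallback
{
  virtual ~StubCallback() = default;
  virtual void grasp_callback(const StubGraspList & msg) = 0;
};

// Mirrors GraspDetectorBase: start()/stop() gate detection, add_callback() registers.
class StubDetector
{
public:
  void start(std::string name = "") {started_ = true; object_name_ = name;}
  void stop() {started_ = false;}
  void add_callback(StubCallback * cb) {grasp_cb_ = cb;}
  // A concrete detector would invoke this when detection results are ready.
  void publish_results()
  {
    if (started_ && grasp_cb_) {grasp_cb_->grasp_callback(StubGraspList{object_name_});}
  }

private:
  bool started_ = false;
  std::string object_name_;
  StubCallback * grasp_cb_ = nullptr;
};

// A minimal planner that records the object it was called back for.
struct StubPlanner : StubCallback
{
  std::string last_object;
  void grasp_callback(const StubGraspList & msg) override {last_object = msg.object_name;}
};
```

After stop(), publish_results() delivers nothing, which is how a planner can bound detection to the lifetime of one service request.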
================================================
FILE: grasp_ros2/include/grasp_library/ros2/grasp_detector_gpd.hpp
================================================
// Copyright (c) 2018 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#ifndef GRASP_LIBRARY__ROS2__GRASP_DETECTOR_GPD_HPP_
#define GRASP_LIBRARY__ROS2__GRASP_DETECTOR_GPD_HPP_
// ROS2
#include <geometry_msgs/msg/point.hpp>
#include <geometry_msgs/msg/vector3.hpp>
#ifdef RECOGNIZE_PICK
#include <people_msgs/msg/objects_in_masks.hpp>
#endif
#include <rclcpp/logger.hpp>
#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/point_cloud2.hpp>
#include <visualization_msgs/msg/marker_array.hpp>
// PCL
#include <pcl/common/common.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
// eigen
#include <Eigen/Geometry>
// GPG
#include <gpg/cloud_camera.h>
// this project (messages)
#include <gpd/grasp_detector.h>
#include <grasp_msgs/msg/grasp_config.hpp>
#include <grasp_msgs/msg/grasp_config_list.hpp>
// system
#include <algorithm>
#include <map>
#include <string>
#include <tuple>
#include <vector>
#include "grasp_library/ros2/consts.hpp"
#include "grasp_library/ros2/grasp_detector_base.hpp"
#include "grasp_library/ros2/grasp_planner.hpp"
namespace grasp_ros2
{
typedef pcl::PointCloud<pcl::PointXYZRGBA> PointCloudRGBA;
typedef pcl::PointCloud<pcl::PointNormal> PointCloudPointNormal;
/** GraspDetectorGPD class
*
* \brief A ROS node that can detect grasp poses in a point cloud.
*
* This class is a ROS node that handles all the ROS topics.
*
*/
class GraspDetectorGPD : public rclcpp::Node, public GraspDetectorBase
{
public:
/**
* \brief Constructor.
*/
explicit GraspDetectorGPD(const rclcpp::NodeOptions & options);
/**
* \brief Destructor.
*/
~GraspDetectorGPD()
{
delete cloud_camera_;
// todo stop and delete threads
}
private:
/**
* \brief Run the ROS node. Loops while waiting for incoming ROS messages.
*/
void onInit();
/**
* \brief Detect grasp poses in a point cloud received from a ROS topic.
* \return the list of grasp poses
*/
std::vector<Grasp> detectGraspPosesInTopic();
/**
* \brief Callback function for the ROS topic that contains the input point cloud.
* \param msg the incoming ROS message
*/
void cloud_callback(const sensor_msgs::msg::PointCloud2::SharedPtr msg);
#ifdef RECOGNIZE_PICK
/**
* \brief Callback function for the ROS topic that contains the detected and segmented objects
* \param msg The detected objects message
*/
void object_callback(const people_msgs::msg::ObjectsInMasks::SharedPtr msg);
#endif
/**
* \brief Create a ROS message that contains a list of grasp poses from a list of handles.
* \param hands the list of grasps
* \return the ROS message that contains the grasp poses
*/
grasp_msgs::msg::GraspConfigList createGraspListMsg(const std::vector<Grasp> & hands);
/**
* \brief Convert GPD Grasp into grasp message.
* \param hand A GPD grasp
* \return The Grasp message converted
*/
grasp_msgs::msg::GraspConfig convertToGraspMsg(const Grasp & hand);
/**
* \brief Convert GPD Grasps into visual grasp messages.
*/
visualization_msgs::msg::MarkerArray convertToVisualGraspMsg(
const std::vector<Grasp> & hands,
double outer_diameter, double hand_depth, double finger_width, double hand_height,
const std::string & frame_id);
/**
* \brief Create finger marker for visual grasp messages
*/
visualization_msgs::msg::Marker createFingerMarker(
const Eigen::Vector3d & center,
const Eigen::Matrix3d & frame, double length, double width, double height, int id,
const std::string & frame_id);
/**
* \brief Create hand base marker for visual grasp messages
*/
visualization_msgs::msg::Marker createHandBaseMarker(
const Eigen::Vector3d & start,
const Eigen::Vector3d & end, const Eigen::Matrix3d & frame, double length, double height,
int id,
const std::string & frame_id);
/** Converts an Eigen Vector into a Point message. Todo ROS2 eigen_conversions*/
void pointEigenToMsg(const Eigen::Vector3d & e, geometry_msgs::msg::Point & m)
{
m.x = e(0);
m.y = e(1);
m.z = e(2);
}
/** Converts an Eigen Vector into a Vector message. Todo ROS2 eigen_conversions*/
void vectorEigenToMsg(const Eigen::Vector3d & e, geometry_msgs::msg::Vector3 & m)
{
m.x = e(0);
m.y = e(1);
m.z = e(2);
}
Eigen::Vector3d view_point_; /**< (input) view point of the camera onto the point cloud*/
/** stores point cloud with (optional) camera information and surface normals*/
CloudCamera * cloud_camera_;
std_msgs::msg::Header cloud_camera_header_; /**< stores header of the point cloud*/
/** status variables for received (input) messages*/
bool has_cloud_;
std::string frame_; /**< point cloud frame*/
bool auto_mode_; /**< grasp detection mode*/
bool plane_remove_; /**< whether to remove the tabletop plane from the point cloud*/
#ifdef RECOGNIZE_PICK
/** the latest message on detected objects*/
people_msgs::msg::ObjectsInMasks::SharedPtr object_msg_;
#endif
/** workspace bounds [xmin, xmax, ymin, ymax, zmin, zmax] of the recognized object*/
std::vector<double> grasp_ws_;
rclcpp::callback_group::CallbackGroup::SharedPtr callback_group_subscriber1_;
rclcpp::callback_group::CallbackGroup::SharedPtr callback_group_subscriber2_;
/** ROS2 subscriber for point cloud messages*/
rclcpp::Subscription<sensor_msgs::msg::PointCloud2>::SharedPtr cloud_sub_;
#ifdef RECOGNIZE_PICK
/** ROS2 subscriber for object messages*/
rclcpp::Subscription<people_msgs::msg::ObjectsInMasks>::SharedPtr object_sub_;
#endif
/** ROS2 publisher for grasp list messages*/
rclcpp::Publisher<grasp_msgs::msg::GraspConfigList>::SharedPtr grasps_pub_;
/** ROS2 publisher for filtered point clouds*/
rclcpp::Publisher<sensor_msgs::msg::PointCloud2>::SharedPtr filtered_pub_;
/** ROS2 publisher for grasps in rviz (visualization)*/
rclcpp::Publisher<visualization_msgs::msg::MarkerArray>::SharedPtr grasps_rviz_pub_;
std::shared_ptr<GraspDetector> grasp_detector_; /**< used to run the grasp pose detection*/
GraspDetector::GraspDetectionParameters detection_param_; /**< grasp detector parameters*/
rclcpp::Logger logger_ = rclcpp::get_logger("GraspDetectorGPD");
std::thread * detector_thread_; /**< thread for grasp detection*/
};
} // namespace grasp_ros2
#endif // GRASP_LIBRARY__ROS2__GRASP_DETECTOR_GPD_HPP_
================================================
FILE: grasp_ros2/include/grasp_library/ros2/grasp_planner.hpp
================================================
// Copyright (c) 2018 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#ifndef GRASP_LIBRARY__ROS2__GRASP_PLANNER_HPP_
#define GRASP_LIBRARY__ROS2__GRASP_PLANNER_HPP_
#include <rclcpp/logger.hpp>
#include <rclcpp/rclcpp.hpp>
#include <grasp_msgs/msg/grasp_config_list.hpp>
#include <moveit_msgs/msg/grasp.h>
#include <moveit_msgs/srv/grasp_planning.hpp>
#include <tf2_geometry_msgs/tf2_geometry_msgs.h>
#include <tf2_ros/transform_listener.h>
#include <tf2_ros/static_transform_broadcaster.h>
#include <trajectory_msgs/msg/joint_trajectory_point.hpp>
#include <condition_variable>
#include <deque>
#include <map>
#include <memory>
#include <mutex>
#include <string>
#include <utility>
#include <vector>
#include "grasp_library/ros2/grasp_detector_base.hpp"
namespace grasp_ros2
{
/** GraspPlanner class
*
* \brief A MoveIt grasp planner
*
* This class provides a ROS service for MoveIt grasp planning. The Grasp Planner drives
* grasp detection and takes the results from the Grasp Detector.
*/
class GraspPlanner : public rclcpp::Node, public GraspCallback
{
public:
struct GraspPlanningParameters
{
/** timeout in seconds for a service request waiting for grasp detection result*/
int grasp_service_timeout_;
/** minimum score expected for grasps returned from this service*/
int grasp_score_threshold_;
/** frame id expected for grasps returned from this service*/
std::string grasp_frame_id_;
/** approach direction in grasp_frame_id_ expected for grasps*/
tf2::Vector3 grasp_approach_;
/** maximum angle in radians acceptable between the expected 'grasp_approach_' and
* the real approach returned from this service*/
double grasp_approach_angle_;
/** offset [x, y, z] in metres applied to the grasps detected*/
std::vector<double> grasp_offset_;
/** boundary cube in grasp_frame_id_ expected for grasps returned from this service*/
std::vector<double> grasp_boundry_;
/** offset in metres from the gripper base (finger root) to the parent link of gripper*/
double eef_offset;
/** gripper yaw offset to its parent link, in radians (e.g. 0.0, or M_PI/4)*/
double eef_yaw_offset;
/** minimum distance in metres for a grasp to approach and retreat*/
double grasp_min_distance_;
/** desired distance in metres for a grasp to approach and retreat*/
double grasp_desired_distance_;
/** joint names of gripper fingers*/
std::vector<std::string> finger_joint_names_;
/** trajectory points in 'open' status, for joints in the same order as 'finger_joint_names_'*/
trajectory_msgs::msg::JointTrajectoryPoint finger_points_open_;
/** trajectory points in 'close' status, for joints in the same order as 'finger_joint_names_'*/
trajectory_msgs::msg::JointTrajectoryPoint finger_points_close_;
};
/**
* \brief Constructor.
* \param grasp_detector Grasp Detector used by this planner.
*/
explicit GraspPlanner(
const rclcpp::NodeOptions & options,
GraspDetectorBase * grasp_detector = nullptr);
/**
* \brief Destructor.
*/
~GraspPlanner()
{
delete tfBuffer_;
}
void grasp_callback(const grasp_msgs::msg::GraspConfigList::SharedPtr msg);
/**
* \brief Grasp planning service handler.
* When a grasp service request comes, Grasp Planner tells the Grasp Detector to start grasp
* detection, waits for grasp callback arrival or till a configurable timeout period, then stops
* grasp detection, skips grasps with low scores, transforms grasps into the specified frame_id
* (if TF available), applies the configured offset, skips grasps outside the boundary, and returns the
* results via grasp service response.
*/
void grasp_service(
const std::shared_ptr<rmw_request_id_t> request_header,
const std::shared_ptr<moveit_msgs::srv::GraspPlanning::Request> req,
const std::shared_ptr<moveit_msgs::srv::GraspPlanning::Response> res);
private:
/**
* \brief Transform a grasp from original frame to the 'grasp_frame_id_' frame.
* The 'to' grasp is kept identical to the 'from' grasp if the transform is missing or fails.
* \param from The grasp to transform.
* \param to The transformed output.
* \param header Message header for the frame of the 'from' grasp.
* \return true if the transformation succeeds, otherwise false.
*/
bool transform(
grasp_msgs::msg::GraspConfig & from, grasp_msgs::msg::GraspConfig & to,
const std_msgs::msg::Header & header);
/**
* \brief Check if the grasp position is in boundary.
* \param p Grasp position.
* \return True if the grasp position is within the boundary, otherwise False.
*/
bool check_boundry(const geometry_msgs::msg::Point & p);
/**
* \brief Translate a grasp message to MoveIt message.
* 'Grasp.grasp_pose.pose.position' was translated from 'GraspConfig.bottom', which is the
* position closest to the 'parent_link' of the end-effector.
* \param grasp Grasp message to be translated.
* \param header Message header for the frame where the 'grasp' was detected.
* \return MoveIt message
*/
moveit_msgs::msg::Grasp toMoveIt(
grasp_msgs::msg::GraspConfig & grasp,
const std_msgs::msg::Header & header);
std::mutex m_;
std::condition_variable cv_;
GraspPlanningParameters param_;
rclcpp::Logger logger_ = rclcpp::get_logger("GraspPlanner");
/*buffer for grasps to be returned from this service*/
std::vector<moveit_msgs::msg::Grasp> moveit_grasps_;
rclcpp::callback_group::CallbackGroup::SharedPtr callback_group_subscriber3_;
rclcpp::Service<moveit_msgs::srv::GraspPlanning>::SharedPtr grasp_srv_; /*grasp service*/
tf2_ros::Buffer * tfBuffer_; /*buffer for transformation listener*/
std::shared_ptr<tf2_ros::TransformListener> tfListener_; /*Transform listener*/
tf2_ros::StaticTransformBroadcaster tfBroadcaster_; /*grasp pose transformation broadcaster*/
GraspDetectorBase * grasp_detector_; /*grasp detector node*/
};
} // namespace grasp_ros2
#endif // GRASP_LIBRARY__ROS2__GRASP_PLANNER_HPP_
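The grasp_service flow documented above (trigger detection, block until grasp_callback delivers results or a timeout elapses, drop low-scoring grasps) can be sketched with plain std threading. This is a hypothetical sketch of the wait/notify handshake, not the actual GraspPlanner implementation; the names on_grasps, wait_for_grasps, and g_grasp_scores are stand-ins:

```cpp
#include <cassert>
#include <chrono>
#include <condition_variable>
#include <mutex>
#include <thread>
#include <vector>

// Shared state guarded by a mutex: the service handler blocks on the condition
// variable until the detector's callback delivers results or the timeout expires.
std::mutex g_mutex;
std::condition_variable g_cv;
std::vector<double> g_grasp_scores;  // stands in for the planner's grasp buffer

// Callback side: keep only grasps above the score threshold, then wake the waiter.
void on_grasps(const std::vector<double> & detected, double score_threshold)
{
  {
    std::lock_guard<std::mutex> lk(g_mutex);
    for (double s : detected) {
      if (s >= score_threshold) {g_grasp_scores.push_back(s);}
    }
  }
  g_cv.notify_all();
}

// Service side: returns true if at least one grasp arrived before the timeout,
// analogous to waiting out 'grasp_service_timeout_'.
bool wait_for_grasps(int timeout_seconds)
{
  std::unique_lock<std::mutex> lk(g_mutex);
  return g_cv.wait_for(
    lk, std::chrono::seconds(timeout_seconds),
    [] {return !g_grasp_scores.empty();});
}
```

The predicate form of wait_for guards against spurious wakeups: the handler only returns early once results are actually buffered.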
================================================
FILE: grasp_ros2/include/grasp_library/ros2/ros_params.hpp
================================================
// Copyright (c) 2018 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#ifndef GRASP_LIBRARY__ROS2__ROS_PARAMS_HPP_
#define GRASP_LIBRARY__ROS2__ROS_PARAMS_HPP_
// ROS2 core
#include <rclcpp/rclcpp.hpp>
// ROS2 projects
#include <gpd/grasp_detector.h>
#include "grasp_library/ros2/grasp_planner.hpp"
namespace grasp_ros2
{
/** ROSParameters class
*
* \brief A class to bridge parameters passed from ROS.
*
*/
class ROSParameters
{
public:
static void getDetectionParams(
rclcpp::Node * node,
GraspDetector::GraspDetectionParameters & param);
static void getPlanningParams(rclcpp::Node * node, GraspPlanner::GraspPlanningParameters & param);
};
} // namespace grasp_ros2
#endif // GRASP_LIBRARY__ROS2__ROS_PARAMS_HPP_
================================================
FILE: grasp_ros2/package.xml
================================================
<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format2.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="2">
<name>grasp_ros2</name>
<version>0.5.0</version>
<description>ROS2 grasp library as MoveIt plug-in</description>
<maintainer email="sharron.liu@intel.com">Sharron LIU</maintainer>
<license>Apache License 2.0</license>
<buildtool_depend>ament_cmake</buildtool_depend>
<buildtool_depend>rosidl_default_generators</buildtool_depend>
<build_depend>builtin_interfaces</build_depend>
<build_depend>rclcpp</build_depend>
<build_depend>rclcpp_components</build_depend>
<build_depend>std_msgs</build_depend>
<build_depend>sensor_msgs</build_depend>
<build_depend>grasp_msgs</build_depend>
<build_depend>moveit_msgs</build_depend>
<build_depend>people_msgs</build_depend>
<build_depend>visualization_msgs</build_depend>
<build_depend>pcl_conversions</build_depend>
<build_depend>tf2</build_depend>
<build_depend>tf2_ros</build_depend>
<build_depend>tf2_geometry_msgs</build_depend>
<build_depend>trajectory_msgs</build_depend>
<exec_depend>rosidl_default_runtime</exec_depend>
<exec_depend>builtin_interfaces</exec_depend>
<exec_depend>rclcpp</exec_depend>
<exec_depend>rclcpp_components</exec_depend>
<exec_depend>std_msgs</exec_depend>
<exec_depend>sensor_msgs</exec_depend>
<exec_depend>grasp_msgs</exec_depend>
<exec_depend>moveit_msgs</exec_depend>
<exec_depend>people_msgs</exec_depend>
<exec_depend>visualization_msgs</exec_depend>
<exec_depend>pcl_conversions</exec_depend>
<exec_depend>tf2</exec_depend>
<exec_depend>tf2_ros</exec_depend>
<exec_depend>tf2_geometry_msgs</exec_depend>
<exec_depend>trajectory_msgs</exec_depend>
<test_depend>ament_lint_auto</test_depend>
<test_depend>ament_lint_common</test_depend>
<export>
<build_type>ament_cmake</build_type>
</export>
</package>
================================================
FILE: grasp_ros2/src/consts.cpp
================================================
// Copyright (c) 2019 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include "grasp_library/ros2/consts.hpp"
namespace grasp_ros2
{
const char Consts::kTopicPointCloud2[] = "/camera/pointcloud";
const char Consts::kTopicDetectedObjects[] = "/ros2_openvino_toolkit/segmented_obejcts";
const char Consts::kTopicDetectedGrasps[] = "/grasp_library/clustered_grasps";
const char Consts::kTopicVisualGrasps[] = "/grasp_library/grasps_rviz";
const char Consts::kTopicTabletop[] = "/grasp_library/tabletop_points";
} // namespace grasp_ros2
================================================
FILE: grasp_ros2/src/grasp_composition.cpp
================================================
// Copyright (c) 2019 Intel Corporation
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <rclcpp/rclcpp.hpp>
#include <memory>
#include "grasp_library/ros2/grasp_detector_gpd.hpp"
#include "grasp_library/ros2/grasp_planner.hpp"
using GraspDetectorGPD = grasp_ros2::GraspDetectorGPD;
using GraspDetectorBase = grasp_ros2::GraspDetectorBase;
using GraspPlanner = grasp_ros2::GraspPlanner;
int main(int argc, char ** argv)
{
rclcpp::init(argc, argv);
rclcpp::executors::MultiThreadedExecutor exec;
auto detect_node = std::make_shared<GraspDetectorGPD>(
rclcpp::NodeOptions().automatically_declare_parameters_from_overrides(true));
exec.add_node(detect_node);
GraspDetectorBase * grasp_detector = dynamic_cast<GraspDetectorBase *>(detect_node.get());
auto plan_node = std::make_shared<GraspPlanner>(
rclcpp::NodeOptions().automatically_declare_parameters_from_overrides(true), grasp_detector);
exec.add_node(plan_node);
exec.spin();
detect_node = nullptr;
plan_node = nullptr;
rclcpp::shutdown();
return 0;
}
================================================
FILE: grasp_ros2/src/grasp_detector_gpd.cpp
================================================
// Copyright (c) 2019 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <pcl_conversions/pcl_conversions.h>
#include <pcl/filters/passthrough.h>
#include <pcl/filters/crop_box.h>
#include <pcl/filters/extract_indices.h>
#include <limits>
#include <memory>
#include <string>
#include <thread>
#include <vector>
#include "grasp_library/ros2/grasp_detector_gpd.hpp"
#include "grasp_library/ros2/ros_params.hpp"
namespace grasp_ros2
{
GraspDetectorGPD::GraspDetectorGPD(const rclcpp::NodeOptions & options)
: Node("GraspDetectorGPD", options),
GraspDetectorBase(), cloud_camera_(NULL), has_cloud_(false), frame_(""),
#ifdef RECOGNIZE_PICK
object_msg_(nullptr), object_sub_(nullptr),
#endif
filtered_pub_(nullptr), grasps_rviz_pub_(nullptr)
{
std::vector<double> camera_position;
this->get_parameter_or("camera_position", camera_position,
std::vector<double>(std::initializer_list<double>({0, 0, 0})));
view_point_ << camera_position[0], camera_position[1], camera_position[2];
this->get_parameter_or("auto_mode", auto_mode_, true);
std::string cloud_topic, grasp_topic, rviz_topic, tabletop_topic, object_topic;
this->get_parameter_or("cloud_topic", cloud_topic,
std::string(Consts::kTopicPointCloud2));
bool rviz, object_detect;
this->get_parameter_or("rviz", rviz, false);
this->get_parameter_or("plane_remove", plane_remove_, false);
this->get_parameter_or("object_detect", object_detect, false);
callback_group_subscriber1_ = this->create_callback_group(
rclcpp::callback_group::CallbackGroupType::MutuallyExclusive);
auto sub1_opt = rclcpp::SubscriptionOptions();
sub1_opt.callback_group = callback_group_subscriber1_;
auto callback = [this](const sensor_msgs::msg::PointCloud2::SharedPtr msg) -> void {
this->cloud_callback(msg);
};
cloud_sub_ =
this->create_subscription<sensor_msgs::msg::PointCloud2>(cloud_topic,
rclcpp::QoS(10), callback, sub1_opt);
grasps_pub_ = this->create_publisher<grasp_msgs::msg::GraspConfigList>(
Consts::kTopicDetectedGrasps, 10);
if (rviz) {
grasps_rviz_pub_ = this->create_publisher<visualization_msgs::msg::MarkerArray>(
Consts::kTopicVisualGrasps, 10);
filtered_pub_ = this->create_publisher<sensor_msgs::msg::PointCloud2>(
Consts::kTopicTabletop, 10);
}
#ifdef RECOGNIZE_PICK
if (object_detect) {
callback_group_subscriber2_ = this->create_callback_group(
rclcpp::callback_group::CallbackGroupType::MutuallyExclusive);
auto sub2_opt = rclcpp::SubscriptionOptions();
sub2_opt.callback_group = callback_group_subscriber2_;
this->get_parameter_or("object_topic", object_topic,
std::string(Consts::kTopicDetectedObjects));
auto callback = [this](const people_msgs::msg::ObjectsInMasks::SharedPtr msg) -> void {
this->object_callback(msg);
};
object_sub_ =
this->create_subscription<people_msgs::msg::ObjectsInMasks>(object_topic,
rclcpp::QoS(10), callback, sub2_opt);
}
#endif
// GraspDetector::GraspDetectionParameters detection_param;
ROSParameters::getDetectionParams(this, detection_param_);
grasp_detector_ = std::make_shared<GraspDetector>(detection_param_);
RCLCPP_INFO(logger_, "ROS2 Grasp Library node up...");
detector_thread_ = new std::thread(&GraspDetectorGPD::onInit, this);
detector_thread_->detach();
}
void GraspDetectorGPD::onInit()
{
rclcpp::Rate rate(100);
RCLCPP_INFO(logger_, "Waiting for point cloud to arrive ...");
while (rclcpp::ok()) {
if (has_cloud_) {
// detect grasps in point cloud
std::vector<Grasp> grasps = detectGraspPosesInTopic();
// visualize grasps in rviz
if (grasps_rviz_pub_) {
const HandSearch::Parameters & params = grasp_detector_->getHandSearchParameters();
grasps_rviz_pub_->publish(convertToVisualGraspMsg(grasps, params.hand_outer_diameter_,
params.hand_depth_,
params.finger_width_, params.hand_height_, frame_));
}
// reset the system
has_cloud_ = false;
RCLCPP_INFO(logger_, "Waiting for point cloud to arrive ...");
}
// rclcpp::spin(shared_from_this());
rate.sleep();
}
}
std::vector<Grasp> GraspDetectorGPD::detectGraspPosesInTopic()
{
// detect grasp poses
std::vector<Grasp> grasps;
{
// preprocess the point cloud
grasp_detector_->preprocessPointCloud(*cloud_camera_);
// detect grasps in the point cloud
grasps = grasp_detector_->detectGrasps(*cloud_camera_);
}
// Publish the selected grasps.
grasp_msgs::msg::GraspConfigList selected_grasps_msg = createGraspListMsg(grasps);
if (grasp_cb_) {
grasp_cb_->grasp_callback(
std::make_shared<grasp_msgs::msg::GraspConfigList>(selected_grasps_msg));
}
grasps_pub_->publish(selected_grasps_msg);
RCLCPP_INFO(logger_, "Published %zu highest-scoring grasps.", selected_grasps_msg.grasps.size());
return grasps;
}
void GraspDetectorGPD::cloud_callback(const sensor_msgs::msg::PointCloud2::SharedPtr msg)
{
if (!auto_mode_ && !started_) {return;}
#ifdef RECOGNIZE_PICK
people_msgs::msg::ObjectsInMasks::SharedPtr object_msg;
if (object_sub_) {
if (object_name_.empty()) {
RCLCPP_INFO(logger_, "Waiting for object name...");
return;
}
object_msg = object_msg_;
object_msg_ = nullptr;
if (nullptr == object_msg || object_msg->objects_vector.empty()) {
RCLCPP_INFO(logger_, "Waiting for object callback...");
return;
}
}
#endif
RCLCPP_DEBUG(logger_, "PCD callback...");
if (!has_cloud_) {
delete cloud_camera_;
cloud_camera_ = NULL;
Eigen::Matrix3Xd view_points(3, 1);
view_points.col(0) = view_point_;
if (msg->fields.size() == 6 && msg->fields[3].name == "normal_x" &&
msg->fields[4].name == "normal_y" &&
msg->fields[5].name == "normal_z")
{
PointCloudPointNormal::Ptr cloud(new PointCloudPointNormal);
pcl::fromROSMsg(*msg, *cloud);
cloud_camera_ = new CloudCamera(cloud, 0, view_points);
cloud_camera_header_ = msg->header;
} else {
PointCloudRGBA::Ptr cloud(new PointCloudRGBA);
pcl::fromROSMsg(*msg, *cloud);
// filter workspace
for (uint32_t i = 0; i < cloud->size(); i++) {
if (cloud->points[i].x > detection_param_.workspace_[0] && cloud->points[i].x < detection_param_.workspace_[1] &&
cloud->points[i].y > detection_param_.workspace_[2] && cloud->points[i].y < detection_param_.workspace_[3] &&
cloud->points[i].z > detection_param_.workspace_[4] && cloud->points[i].z < detection_param_.workspace_[5]) {
continue;
} else {
cloud->points[i].x = std::numeric_limits<float>::quiet_NaN();
cloud->points[i].y = std::numeric_limits<float>::quiet_NaN();
cloud->points[i].z = std::numeric_limits<float>::quiet_NaN();
}
}
// remove table plane
if (plane_remove_) {
pcl::ModelCoefficients::Ptr coefficients(new pcl::ModelCoefficients);
pcl::PointIndices::Ptr inliers(new pcl::PointIndices);
pcl::SACSegmentation<pcl::PointXYZRGBA> seg;
seg.setOptimizeCoefficients(true);
seg.setModelType(pcl::SACMODEL_PLANE);
seg.setMethodType(pcl::SAC_RANSAC);
seg.setDistanceThreshold(0.025);
seg.setInputCloud(cloud);
seg.segment(*inliers, *coefficients);
for (size_t i = 0; i < inliers->indices.size(); ++i) {
cloud->points[inliers->indices[i]].x = std::numeric_limits<float>::quiet_NaN();
cloud->points[inliers->indices[i]].y = std::numeric_limits<float>::quiet_NaN();
cloud->points[inliers->indices[i]].z = std::numeric_limits<float>::quiet_NaN();
}
}
#ifdef RECOGNIZE_PICK
// filter object location
if (object_sub_) {
bool found = false;
for (auto obj : object_msg->objects_vector) {
if (0 == obj.object_name.compare(object_name_)) {
RCLCPP_INFO(logger_, "obj name %s prob %f roi [%d %d %d %d] %d %d",
obj.object_name.c_str(), obj.probability, obj.roi.x_offset, obj.roi.y_offset,
obj.roi.width, obj.roi.height, msg->width, msg->height);
std::vector<int> indices;
for (size_t i = 0; i < obj.roi.height; i++) { // rows
int idx = (i + obj.roi.y_offset) * msg->width + obj.roi.x_offset;
for (size_t j = 0; j < obj.roi.width; j++) { // columns
// todo use mask_array from object msg
if (!isnan(cloud->points[idx + j].x) &&
!isnan(cloud->points[idx + j].y) &&
!isnan(cloud->points[idx + j].z))
{
indices.push_back(idx + j);
}
}
}
pcl::ExtractIndices<pcl::PointXYZRGBA> filter;
filter.setInputCloud(cloud);
filter.setIndices(boost::make_shared<std::vector<int>>(indices));
filter.filter(*cloud);
Eigen::Matrix3Xf xyz =
cloud->getMatrixXfMap(3, sizeof(pcl::PointXYZRGBA) / sizeof(float), 0);
RCLCPP_INFO(logger_, "object cloud bounds x[%f %f] y[%f %f] z[%f %f]",
xyz.row(0).minCoeff(), xyz.row(0).maxCoeff(),
xyz.row(1).minCoeff(), xyz.row(1).maxCoeff(),
xyz.row(2).minCoeff(), xyz.row(2).maxCoeff());
grasp_ws_ = {xyz.row(0).minCoeff(), xyz.row(0).maxCoeff(),
xyz.row(1).minCoeff(), xyz.row(1).maxCoeff(),
xyz.row(2).minCoeff(), xyz.row(2).maxCoeff()};
found = true;
break;
}
}
if (!found) {return;}
}
#endif
if (filtered_pub_) {
sensor_msgs::msg::PointCloud2 msg2;
pcl::toROSMsg(*cloud, msg2);
// workaround rviz rgba
msg2.fields[3].name = "rgb";
msg2.fields[3].datatype = 7;
filtered_pub_->publish(msg2);
}
cloud_camera_ = new CloudCamera(cloud, 0, view_points);
cloud_camera_header_ = msg->header;
}
RCLCPP_INFO(logger_, "Received cloud with %zu points and normals.",
cloud_camera_->getCloudProcessed()->size());
has_cloud_ = true;
frame_ = msg->header.frame_id;
}
}
#ifdef RECOGNIZE_PICK
void GraspDetectorGPD::object_callback(const people_msgs::msg::ObjectsInMasks::SharedPtr msg)
{
RCLCPP_INFO(logger_, "Object callback received %zu objects", msg->objects_vector.size());
for (auto obj : msg->objects_vector) {
RCLCPP_INFO(logger_, "obj name %s prob %f roi[%d %d %d %d]",
obj.object_name.c_str(), obj.probability,
obj.roi.x_offset, obj.roi.y_offset, obj.roi.width, obj.roi.height);
}
if (msg->objects_vector.size() > 0) {
object_msg_ = msg;
}
}
#endif
grasp_msgs::msg::GraspConfigList GraspDetectorGPD::createGraspListMsg(
const std::vector<Grasp> & hands)
{
grasp_msgs::msg::GraspConfigList msg;
for (uint32_t i = 0; i < hands.size(); i++) {
msg.grasps.push_back(convertToGraspMsg(hands[i]));
}
msg.header = cloud_camera_header_;
msg.object_name = object_name_;
return msg;
}
grasp_msgs::msg::GraspConfig GraspDetectorGPD::convertToGraspMsg(const Grasp & hand)
{
grasp_msgs::msg::GraspConfig msg;
pointEigenToMsg(hand.getGraspBottom(), msg.bottom);
pointEigenToMsg(hand.getGraspTop(), msg.top);
pointEigenToMsg(hand.getGraspSurface(), msg.surface);
vectorEigenToMsg(hand.getApproach(), msg.approach);
vectorEigenToMsg(hand.getBinormal(), msg.binormal);
vectorEigenToMsg(hand.getAxis(), msg.axis);
msg.width.data = hand.getGraspWidth();
msg.score.data = hand.getScore();
pointEigenToMsg(hand.getSample(), msg.sample);
return msg;
}
visualization_msgs::msg::MarkerArray GraspDetectorGPD::convertToVisualGraspMsg(
const std::vector<Grasp> & hands,
double outer_diameter, double hand_depth, double finger_width, double hand_height,
const std::string & frame_id)
{
double width = outer_diameter;
double hw = 0.5 * width;
visualization_msgs::msg::MarkerArray marker_array;
visualization_msgs::msg::Marker left_finger, right_finger, base, approach;
Eigen::Vector3d left_bottom, right_bottom, left_top, right_top, left_center, right_center,
approach_center,
base_center;
for (uint32_t i = 0; i < hands.size(); i++) {
left_bottom = hands[i].getGraspBottom() - (hw - 0.5 * finger_width) * hands[i].getBinormal();
right_bottom = hands[i].getGraspBottom() + (hw - 0.5 * finger_width) * hands[i].getBinormal();
left_top = left_bottom + hand_depth * hands[i].getApproach();
right_top = right_bottom + hand_depth * hands[i].getApproach();
left_center = left_bottom + 0.5 * (left_top - left_bottom);
right_center = right_bottom + 0.5 * (right_top - right_bottom);
base_center = left_bottom + 0.5 * (right_bottom - left_bottom) - 0.01 * hands[i].getApproach();
approach_center = base_center - 0.04 * hands[i].getApproach();
base = createHandBaseMarker(left_bottom, right_bottom,
hands[i].getFrame(), 0.02, hand_height, i, frame_id);
left_finger = createFingerMarker(left_center,
hands[i].getFrame(), hand_depth, finger_width, hand_height, i * 3, frame_id);
right_finger = createFingerMarker(right_center,
hands[i].getFrame(), hand_depth, finger_width, hand_height, i * 3 + 1, frame_id);
approach = createFingerMarker(approach_center,
hands[i].getFrame(), 0.08, finger_width, hand_height, i * 3 + 2, frame_id);
marker_array.markers.push_back(left_finger);
marker_array.markers.push_back(right_finger);
marker_array.markers.push_back(approach);
marker_array.markers.push_back(base);
}
return marker_array;
}
visualization_msgs::msg::Marker GraspDetectorGPD::createFingerMarker(
const Eigen::Vector3d & center,
const Eigen::Matrix3d & frame, double length, double width, double height, int id,
const std::string & frame_id)
{
visualization_msgs::msg::Marker marker;
marker.header.frame_id = frame_id;
marker.header.stamp = rclcpp::Clock(RCL_ROS_TIME).now();
marker.ns = "finger";
marker.id = id;
marker.type = visualization_msgs::msg::Marker::CUBE;
marker.action = visualization_msgs::msg::Marker::ADD;
marker.pose.position.x = center(0);
marker.pose.position.y = center(1);
marker.pose.position.z = center(2);
marker.lifetime = rclcpp::Duration(20.0, 0);
// use orientation of hand frame
Eigen::Quaterniond quat(frame);
marker.pose.orientation.x = quat.x();
marker.pose.orientation.y = quat.y();
marker.pose.orientation.z = quat.z();
marker.pose.orientation.w = quat.w();
// these scales are relative to the hand frame (unit: meters)
marker.scale.x = length; // forward direction
marker.scale.y = width; // hand closing direction
marker.scale.z = height; // hand vertical direction
marker.color.a = 0.5;
marker.color.r = 0.0;
marker.color.g = 0.0;
marker.color.b = 0.5;
return marker;
}
visualization_msgs::msg::Marker GraspDetectorGPD::createHandBaseMarker(
const Eigen::Vector3d & start,
const Eigen::Vector3d & end, const Eigen::Matrix3d & frame, double length, double height, int id,
const std::string & frame_id)
{
Eigen::Vector3d center = start + 0.5 * (end - start);
visualization_msgs::msg::Marker marker;
marker.header.frame_id = frame_id;
marker.header.stamp = rclcpp::Clock(RCL_ROS_TIME).now();
marker.ns = "hand_base";
marker.id = id;
marker.type = visualization_msgs::msg::Marker::CUBE;
marker.action = visualization_msgs::msg::Marker::ADD;
marker.pose.position.x = center(0);
marker.pose.position.y = center(1);
marker.pose.position.z = center(2);
marker.lifetime = rclcpp::Duration(20.0, 0);
// use orientation of hand frame
Eigen::Quaterniond quat(frame);
marker.pose.orientation.x = quat.x();
marker.pose.orientation.y = quat.y();
marker.pose.orientation.z = quat.z();
marker.pose.orientation.w = quat.w();
// these scales are relative to the hand frame (unit: meters)
marker.scale.x = length; // forward direction
marker.scale.y = (end - start).norm(); // hand closing direction
marker.scale.z = height; // hand vertical direction
marker.color.a = 0.5;
marker.color.r = 0.0;
marker.color.g = 0.0;
marker.color.b = 1.0;
return marker;
}
} // namespace grasp_ros2
#include "rclcpp_components/register_node_macro.hpp"
RCLCPP_COMPONENTS_REGISTER_NODE(grasp_ros2::GraspDetectorGPD)
================================================
FILE: grasp_ros2/src/grasp_planner.cpp
================================================
// Copyright (c) 2019 Intel Corporation
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <tf2/LinearMath/Matrix3x3.h>
#include <tf2/LinearMath/Quaternion.h>
#include <tf2_ros/transform_listener.h>
#include <algorithm>
#include <condition_variable>
#include <map>
#include <memory>
#include <mutex>
#include <string>
#include <utility>
#include "grasp_library/ros2/consts.hpp"
#include "grasp_library/ros2/grasp_planner.hpp"
#include "grasp_library/ros2/ros_params.hpp"
namespace grasp_ros2
{
using GraspPlanning = moveit_msgs::srv::GraspPlanning;
GraspPlanner::GraspPlanner(const rclcpp::NodeOptions & options, GraspDetectorBase * grasp_detector)
: Node("GraspPlanner", options),
GraspCallback(), tfBroadcaster_(this), grasp_detector_(grasp_detector)
{
ROSParameters::getPlanningParams(this, param_);
callback_group_subscriber3_ = this->create_callback_group(
rclcpp::callback_group::CallbackGroupType::MutuallyExclusive);
auto service = [this](const std::shared_ptr<rmw_request_id_t> request_header,
const std::shared_ptr<GraspPlanning::Request> req,
const std::shared_ptr<GraspPlanning::Response> res) -> void {
this->grasp_service(request_header, req, res);
};
grasp_srv_ = this->create_service<GraspPlanning>("plan_grasps", service, rmw_qos_profile_default,
callback_group_subscriber3_);
grasp_detector_->add_callback(this);
tfBuffer_ = new tf2_ros::Buffer(std::make_shared<rclcpp::Clock>(RCL_ROS_TIME));
tfListener_ = std::make_shared<tf2_ros::TransformListener>(*tfBuffer_);
RCLCPP_INFO(logger_, "ROS2 Grasp Planning Service up...");
}
void GraspPlanner::grasp_callback(const grasp_msgs::msg::GraspConfigList::SharedPtr msg)
{
RCLCPP_INFO(logger_, "Received grasp callback");
rclcpp::Time rclcpp_time = std::make_shared<rclcpp::Clock>(RCL_ROS_TIME)->now() +
rclcpp::Duration(6, 0);
// Determined once, from the first message: whether grasps need a frame transform
static bool tf_needed = (param_.grasp_frame_id_ != msg->header.frame_id);
RCLCPP_INFO(logger_, "tf_needed %d", tf_needed);
std_msgs::msg::Header header;
header.frame_id = tf_needed ? param_.grasp_frame_id_ : msg->header.frame_id;
header.stamp = msg->header.stamp;
grasp_msgs::msg::GraspConfig to_grasp;
for (auto from_grasp : msg->grasps) {
// skip low score grasp
if (from_grasp.score.data < param_.grasp_score_threshold_) {
RCLCPP_INFO(logger_, "skip low score grasps %f", from_grasp.score.data);
continue;
}
// transform grasp to grasp_frame_id
if (tf_needed) {
if (!transform(from_grasp, to_grasp, msg->header)) {
// skip transformation failure
continue;
}
}
if (param_.grasp_approach_angle_ != M_PI) {
// skip unacceptable approach
tf2::Vector3 approach(to_grasp.approach.x, to_grasp.approach.y, to_grasp.approach.z);
double ang = tf2::tf2Angle(param_.grasp_approach_, approach);
if (std::isnan(ang) ||
ang < -param_.grasp_approach_angle_ || ang > param_.grasp_approach_angle_)
{
RCLCPP_INFO(logger_, "skip unacceptable approach");
continue;
}
}
// apply grasp offset
to_grasp.bottom.x += param_.grasp_offset_[0];
to_grasp.bottom.y += param_.grasp_offset_[1];
to_grasp.bottom.z += param_.grasp_offset_[2];
// skip out of boundary grasps
if (!tf_needed || check_boundry(to_grasp.bottom)) {
// translate into moveit grasp
moveit_msgs::msg::Grasp moveit_msg = toMoveIt(to_grasp, header);
std::unique_lock<std::mutex> lock(m_);
moveit_grasps_.push_back(moveit_msg);
}
}
}
bool GraspPlanner::transform(
grasp_msgs::msg::GraspConfig & from, grasp_msgs::msg::GraspConfig & to,
const std_msgs::msg::Header & header)
{
geometry_msgs::msg::PointStamped from_top, to_top, from_surface, to_surface,
from_bottom, to_bottom;
geometry_msgs::msg::Vector3Stamped from_approach, to_approach, from_binormal, to_binormal,
from_axis, to_axis;
to = from;
from_top.point = from.top;
from_top.header = header;
from_surface.point = from.surface;
from_surface.header = header;
from_bottom.point = from.bottom;
from_bottom.header = header;
from_approach.vector = from.approach;
from_approach.header = header;
from_binormal.vector = from.binormal;
from_binormal.header = header;
from_axis.vector = from.axis;
from_axis.header = header;
while (rclcpp::ok()) {
try {
tfBuffer_->transform(from_top, to_top, param_.grasp_frame_id_);
tfBuffer_->transform(from_surface, to_surface, param_.grasp_frame_id_);
tfBuffer_->transform(from_bottom, to_bottom, param_.grasp_frame_id_);
tfBuffer_->transform(from_approach, to_approach, param_.grasp_frame_id_);
tfBuffer_->transform(from_binormal, to_binormal, param_.grasp_frame_id_);
tfBuffer_->transform(from_axis, to_axis, param_.grasp_frame_id_);
} catch (tf2::TransformException & ex) {
RCLCPP_WARN(logger_, "transform exception: %s", ex.what());
rclcpp::Rate(1).sleep();
continue;
}
break;
}
to.top = to_top.point;
to.surface = to_surface.point;
to.bottom = to_bottom.point;
to.approach = to_approach.vector;
to.binormal = to_binormal.vector;
to.axis = to_axis.vector;
return true;
}
bool GraspPlanner::check_boundry(const geometry_msgs::msg::Point & p)
{
RCLCPP_INFO(logger_, "point [%f %f %f]", p.x, p.y, p.z);
return p.x >= param_.grasp_boundry_[0] && p.x <= param_.grasp_boundry_[1] &&
p.y >= param_.grasp_boundry_[2] && p.y <= param_.grasp_boundry_[3] &&
p.z >= param_.grasp_boundry_[4] && p.z <= param_.grasp_boundry_[5];
}
moveit_msgs::msg::Grasp GraspPlanner::toMoveIt(
grasp_msgs::msg::GraspConfig & grasp,
const std_msgs::msg::Header & header)
{
moveit_msgs::msg::Grasp msg;
msg.grasp_pose.header = header;
msg.grasp_quality = grasp.score.data;
double offset = param_.eef_offset;
// set grasp position, translation from hand-base to the parent-link of EEF
msg.grasp_pose.pose.position.x = grasp.bottom.x - grasp.approach.x * offset;
msg.grasp_pose.pose.position.y = grasp.bottom.y - grasp.approach.y * offset;
msg.grasp_pose.pose.position.z = grasp.bottom.z - grasp.approach.z * offset;
// rotation matrix with columns (binormal, axis, approach) of the GPD hand frame,
// see https://github.com/atenpas/gpd/blob/master/tutorials/hand_frame.png
tf2::Matrix3x3 r(
grasp.binormal.x, grasp.axis.x, grasp.approach.x,
grasp.binormal.y, grasp.axis.y, grasp.approach.y,
grasp.binormal.z, grasp.axis.z, grasp.approach.z);
tf2::Quaternion quat;
r.getRotation(quat);
// EEF yaw-offset to its parent-link (last link of arm)
quat *= tf2::Quaternion(tf2::Vector3(0, 0, 1), param_.eef_yaw_offset);
quat.normalize();
// set grasp orientation
msg.grasp_pose.pose.orientation = tf2::toMsg(quat);
RCLCPP_INFO(logger_, "eef offset %f, orientation [%f %f %f %f]", offset,
msg.grasp_pose.pose.orientation.x, msg.grasp_pose.pose.orientation.y,
msg.grasp_pose.pose.orientation.z, msg.grasp_pose.pose.orientation.w);
// set pre-grasp approach
msg.pre_grasp_approach.direction.header = header;
msg.pre_grasp_approach.direction.vector = grasp.approach;
msg.pre_grasp_approach.min_distance = param_.grasp_min_distance_;
msg.pre_grasp_approach.desired_distance = param_.grasp_desired_distance_;
// set post-grasp retreat
msg.post_grasp_retreat.direction.header = header;
msg.post_grasp_retreat.direction.vector.x = -grasp.approach.x;
msg.post_grasp_retreat.direction.vector.y = -grasp.approach.y;
msg.post_grasp_retreat.direction.vector.z = -grasp.approach.z;
msg.post_grasp_retreat.min_distance = param_.grasp_min_distance_;
msg.post_grasp_retreat.desired_distance = param_.grasp_desired_distance_;
// set pre-grasp posture
msg.pre_grasp_posture.joint_names = param_.finger_joint_names_;
msg.pre_grasp_posture.points.push_back(param_.finger_points_open_);
// set grasp posture
msg.grasp_posture.joint_names = param_.finger_joint_names_;
msg.grasp_posture.points.push_back(param_.finger_points_close_);
return msg;
}
void GraspPlanner::grasp_service(
const std::shared_ptr<rmw_request_id_t> request_header,
const std::shared_ptr<GraspPlanning::Request> req,
const std::shared_ptr<GraspPlanning::Response> res)
{
(void)request_header;
RCLCPP_INFO(logger_, "Received Grasp Planning request");
{
std::unique_lock<std::mutex> lock(m_);
moveit_grasps_.clear();
grasp_detector_->start(req->target.id);
}
// block until grasps are found
while (moveit_grasps_.empty()) {
rclcpp::Rate(20).sleep();
}
grasp_detector_->stop();
res->grasps = moveit_grasps_;
if (res->grasps.empty()) {
RCLCPP_INFO(logger_, "No expected grasp found.");
res->error_code.val = moveit_msgs::msg::MoveItErrorCodes::FAILURE;
} else {
RCLCPP_INFO(logger_, "%ld grasps found.", res->grasps.size());
res->error_code.val = moveit_msgs::msg::MoveItErrorCodes::SUCCESS;
}
}
} // namespace grasp_ros2
#include "rclcpp_components/register_node_macro.hpp"
RCLCPP_COMPONENTS_REGISTER_NODE(grasp_ros2::GraspPlanner)
================================================
FILE: grasp_ros2/src/ros_params.cpp
================================================
// Copyright (c) 2018 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <string>
#include <vector>
#include "grasp_library/ros2/ros_params.hpp"
namespace grasp_ros2
{
void ROSParameters::getDetectionParams(
rclcpp::Node * node,
GraspDetector::GraspDetectionParameters & param)
{
// Read hand geometry parameters.
node->get_parameter_or("finger_width", param.hand_search_params.finger_width_, 0.005);
node->get_parameter_or("hand_outer_diameter", param.hand_search_params.hand_outer_diameter_,
0.12);
node->get_parameter_or("hand_depth", param.hand_search_params.hand_depth_, 0.06);
node->get_parameter_or("hand_height", param.hand_search_params.hand_height_, 0.02);
node->get_parameter_or("init_bite", param.hand_search_params.init_bite_, 0.01);
// Read local hand search parameters.
node->get_parameter_or("nn_radius", param.hand_search_params.nn_radius_frames_, 0.01);
node->get_parameter_or("num_orientations", param.hand_search_params.num_orientations_, 8);
node->get_parameter_or("num_samples", param.hand_search_params.num_samples_, 100);
node->get_parameter_or("num_threads", param.hand_search_params.num_threads_, 4);
node->get_parameter_or("rotation_axis", param.hand_search_params.rotation_axis_, 2);
// Read plotting parameters.
node->get_parameter_or("plot_samples", param.plot_samples_, false);
node->get_parameter_or("plot_normals", param.plot_normals_, false);
param.generator_params.plot_normals_ = param.plot_normals_;
node->get_parameter_or("plot_filtered_grasps", param.plot_filtered_grasps_, false);
node->get_parameter_or("plot_valid_grasps", param.plot_valid_grasps_, false);
node->get_parameter_or("plot_clusters", param.plot_clusters_, false);
node->get_parameter_or("plot_selected_grasps", param.plot_selected_grasps_, false);
// Read general parameters.
param.generator_params.num_samples_ = param.hand_search_params.num_samples_;
param.generator_params.num_threads_ = param.hand_search_params.num_threads_;
node->get_parameter_or("plot_candidates", param.generator_params.plot_grasps_, false);
// Read preprocessing parameters.
node->get_parameter_or("remove_outliers", param.generator_params.remove_statistical_outliers_,
false);
node->get_parameter_or("voxelize", param.generator_params.voxelize_, true);
node->get_parameter_or("workspace", param.generator_params.workspace_,
std::vector<double>(std::initializer_list<double>({-1.0, 1.0, -1.0, 1.0, -1.0, 1.0})));
param.workspace_ = param.generator_params.workspace_;
// Read classification parameters and create classifier.
node->get_parameter_or("model_file", param.model_file_, std::string(""));
node->get_parameter_or("trained_file", param.weights_file_, std::string(""));
node->get_parameter_or("min_score_diff", param.min_score_diff_, 500.0);
node->get_parameter_or("create_image_batches", param.create_image_batches_, false);
node->get_parameter_or("device", param.device_, 0);
// Read grasp image parameters.
node->get_parameter_or("image_outer_diameter", param.image_params.outer_diameter_,
param.hand_search_params.hand_outer_diameter_);
node->get_parameter_or("image_depth", param.image_params.depth_,
param.hand_search_params.hand_depth_);
node->get_parameter_or("image_height", param.image_params.height_,
param.hand_search_params.hand_height_);
node->get_parameter_or("image_size", param.image_params.size_, 60);
node->get_parameter_or("image_num_channels", param.image_params.num_channels_, 15);
// Read learning parameters.
node->get_parameter_or("remove_plane_before_image_calculation", param.remove_plane_, false);
// Read grasp filtering parameters
node->get_parameter_or("filter_grasps", param.filter_grasps_, false);
node->get_parameter_or("filter_half_antipodal", param.filter_half_antipodal_, false);
param.gripper_width_range_.push_back(0.03);
param.gripper_width_range_.push_back(0.10);
// node->get_parameter("gripper_width_range", param.gripper_width_range_);
// Read clustering parameters
node->get_parameter_or("min_inliers", param.min_inliers_, 1);
// Read grasp selection parameters
node->get_parameter_or("num_selected", param.num_selected_, 5);
}
void ROSParameters::getPlanningParams(
rclcpp::Node * node,
GraspPlanner::GraspPlanningParameters & param)
{
node->get_parameter_or("grasp_service_timeout", param.grasp_service_timeout_, 0);
node->get_parameter_or("grasp_score_threshold", param.grasp_score_threshold_, 200);
node->get_parameter_or("grasp_frame_id", param.grasp_frame_id_, std::string("base"));
std::vector<double> approach;
node->get_parameter_or("grasp_approach", approach,
std::vector<double>(std::initializer_list<double>({0.0, 0.0, -1.0})));
param.grasp_approach_ = tf2::Vector3(approach[0], approach[1], approach[2]);
node->get_parameter_or("grasp_approach_angle", param.grasp_approach_angle_, M_PI);
node->get_parameter_or("grasp_offset", param.grasp_offset_,
std::vector<double>(std::initializer_list<double>({0.0, 0.0, 0.0})));
node->get_parameter_or("grasp_boundry", param.grasp_boundry_,
std::vector<double>(std::initializer_list<double>({-1.0, 1.0, -1.0, 1.0, -1.0, 1.0})));
node->get_parameter_or("eef_offset", param.eef_offset, 0.154);
node->get_parameter_or("eef_yaw_offset", param.eef_yaw_offset, 0.0);
node->get_parameter_or("grasp_min_distance", param.grasp_min_distance_, 0.06);
node->get_parameter_or("grasp_desired_distance", param.grasp_desired_distance_, 0.1);
// gripper parameters
std::vector<double> finger_opens, finger_closes;
node->get_parameter_or("finger_joint_names", param.finger_joint_names_,
std::vector<std::string>(std::initializer_list<std::string>({std::string("panda_finger_joint1"),
std::string("panda_finger_joint2")})));
node->get_parameter_or("finger_positions_open", param.finger_points_open_.positions,
std::vector<double>(std::initializer_list<double>({-0.01, 0.01})));
node->get_parameter_or("finger_positions_close", param.finger_points_close_.positions,
std::vector<double>(std::initializer_list<double>({-0.0, 0.0})));
}
} // namespace grasp_ros2
================================================
FILE: grasp_ros2/tests/CMakeLists.txt
================================================
find_package(ament_cmake REQUIRED)
find_package(ament_cmake_gtest REQUIRED)
find_package(rclcpp REQUIRED)
find_package(grasp_msgs REQUIRED)
find_package(moveit_msgs REQUIRED)
find_package(pcl_conversions REQUIRED)
set(TEST_NAME tgrasp_ros2)
ament_add_gtest(${TEST_NAME} tgrasp_ros2.cpp ${CMAKE_CURRENT_SOURCE_DIR}/../src/consts.cpp)
if(TARGET ${TEST_NAME})
get_filename_component(RESOURCE_DIR "resource" ABSOLUTE)
configure_file(tgrasp_ros2.h.in tgrasp_ros2.h)
include_directories(${CMAKE_CURRENT_BINARY_DIR} ${PCL_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS})
target_include_directories(${TEST_NAME} PUBLIC
${grasp_library_INCLUDE_DIRS}
)
ament_target_dependencies(${TEST_NAME}
pcl_conversions
rclcpp
grasp_msgs
moveit_msgs
sensor_msgs
)
target_link_libraries(${TEST_NAME} ${GTEST_LIBRARIES} ${PCL_LIBRARIES})
# Install binaries
install(TARGETS ${TEST_NAME}
RUNTIME DESTINATION bin
)
install(TARGETS ${TEST_NAME}
DESTINATION lib/${PROJECT_NAME}
)
endif()
================================================
FILE: grasp_ros2/tests/resource/table_top.pcd
================================================
[File too large to display: 13.2 MB]
================================================
FILE: grasp_ros2/tests/tgrasp_ros2.cpp
================================================
// Copyright (c) 2019 Intel Corporation
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <gtest/gtest.h>
#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/point_cloud2.hpp>
#include <grasp_msgs/msg/grasp_config_list.hpp>
#include <moveit_msgs/msg/move_it_error_codes.hpp>
#include <moveit_msgs/srv/grasp_planning.hpp>
#include <pcl_conversions/pcl_conversions.h>
#include <cassert>
#include <memory>
#include <string>
#include <thread>
#include "grasp_library/ros2/consts.hpp"
#include "./tgrasp_ros2.h"
using Consts = grasp_ros2::Consts;
using GraspPlanning = moveit_msgs::srv::GraspPlanning;
static bool received_topic = false;
static int num_grasps = 0;
static bool pcd_stop = false;
static rclcpp::Node::SharedPtr node = nullptr;
static std::shared_ptr<GraspPlanning::Response> result = nullptr;
static void pcd_publisher()
{
char path[512];
snprintf(path, sizeof(path), "%s/table_top.pcd", RESOURCE_DIR);
pcl::PointCloud<pcl::PointXYZRGBA> cloud;
if (0 != pcl::io::loadPCDFile<pcl::PointXYZRGBA>(path, cloud)) {
return;
}
sensor_msgs::msg::PointCloud2 msg;
pcl::toROSMsg(cloud, msg);
msg.header.frame_id = "camera_color_optical_frame";
auto pcd_node = rclcpp::Node::make_shared("PCDPublisher");
auto pcd_pub = pcd_node->create_publisher<sensor_msgs::msg::PointCloud2>(
Consts::kTopicPointCloud2, 10);
rclcpp::Rate loop_rate(30);
while (!pcd_stop && rclcpp::ok()) {
pcd_pub->publish(msg);
loop_rate.sleep();
}
}
static void topic_cb(const grasp_msgs::msg::GraspConfigList::SharedPtr msg)
{
RCLCPP_INFO(node->get_logger(), "Topic received");
received_topic = true;
num_grasps = msg->grasps.size();
}
TEST(GraspLibraryTests, TestGraspService) {
EXPECT_TRUE(result->error_code.val == moveit_msgs::msg::MoveItErrorCodes::SUCCESS);
EXPECT_GT(result->grasps.size(), uint32_t(0));
}
TEST(GraspLibraryTests, TestGraspTopic) {
rclcpp::Rate(1).sleep();
EXPECT_TRUE(received_topic);
EXPECT_GT(num_grasps, 0);
}
int main(int argc, char * argv[])
{
rclcpp::init(argc, argv);
std::thread pcd_thread(pcd_publisher);
pcd_thread.detach();
node = rclcpp::Node::make_shared("GraspLibraryTest");
auto sub = node->create_subscription<grasp_msgs::msg::GraspConfigList>(
Consts::kTopicDetectedGrasps, rclcpp::QoS(rclcpp::KeepLast(1)), topic_cb);
auto client = node->create_client<GraspPlanning>("plan_grasps");
while (!client->wait_for_service(std::chrono::seconds(1))) {
if (!rclcpp::ok()) {
RCLCPP_ERROR(node->get_logger(), "Client interrupted");
return 1;
}
RCLCPP_INFO(node->get_logger(), "Wait for service");
}
auto request = std::make_shared<GraspPlanning::Request>();
auto result_future = client->async_send_request(request);
RCLCPP_INFO(node->get_logger(), "Request sent");
if (rclcpp::spin_until_future_complete(node, result_future) !=
rclcpp::executor::FutureReturnCode::SUCCESS)
{
RCLCPP_ERROR(node->get_logger(), "Request failed");
return 1;
}
result = result_future.get();
RCLCPP_INFO(node->get_logger(), "Response received %d", result->error_code.val);
testing::InitGoogleTest(&argc, argv);
int ret = RUN_ALL_TESTS();
pcd_stop = true;
rclcpp::Rate(3).sleep();
// pcd_thread.join() disabled; it causes the test runner to exit abnormally
node = nullptr;
rclcpp::shutdown();
return ret;
}
================================================
FILE: grasp_ros2/tests/tgrasp_ros2.h.in
================================================
// Copyright (c) 2018 Intel Corporation. All Rights Reserved
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#ifndef GRASP_LIBRARY__TGRASP_LIBRARY_H_
#define GRASP_LIBRARY__TGRASP_LIBRARY_H_
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#define RESOURCE_DIR "@RESOURCE_DIR@"
#endif // GRASP_LIBRARY__TGRASP_LIBRARY_H_
================================================
FILE: grasp_tutorials/CMakeLists.txt
================================================
# Copyright (c) 2019 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
cmake_minimum_required(VERSION 3.5)
project(grasp_tutorials)
find_package(ament_cmake REQUIRED)
ament_package()
================================================
FILE: grasp_tutorials/README.md
================================================
# ROS2 Grasp Library Tutorials
These tutorials aim to introduce how to
* Install, build, and launch the ROS2 Grasp Planner and Detector
* Use launch options to customize in a new workspace
* Bring up the intelligent visual grasp solution on a new robot
* Do hand-eye calibration for a new camera setup
* Launch the example applications
## Build and test the tutorials
```bash
cd ros2_grasp_library/grasp_tutorials
sphinx-build . build # check the outputs in the ./build/ folder
cd ../grasp_utils/robot_interface
doxygen Doxyfile # check the outputs in the ./build/ folder
```
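The build above assumes the Sphinx toolchain with the Read-the-Docs theme (conf.py sets `html_theme = 'sphinx_rtd_theme'`) plus Doxygen for the robot_interface API docs. A minimal setup sketch — package names are assumptions, adjust for your distro:

```shell
# Documentation toolchain (assumed package names)
pip install sphinx sphinx_rtd_theme   # Sphinx plus the theme referenced in conf.py
sudo apt-get install -y doxygen       # generates the robot_interface C++ API docs
```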
================================================
FILE: grasp_tutorials/conf.py
================================================
# -*- coding: utf-8 -*-
#
# app_tutorials documentation build configuration file, created by
# sphinx-quickstart on Thu Oct 18 17:31:36 2018.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.
import sys
import os
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#sys.path.insert(0, os.path.abspath('.'))
# -- General configuration ------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
#needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
'sphinx.ext.autodoc',
'sphinx.ext.doctest',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.mathjax',
'sphinx.ext.ifconfig',
'sphinx.ext.viewcode',
]
# Add any paths that contain templates here, relative to this directory.
# templates_path = []
# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
# source_suffix = ['.rst', '.md']
source_suffix = '.rst'
# The encoding of source files.
#source_encoding = 'utf-8-sig'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'ROS2 Grasp Library Tutorials'
copyright = u'2019, Intel Corporation'
author = u'sharron, liu; yu, yan'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = u'0.5.0'
# The full version, including alpha/beta/rc tags.
release = u'0.5.0'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None
# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
#today = ''
# Else, today_fmt is used as the format for a strftime call.
#today_fmt = '%B %d, %Y'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# exclude_patterns = ['doc/ur5_setup_with_moveit.rst', 'doc/franka_setup_with_moveit.rst']
# The reST default role (used for this markup: `text`) to use for all
# documents.
#default_role = None
# If true, '()' will be appended to :func: etc. cross-reference text.
#add_function_parentheses = True
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
#add_module_names = True
# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
#show_authors = False
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# A list of ignored prefixes for module index sorting.
#modindex_common_prefix = []
# If true, keep warnings as "system message" paragraphs in the built documents.
#keep_warnings = False
# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = True
# -- Options for HTML output ----------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = 'sphinx_rtd_theme'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#html_theme_options = {}
# Add any paths that contain custom themes here, relative to this directory.
#html_theme_path = []
# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
#html_title = None
# A shorter title for the navigation bar. Default is the same as html_title.
#html_short_title = None
# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
#html_logo = None
# The name of an image file (relative to this directory) to use as a favicon of
# the docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
#html_favicon = None
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
#html_extra_path = []
# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
#html_last_updated_fmt = '%b %d, %Y'
# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#html_use_smartypants = True
# Custom sidebar templates, maps document names to template names.
#html_sidebars = {}
# Additional templates that should be rendered to pages, maps page names to
# template names.
#html_additional_pages = {}
# If false, no module index is generated.
#html_domain_indices = True
# If false, no index is generated.
#html_use_index = True
# If true, the index is split into individual pages for each letter.
#html_split_index = False
# If true, links to the reST sources are added to the pages.
#html_show_sourcelink = True
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
#html_show_sphinx = True
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
#html_show_copyright = True
# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
#html_use_opensearch = ''
# This is the file name suffix for HTML files (e.g. ".xhtml").
#html_file_suffix = None
# Language to be used for generating the HTML full-text search index.
# Sphinx supports the following languages:
# 'da', 'de', 'en', 'es', 'fi', 'fr', 'hu', 'it', 'ja'
# 'nl', 'no', 'pt', 'ro', 'ru', 'sv', 'tr'
#html_search_language = 'en'
# A dictionary with options for the search language support, empty by default.
# Now only 'ja' uses this config value
#html_search_options = {'type': 'default'}
# The name of a javascript file (relative to the configuration directory) that
# implements a search results scorer. If empty, the default will be used.
#html_search_scorer = 'scorer.js'
# Output file base name for HTML help builder.
htmlhelp_basename = 'grasp_tutorialsdoc'
# -- Options for LaTeX output ---------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#'preamble': '',
# Latex figure (float) alignment
#'figure_align': 'htbp',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, 'grasp_tutorials.tex', u'grasp\\_tutorials Documentation',
u'Intel Corporation', 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
# the title page.
#latex_logo = None
# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#latex_use_parts = False
# If true, show page references after internal links.
#latex_show_pagerefs = False
# If true, show URL addresses after external links.
#latex_show_urls = False
# Documents to append as an appendix to all manuals.
#latex_appendices = []
# If false, no module index is generated.
#latex_domain_indices = True
# -- Options for manual page output ---------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(master_doc, 'grasp_tutorials', u'grasp_tutorials Documentation',
[author], 1)
]
# If true, show URL addresses after external links.
#man_show_urls = False
# -- Options for Texinfo output -------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
(master_doc, 'grasp_tutorials', u'grasp_tutorials Documentation',
author, 'grasp_tutorials', 'One line description of project.',
'Miscellaneous'),
]
# Documents to append as an appendix to all manuals.
#texinfo_appendices = []
# If false, no module index is generated.
#texinfo_domain_indices = True
# How to display URL addresses: 'footnote', 'no', or 'inline'.
#texinfo_show_urls = 'footnote'
# If true, do not generate a @detailmenu in the "Top" node's menu.
#texinfo_no_detailmenu = False
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {'https://docs.python.org/': None}
================================================
FILE: grasp_tutorials/doc/bringup_robot.rst
================================================
Bring up a New Robot
====================
This tutorial explains what is expected when bringing up the ROS2 Grasp Library on a new robot.
.. _GraspPlanning: http://docs.ros.org/api/moveit_msgs/html/srv/GraspPlanning.html
.. _GPD: https://github.com/atenpas/gpd
.. _OpenVINO™: https://software.intel.com/en-us/openvino-toolkit
.. _Grasp: http://docs.ros.org/api/moveit_msgs/html/msg/Grasp.html
Minimum APIs to Implement
-------------------------
- `moveToTcpPose`
  - Move the TCP (tool center point, usually the end effector of the robot arm, not the hand) to a pose specified by position [x, y, z] and orientation [alpha, beta, gamma] (also called [roll, pitch, yaw]). This function returns when the robot has `moved` to the specified pose.
- `moveToJointValues`
  - Move each joint of the robot to the specified values (usually angles). This function differs from `moveToTcpPose`, since the same TCP pose may be reached with various joint-value solutions. It is used when the application expects the robot's joints to be in a specific state suitable for a subsequent pick or place action. This function returns when the robot has `moved` to the specified joint values.
- `open`
- Open the gripper.
- `close`
- Close the gripper.
- `startLoop`
  - Start a loop to read and publish the robot state. Robot states are subscribed to by RViz for visualization.
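The minimum APIs above can be sketched as a C++ abstract class. This is an illustrative sketch only (the names follow the list above, but the signatures are assumptions for this example); refer to the `Robot Interface API <../api/html/index.html>`_ for the authoritative definitions::

    #include <array>
    #include <cassert>

    // Illustrative sketch of the minimum Robot Interface (names follow the
    // list above; signatures are assumptions for this example).
    class RobotInterface
    {
    public:
      virtual ~RobotInterface() = default;
      // Move the TCP to position [x, y, z] with orientation [roll, pitch, yaw].
      virtual bool moveToTcpPose(double x, double y, double z,
        double roll, double pitch, double yaw) = 0;
      // Move each joint to the given value (angles in radians).
      virtual bool moveToJointValues(const std::array<double, 6> & joints) = 0;
      virtual bool open() = 0;       // open the gripper
      virtual bool close() = 0;      // close the gripper
      virtual void startLoop() = 0;  // publish robot states, e.g. for RViz
    };

    // A stub implementation, useful for dry-run testing without hardware.
    class DummyRobot : public RobotInterface
    {
    public:
      bool moveToTcpPose(double, double, double, double, double, double) override {return true;}
      bool moveToJointValues(const std::array<double, 6> &) override {return true;}
      bool open() override {return true;}
      bool close() override {return true;}
      void startLoop() override {}
    };

    int main()
    {
      DummyRobot robot;
      assert(robot.moveToTcpPose(0.4, 0.0, 0.3, 0.0, 3.14, 0.0));
      assert(robot.open() && robot.close());
      return 0;
    }

A stub implementation like `DummyRobot` also makes it easy to exercise the control applications before real hardware is attached.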
Optional implementation and possible extensions
-----------------------------------------------
- Optionally you may implement the `pick` and `place` interfaces to customize the pick-and-place pipeline, or even plug in collision-avoidance motion planning.
- A Python extension is not supported yet. It's possible to implement the Robot Interface in Python and bind it to C++.
Refer to `Robot Interface API <../api/html/index.html>`_ for more detailed definition.
Example UR5 Implementation
--------------------------
Refer to the UR5 `example <https://github.com/intel/ros2_grasp_library/tree/master/grasp_utils/robot_interface/src>`_ implementation of the Robot Interface.
Test Your Implementation
------------------------
It's important to test your implementation before integrating this part with other components in ROS2 Grasp Library.
Refer to the `UR5 tests <https://github.com/intel/ros2_grasp_library/blob/master/grasp_utils/robot_interface/test/ur_test_move_command.cpp>`_ and adapt them to your robot.
Bring up Robot Control Applications
-----------------------------------
Once testing is finished, you may start to bring up the `Draw X <draw_x.html>`_ app or the `fixed position pick and place <fixed_position_pick.html>`_ app on your new robot. These applications do not require a camera; they control the robot only.
================================================
FILE: grasp_tutorials/doc/draw_x.rst
================================================
Draw X
======
Overview
--------------
This demo shows how to use the robot interface to draw the letter ``X``
at fixed positions with a UR5 robot arm.
Requirements
------------
Before running the code, make sure you have
followed the instructions below to set up the robot correctly.
- Hardware
- Host running ROS2
- `UR5`_
- Software
- `ROS2 Dashing`_ Desktop
- `robot_interface`_
.. _UR5: https://www.universal-robots.com/products/ur5-robot
.. _ROS2 Dashing: https://index.ros.org/doc/ros2/Installation/Dashing/Linux-Install-Debians/
.. _robot_interface: https://github.com/intel/ros2_grasp_library/tree/master/grasp_utils/robot_interface
Download and Build the Example Code
------------------------------------
Within your ROS2 workspace, download and compile the example code:
::
cd <path_of_your_ros2_workspace>/src
git clone https://github.com/intel/ros2_grasp_library.git
cd ..
colcon build --base-paths src/ros2_grasp_library/grasp_apps/draw_x
Launch the Application
----------------------
- Launch the application
::
ros2 launch draw_x draw_x.launch.py
.. note:: Please keep the emergency stop button on the teach pendant in your hand,
   in case of any accident.
- Expected Outputs:
1. The robot moves its arm to the home pose
2. The robot moves its arm to the pose above the first corner of X
3. The robot moves its arm down to the first corner of X
4. The robot moves its arm to the second corner of X
5. The robot moves its arm up to the pose above the second corner of X
6. The robot moves its arm to the pose above the third corner of X
7. The robot moves its arm down to the third corner of X
8. The robot moves its arm to the fourth corner of X
9. The robot moves its arm up to the pose above the fourth corner of X
10. The robot moves its arm to the home pose again
================================================
FILE: grasp_tutorials/doc/fixed_position_pick.rst
================================================
Fixed Position Pick
====================
Overview
--------------
This demo shows how to use the robot interface to pick and place an
object at a predefined location with a UR5 robot arm.
Requirements
------------
Before running the code, make sure you have
followed the instructions below to set up the robot correctly.
- Hardware
- Host running ROS2
- `UR5`_
- `Robot Gripper`_
- Software
- `ROS2 Dashing`_ Desktop
- `robot_interface`_
.. _UR5: https://www.universal-robots.com/products/ur5-robot
.. _ROS2 Dashing: https://index.ros.org/doc/ros2/Installation/Dashing/Linux-Install-Debians/
.. _robot_interface: https://github.com/intel/ros2_grasp_library/tree/master/grasp_utils/robot_interface
.. _Robot Gripper: https://www.universal-robots.com/plus/end-effectors/hitbot-electric-gripper
Download and Build the Example Code
------------------------------------
Within your ROS2 workspace, download and compile the example code:
::
cd <path_of_your_ros2_workspace>/src
git clone https://github.com/intel/ros2_grasp_library.git
cd ..
colcon build --base-paths src/ros2_grasp_library/grasp_apps/fixed_position_pick
Launch the Application
----------------------
- Launch the application
::
ros2 launch fixed_position_pick fixed_position_pick.launch.py
.. note:: Please keep the emergency stop button on the teach pendant in your hand,
   in case of any accident.
- Expected Outputs:
1. The robot moves to the home pose
2. The robot picks up an object from the predefined location
3. The robot places the object to another location
4. The robot moves back to the home pose
================================================
FILE: grasp_tutorials/doc/getting_start.rst
================================================
Getting Started
===============
This tutorial introduces how to get started with the ROS2 Grasp Library.
.. _GraspPlanning: http://docs.ros.org/api/moveit_msgs/html/srv/GraspPlanning.html
.. _GPD: https://github.com/atenpas/gpd
.. _OpenVINO™: https://software.intel.com/en-us/openvino-toolkit
.. _Grasp: http://docs.ros.org/api/moveit_msgs/html/msg/Grasp.html
ROS2 Grasp Planner and Detector
-------------------------------
In this section, you will start with an RGBD sensor connected to an Ubuntu host machine.
The grasp detection relies on the OpenVINO™ toolkit. Follow this `grasp_planner <grasp_planner.html>`_ instruction to install the toolkit, then build and install the ROS2 Grasp Planner and Detector with your camera.
After launching the grasp planner, you will see grasp detection results highlighted as blue markers in RViz.
.. image:: ./grasp_ros2/img/ros2_grasp_library.png
:width: 643 px
:height: 435 px
:align: center
Use Launch Options for Customization
------------------------------------
ROS2 parameters are supported to customize the Grasp Detector and Grasp Planner for your local workspace: for example, the point cloud topic name from the RGBD sensor, the camera workspace (in the frame_id of the point cloud image), the grasp approach direction and angle, and the grasp boundary (in the frame_id of the robot base).
Robot Interface
---------------
In this section, you will bring up your robot by implementing the Robot Interface. Currently the Robot Interface is defined in C++; a Python version is still a work in progress.
The Robot Interface is the minimum set of APIs a robot should provide to enable this solution. Follow this `robot_interface <bringup_robot.html>`_ instruction to implement the required `move`, `open`, `close`, `startLoop` interfaces.
Then make sure your implementation passes the Robot Interface tests, to guarantee later integration with the example applications. You may also try the "Robot Control Applications" (like Draw X, fixed position pick and place) to verify your implementation works well.
Hand-eye Calibration
--------------------
Now start to generate the transformation between the camera and the robot. Follow this `handeye_calibration <handeye_calibration.html>`_ instruction to finish the hand-eye calibration procedure. The calibration procedure needs to be done when the camera is set up. The resulting transformation is remembered in your local environment for later publishing when launching the applications.
Launch Intelligent Visual Grasp Applications
--------------------------------------------
At this point, you may start launching the applications.
`Random Picking <random_pick.html>`_ runs OpenVINO grasp detection on GPU and sends requests to the ROS2 MoveIt Grasp Planner for grasp planning and detection. The most likely successful grasps are returned by Grasp Pose Detection from CNN inference, taking 3D point cloud inputs from the camera. The picking order is not pre-defined, hence the name random picking.
`Recognition Picking <recognize_pick.html>`_ runs OpenVINO grasp detection on GPU, and OpenVINO object segmentation on CPU or Movidius VPU. The masks of recognized objects are returned from the `mask_rcnn` model. The `place_publisher` publishes the name of the object to pick and the position to place it, hence the name recognition picking.
================================================
FILE: grasp_tutorials/doc/grasp_api.rst
================================================
ROS2 Grasp Library APIs
=======================
.. _GraspPlanning: http://docs.ros.org/api/moveit_msgs/html/srv/GraspPlanning.html
.. _GPD: https://github.com/atenpas/gpd
.. _OpenVINO™: https://software.intel.com/en-us/openvino-toolkit
.. _Grasp: http://docs.ros.org/api/moveit_msgs/html/msg/Grasp.html
.. _PointCloud2: https://github.com/ros2/common_interfaces/blob/master/sensor_msgs/msg/PointCloud2.msg
.. _ObjectsInMasks: https://github.com/intel/ros2_openvino_toolkit/blob/master/people_msgs/msg/ObjectsInMasks.msg
.. _Image: https://github.com/ros2/common_interfaces/blob/master/sensor_msgs/msg/Image.msg
.. _TransformStamped: https://github.com/ros2/common_interfaces/blob/master/geometry_msgs/msg/TransformStamped.msg
Grasp Planning ROS2 Interfaces
------------------------------
- Subscribed Topics
- PointCloud2 topic from RGBD sensor (sensor_msgs::msg::`PointCloud2`_)
- Segmented object topic (people_msgs::msg::`ObjectsInMasks`_)
- Delivered Services
- plan_grasps (moveit_msgs::srv::`GraspPlanning`_)
Hand-Eye Calibration ROS2 Interfaces
------------------------------------
- Subscribed Topics
- RGB image from sensor (sensor_msgs::msg::`Image`_)
- Broadcasted Transforms
  - Static transform between camera and robot (geometry_msgs::msg::`TransformStamped`_)
Robot Interface API
-------------------
- `API <../api/html/index.html>`_
================================================
FILE: grasp_tutorials/doc/grasp_planner.rst
================================================
Grasp Planner
=============
Tutorials
---------
- `Install OpenVINO™ toolkit`_
.. _Install OpenVINO™ toolkit: https://github.com/intel/ros2_grasp_library/tree/master/grasp_tutorials/doc/grasp_ros2/install_openvino.md
- `Launch ROS2 Grasp Planner and Detector`_
.. _Launch ROS2 Grasp Planner and Detector: https://github.com/intel/ros2_grasp_library/tree/master/grasp_tutorials/doc/grasp_ros2/tutorials_1_grasp_ros2_with_camera.md
- `Launch tests`_
.. _Launch tests: https://github.com/intel/ros2_grasp_library/tree/master/grasp_tutorials/doc/grasp_ros2/tutorials_2_grasp_ros2_test.md
- `Use launch options`_
.. _Use launch options: https://github.com/intel/ros2_grasp_library/tree/master/grasp_tutorials/doc/grasp_ros2/tutorials_3_grasp_ros2_launch_options.md
================================================
FILE: grasp_tutorials/doc/grasp_ros2/install_gpd.md
================================================
Installation guide for Grasp Pose Detection
### Install [GPG](https://github.com/atenpas/gpg)
1. Get the code
```bash
git clone https://github.com/atenpas/gpg.git
cd gpg
```
2. Build the library
```bash
mkdir build && cd build
cmake ..
make
sudo make install
# by default, "libgrasp_candidates_generator.so" shall be installed to "/usr/local/lib"
```
### Install [GPD](https://github.com/sharronliu/gpd)
1. Get the code, originally derived from [GPD](https://github.com/atenpas/gpd) tag 1.5.0
```bash
git clone https://github.com/sharronliu/gpd.git
cd gpd
git checkout libgpd
cd src/gpd
```
2. Build the library
```bash
mkdir build && cd build
cmake -DUSE_OPENVINO=ON ..
make
sudo make install
# by default, "libgrasp_pose_detection.so" shall be installed to "/usr/local/lib"
# and header files installed to "/usr/local/include/gpd"
```
================================================
FILE: grasp_tutorials/doc/grasp_ros2/install_openvino.md
================================================
# Intel® DLDT toolkit and Intel® OpenVINO™ toolkit
This tutorial introduces the DLDT toolkit and OpenVINO toolkit.
Intel® [DLDT](https://github.com/opencv/dldt) is a Deep Learning Deployment Toolkit common to all architectures. The toolkit allows developers to convert pre-trained deep learning models into optimized Intermediate Representation (IR) models, then deploy the IR models through a high-level C++ Inference Engine API integrated with application logic. Additionally, [Open Model Zoo](https://github.com/opencv/open_model_zoo) provides more than 100 pre-trained optimized deep learning models and a set of demos to expedite development of high-performance deep learning inference applications. Online tutorials are available for
* [Inference Engine Build Instructions](https://github.com/opencv/dldt/blob/2019/inference-engine/README.md)
Intel® [OpenVINO™](https://software.intel.com/en-us/openvino-toolkit) (Open Visual Inference & Neural Network Optimization) toolkit enables CNN-based deep learning inference at the edge, extends workloads across Intel® hardware (including accelerators), and maximizes performance. The toolkit supports heterogeneous execution across various computer vision devices -- CPU, GPU, Intel® Movidius™ NCS, and FPGA -- using a common API. Online tutorials are available for
* [Model Optimize Developer Guide](https://software.intel.com/en-us/articles/OpenVINO-ModelOptimizer)
* [Inference Engine Developer Guide](https://software.intel.com/en-us/articles/OpenVINO-InferEngine)
* [Intel® Neural Compute Stick 2](https://software.intel.com/en-us/neural-compute-stick/get-started)
## Install DLDT and OpenVINO
It's recommended to refer to the online documents of the toolkits for the latest installation instructions. Below are the detailed steps we verified with Ubuntu 18.04 on an Intel NUC6i7KYK, for your reference.
1. Build and install Inference Engine
```bash
git clone https://github.com/opencv/dldt.git
cd dldt
git checkout 2019_R3
# follow the instructions below to install all dependencies, including mklml, opencl, etc.
# https://github.com/opencv/dldt/blob/2019_R3/inference-engine/README.md#build-on-linux-systems
# build
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=/usr/local -DGEMM=MKL -DMKLROOT=/usr/local/lib/mklml -DENABLE_MKL_DNN=ON -DENABLE_CLDNN=ON ..
make -j8
sudo make install
```
2. Share the CMake configuration files so the Inference Engine can be found by other packages
```bash
sudo mkdir /usr/share/InferenceEngine
sudo cp InferenceEngineConfig*.cmake /usr/share/InferenceEngine
sudo cp targets.cmake /usr/share/InferenceEngine
```
Then the Inference Engine will be found when "find_package(InferenceEngine)" is added to a CMakeLists.txt.
3. Configure library path for dynamic loading
```bash
echo `pwd`/../bin/intel64/Release/lib | sudo tee -a /etc/ld.so.conf.d/openvino.conf
sudo ldconfig
```
4. Optionally install plug-ins for InferenceEngine deployment on heterogeneous devices
   * Install the [plug-in](https://github.com/opencv/dldt/blob/2019_R3/inference-engine/README.md#optional-additional-installation-steps-for-the-intel-movidius-neural-compute-stick-and-neural-compute-stick-2) for deployment on the Intel Movidius Neural Compute Stick and Neural Compute Stick 2 (Myriad X).
================================================
FILE: grasp_tutorials/doc/grasp_ros2/tutorials_1_grasp_ros2_with_camera.md
================================================
# OpenVINO Grasp Library with RGBD Camera
This tutorial introduces the OpenVINO environment setup, and how to build and launch the Grasp Library with an RGBD camera.
## Requirements
### Hardware
* Host running ROS2/ROS
* RGBD sensor
### Software
We verified the software with Ubuntu 18.04 Bionic and ROS2 Dashing release. Verification with ROS2 MoveIt is still work in progress. Before this, we have verified the grasp detection with MoveIt Melodic branch (tag 0.10.8) and our visual pick & place application to be shared as [MoveIt Example Apps](https://github.com/ros-planning/moveit_example_apps).
* Install ROS2 packages
[ros-dashing-desktop](https://index.ros.org/doc/ros2/Installation/Dashing/Linux-Install-Debians)
* Install non ROS packages
```bash
sudo apt-get install libpcl-dev libeigen3-dev
```
* Install [Intel OpenVINO Toolkit](install_openvino.md)
* Install [GPD](install_gpd.md)
## Build Grasp Library
```bash
# get the source codes
mkdir ~/ros2_ws/src -p
cd ~/ros2_ws/src
git clone https://github.com/intel/ros2_grasp_library.git
# copy GPD models
cp -a <path-to-gpd-repo>/models ros2_grasp_library/gpd
# build
cd ..
source /opt/ros/dashing/setup.bash
colcon build --symlink-install --packages-select grasp_msgs moveit_msgs grasp_ros2
source ./install/local_setup.bash
```
## Launch Grasp Library
```bash
# Terminal 1, Optionally, launch Rviz2 to illustrate detection results.
ros2 run rviz2 rviz2 -d src/ros2_grasp_library/grasp_ros2/rviz2/grasp.rviz
# Note You may customize the ".rviz" file for your own camera, for example:
# To change to fixed frame: "Global Options -> Fixed Frame"
# To change the point cloud topic: "Point Cloud 2 -> Topic"
# Terminal 2, launch RGBD camera
# e.g. launch [ROS2 Realsense](https://github.com/intel/ros2_intel_realsense/tree/refactor)
# or, with a ros-bridge, launch any ROS OpenNI RGBD cameras, like [ROS Realsense](https://github.com/intel-ros/realsense)
ros2 run realsense_node realsense_node
# Terminal 3, launch Grasp Library
ros2 run grasp_ros2 grasp_ros2 __params:=src/ros2_grasp_library/grasp_ros2/cfg/grasp_ros2_params.yaml
```
================================================
FILE: grasp_tutorials/doc/grasp_ros2/tutorials_2_grasp_ros2_test.md
================================================
# Grasp Library Tests and Examples
This tutorial documents Grasp Library tests which also serve as example codes for the usage of Grasp Library.
## Grasp Library Tests
Test Suites enabled:
* ROS2 built-in tests for static code scanning, such as copyright, cppcheck, cpplint, lint_cmake, uncrustify, and xmllint tests.
* Grasp ROS2 basic functional tests (tgrasp_ros2), covering the ROS2 topic and ROS2 service of the Grasp Library.
Before testing, make sure you have set up the environment to build the Grasp Library, following the tutorial [Grasp Library with RGBD Camera](tutorials_1_grasp_ros2_with_camera.md). The tests take inputs from a pre-stored point cloud file (.pcd), so it's unnecessary to launch an RGBD camera.
```bash
# Terminal 1, launch Grasp Library
ros2 run grasp_ros2 grasp_ros2 __params:=src/ros2_grasp_library/grasp_ros2/cfg/test_grasp_ros2.yaml
# Terminal 2, run tests
colcon test --packages-select grasp_msgs grasp_ros2
```
For failed cases check detailed logs at "log/latest_test/grasp_ros2/stdout.log".
## Grasp Library Examples
The [grasp test codes](../grasp_ros2/tests/tgrasp_ros2.cpp) also demonstrate how to use this Grasp Library for grasp detection and grasp planning.
### Grasp Detection Example (Non-MoveIt App)
This example creates a ROS2 subscription to the "Detected Grasps" topic and gets the detection results from a callback. The Grasp Library is expected to work with 'auto_mode=true' (sensor-driven grasp detection); see example launch options [here](../grasp_ros2/cfg/grasp_ros2_params.yaml).
```cpp
#include <rclcpp/rclcpp.hpp>
#include <grasp_msgs/msg/grasp_config_list.hpp>
#include "grasp_library/ros2/consts.hpp"
static rclcpp::Node::SharedPtr node = nullptr;
static void topic_cb(const grasp_msgs::msg::GraspConfigList::SharedPtr msg)
{
RCLCPP_INFO(node->get_logger(), "Grasp Callback Received");
}
int main(int argc, char * argv[])
{
// init ROS2
rclcpp::init(argc, argv);
// create ROS2 node
node = rclcpp::Node::make_shared("GraspDetectionExample");
// subscribe to the "Detected Grasps" topic
auto sub = node->create_subscription<grasp_msgs::msg::GraspConfigList>
(Consts::kTopicDetectedGrasps, rclcpp::QoS(rclcpp::KeepLast(1)), topic_cb);
// create ROS2 executor to process any pending in/out messages
rclcpp::spin(node);
node = nullptr;
rclcpp::shutdown();
return 0;
}
```
### Grasp Planning Example (MoveIt App)
This example creates a ROS2 client for the "plan_grasps" service and gets the planning results from an async service response. The Grasp Library is expected to work with 'auto_mode=false' (service-driven grasp detection); see an example of the launch options [here](../grasp_ros2/cfg/test_grasp_ros2.yaml).
```cpp
#include <rclcpp/rclcpp.hpp>
#include <moveit_msgs/msg/move_it_error_codes.hpp>
#include <moveit_msgs/srv/grasp_planning.hpp>
#include "grasp_library/ros2/consts.hpp"
using GraspPlanning = moveit_msgs::srv::GraspPlanning;
static rclcpp::Node::SharedPtr node = nullptr;
static std::shared_ptr<GraspPlanning::Response> result = nullptr;
int main(int argc, char * argv[])
{
// init ROS2
rclcpp::init(argc, argv);
// create ROS2 node
node = rclcpp::Node::make_shared("GraspPlanningExample");
// create ROS2 client for MoveIt "plan_grasps" service
auto client = node->create_client<GraspPlanning>("plan_grasps");
// wait for ROS2 service ready
while (!client->wait_for_service(std::chrono::seconds(1))) {
if (!rclcpp::ok()) {
RCLCPP_ERROR(node->get_logger(), "Client interrupted");
return 1;
}
RCLCPP_INFO(node->get_logger(), "Wait for service");
}
// fill in a request
auto request = std::make_shared<GraspPlanning::Request>();
// send async request
auto result_future = client->async_send_request(request);
RCLCPP_INFO(node->get_logger(), "Request sent");
// wait for response
if (rclcpp::spin_until_future_complete(node, result_future) !=
rclcpp::executor::FutureReturnCode::SUCCESS)
{
RCLCPP_ERROR(node->get_logger(), "Request failed");
return 1;
}
// get grasp planning results from response
result = result_future.get();
RCLCPP_INFO(node->get_logger(), "Response received %d", result->error_code.val);
node = nullptr;
rclcpp::shutdown();
return 0;
}
```
================================================
FILE: grasp_tutorials/doc/grasp_ros2/tutorials_3_grasp_ros2_launch_options.md
================================================
# Grasp Library Launch Options and Customization Notes
This tutorial documents the launch options which are used for customization. Each option will be introduced in the following format:
* **option_name** [**default_value**|other_values]: Description of this option. Customization Notes.
## GraspDetectorGPD Launch Options
* **cloud_topic** [**"/camera/depth_registered/points"**|"string"]: Name of point cloud topic as input to the grasp detection, default value compliant with an RGBD OpenNI camera.
* **device** [**0**|1|2|3]: Device on which grasp pose inference executes: 0 for CPU, 1 for GPU, 2 for VPU, 3 for FPGA. In case the OpenVINO plug-ins are installed ([tutorial](install_openvino.md)), this option deploys the CNN-based deep learning inference onto the target device. Deploying the inference onto **GPU or VPU** frees CPU load for other computation tasks.
* **auto_mode** [false|**true**]: Grasp detection mode. When auto_mode is true, the Grasp Library works in sensor-driven mode, processing grasp detection whenever a point cloud message arrives. When auto_mode is false, the Grasp Library works in service-driven mode, processing grasp detection when a service request arrives. Configuring **service-driven** mode saves most of the CPU load compared with sensor-driven mode.
* **plane_remove** [**false**|true]: Whether or not to remove planes (like the table plane) from the point cloud input. Enabling this helps to avoid generating grasp poses across the table.
* **workspace** [**[-1.0, 1.0, -1.0, 1.0, -1.0, 1.0]**|[1*6 double]]: Boundary cube in the camera frame for grasp generation and detection. *This needs to be customized according to the user's setup.*
* **finger_width** [**0.005**|double]: The finger thickness in metres. *This needs to be customized according to the user's robot hand.*
* **hand_outer_diameter** [**0.12**|double]: The maximum robot hand aperture in metres. *This needs to be customized according to the user's robot hand.*
* **hand_depth** [**0.06**|double]: The hand depth (the finger length) in metres. Tuning this parameter affects the "GraspConfig::bottom" field (the hand base) in the grasp detection results. *This needs to be customized according to the user's robot hand.*
* **hand_height** [**0.02**|double]: The finger breadth in metres. *This needs to be customized according to the user's robot hand.*
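Put together, the GraspDetectorGPD options above could appear in a parameters YAML file as follows. This is a sketch using the default values listed above; the node name and file layout here are illustrative, so see [grasp_ros2_params.yaml](../grasp_ros2/cfg/grasp_ros2_params.yaml) in the repository for the actual file:

```yaml
# Illustrative GraspDetectorGPD parameters (defaults from the list above).
grasp_detector:
  ros__parameters:
    cloud_topic: "/camera/depth_registered/points"
    device: 0            # 0: CPU, 1: GPU, 2: VPU, 3: FPGA
    auto_mode: true      # sensor-driven; set false for service-driven
    plane_remove: false
    workspace: [-1.0, 1.0, -1.0, 1.0, -1.0, 1.0]
    finger_width: 0.005
    hand_outer_diameter: 0.12
    hand_depth: 0.06
    hand_height: 0.02
```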
## GraspPlanner Launch Options
* **grasp_service_timeout** [**5**|double]: Timeout in seconds for a service request waiting for grasp detection result. Grasp Planner will not take point could inputs from the history buffer. Indeed after receiving a service request, Grasp Planner will start grasp detection on the coming point cloud input. This parameter configures the timeout period for Grasp Planner to wait for the grasp detection result. Usually this's an amount of max latencies in RGBD sensor, Grasp Detector, Grasp Planner, any other nodes in the pipeline, additionally with an estimated worst delay in the system.
* **grasp_score_threshold** [**200**|integer]: Minimum score expected for grasps returned from this service.
* **grasp_frame_id** [**"base"**|"string"]: Frame id expected for grasps returned from this service. When this parameter is specified, Grasp Planner try to transform the grasp from the original frame (usually a camera's color frame) to this target frame, given the TF available.
* **grasp_approach** [**[0.0, 0.0, -1.0]**|[1*3 double]]: Specify expected approach direction in the target frame specified by 'grasp_frame_id'. Grasp Planner will return grasp poses with approach direction approximate to this parameter. This is useful when a MoveIt application wants to constraint the approaching direction. *This need to be customized according to user's setup.*
* **grasp_approach_angle** [**M_PI**|3.14|1.57|double]: Maximum angle in radian acceptable between the expected 'grasp_approach' and the real approach returned from this service. Default is [-M_PI, M_PI], which implies any approach directions are acceptable. *This need to be customized according to user's setup.*
* **grasp_offset** [**[0.0, 0.0, 0.0]**|[1*3 double]]: Offset [x, y, z] in metres applied to the grasps detected. This offset allows adjustment over the final grasp position, to overcome erros that might be accumulated from camera calibration, hand-eye calibration, grasp pose detection, etc. *This need to be customized according to user's setup.*
* **grasp_boundry** [**[-1.0, 1.0, -1.0, 1.0, -1.0, 1,0]**|[1*6 double]]: Boundry cube in grasp_frame_id expected for grasps returned from this service. This parameter takes effect only after transformation into the target frame specified by "grasp_frame_id". When the transformation is unavailalbe, boundry checking will be skipped, and in such case the "GraspDetectorGPD::workspace" parameter still takes effect. *This need to be customized according to user's setup.*
* **eef_offset** [**0.16**|double]: Offset in metres from the gripper base (finger root) to the parent link of the gripper. The parent link is usually the end of the robot arm. *This needs to be customized according to the gripper geometry.*
* **eef_yaw_offset** [**0.0**|double]: Gripper yaw offset to its parent link, in radians. *This needs to be customized if the gripper has a yaw offset to the TCP (tool center point) of the robot arm.*
* **finger_joint_names** [**["panda_finger_joint1", "panda_finger_joint2"]**|[1*2 "string"]]: Joint names of the gripper fingers. The joint names are filled into MoveIt's grasp interface, to control the posture of the hand for 'pre_grasp_posture' and 'grasp_posture' (see [moveit_msgs::msg::Grasp](http://docs.ros.org/api/moveit_msgs/html/msg/Grasp.html)). Joint names are usually defined in the URDF of the robot hand. *This needs to be customized according to the user's setup.*
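The 'grasp_approach'/'grasp_approach_angle' and 'grasp_boundry' parameters amount to an angle test plus a cube test on each detected grasp. The sketch below is an illustrative Python reimplementation of that filtering logic, not the library's C++ code; the function and argument names are hypothetical:

```python
import math

def acceptable_grasp(approach, position,
                     expected_approach=(0.0, 0.0, -1.0),
                     max_angle=math.pi,
                     boundary=(-1.0, 1.0, -1.0, 1.0, -1.0, 1.0)):
    """Check a grasp (already transformed into 'grasp_frame_id') against
    the 'grasp_approach'/'grasp_approach_angle' and 'grasp_boundry'
    parameters. Illustrative only; not the library's implementation."""
    # Angle between the expected and actual approach directions.
    dot = sum(a * b for a, b in zip(approach, expected_approach))
    norm = math.sqrt(sum(a * a for a in approach)) * \
        math.sqrt(sum(b * b for b in expected_approach))
    angle = math.acos(max(-1.0, min(1.0, dot / norm)))
    if angle > max_angle:
        return False
    # Boundary cube: [x_min, x_max, y_min, y_max, z_min, z_max].
    x, y, z = position
    xmin, xmax, ymin, ymax, zmin, zmax = boundary
    return xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax
```

With the defaults above, a grasp approaching straight down inside the unit cube passes, while one approaching upward against a tight 'grasp_approach_angle', or one positioned outside 'grasp_boundry', is rejected.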
For other ROS parameters not mentioned here, refer to the code in [ros_params.cpp](../grasp_ros2/src/ros_params.cpp) for details.
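Put together, a parameter file for the planner might look like the sketch below. The node name and the values shown are assumptions for illustration; the actual configurations shipped with the library live under `grasp_ros2/cfg/` (e.g. `grasp_ros2_params.yaml`):

```yaml
# Illustrative only -- node name and values are assumptions.
GraspPlanner:
  ros__parameters:
    grasp_service_timeout: 5.0
    grasp_score_threshold: 200
    grasp_frame_id: "base"
    grasp_approach: [0.0, 0.0, -1.0]
    grasp_approach_angle: 3.14
    grasp_offset: [0.0, 0.0, 0.0]
    grasp_boundry: [-1.0, 1.0, -1.0, 1.0, -1.0, 1.0]
    eef_offset: 0.16
    eef_yaw_offset: 0.0
    finger_joint_names: ["panda_finger_joint1", "panda_finger_joint2"]
```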
## Customization Notes
* **Model training for grasp detection**: This depends on which back-end grasp detection algorithm is used. For [Grasp Pose Detection](https://github.com/atenpas/gpd), the model was trained with 185K labeled grasps and 55 object models from [bigBIRD](http://rll.berkeley.edu/bigbird). If re-training becomes necessary, refer to the discussion [#49](https://github.com/atenpas/gpd/issues/49) in the upstream project.
================================================
FILE: grasp_tutorials/doc/handeye_calibration.rst
================================================
Hand-eye Calibration
=====================
Hand-eye calibration is used to get the camera pose with respect to the robot.
- `handeye_target_detection <https://github.com/intel/ros2_grasp_library/tree/master/grasp_utils/handeye_target_detection>`_
- `handeye_dashboard <https://github.com/intel/ros2_grasp_library/tree/master/grasp_utils/handeye_dashboard>`_
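The calibration solved by these tools is commonly written as AX = XB, where A is a relative motion of the robot end-effector, B is the corresponding motion of the calibration target observed by the camera, and X is the unknown camera-to-end-effector transform. A minimal 2-D sanity check of that relation, in plain Python and independent of the dashboard code, looks like:

```python
import math

def rot2d(theta, tx, ty):
    """3x3 homogeneous 2-D rigid transform (rotation theta, translation tx, ty)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx], [s, c, ty], [0.0, 0.0, 1.0]]

def matmul(P, Q):
    """Multiply two 3x3 matrices."""
    return [[sum(P[i][k] * Q[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inv(T):
    """Invert a rigid transform [R t; 0 1]: inverse is [R^T, -R^T t]."""
    c, s = T[0][0], T[1][0]
    tx, ty = T[0][2], T[1][2]
    return [[c, s, -(c * tx + s * ty)],
            [-s, c, -(-s * tx + c * ty)],
            [0.0, 0.0, 1.0]]

# Ground-truth hand-eye transform X (camera w.r.t. end-effector),
# chosen arbitrarily for this demo.
X = rot2d(0.3, 0.05, -0.02)

# A: a relative end-effector motion; B: the corresponding target motion
# seen by the camera. By construction, B = inv(X) A X, so A X = X B holds.
A = rot2d(0.7, 0.10, 0.0)
B = matmul(inv(X), matmul(A, X))

lhs, rhs = matmul(A, X), matmul(X, B)
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-9
           for i in range(3) for j in range(3))
```

The actual calibration is the inverse problem: recovering X from many measured (A_i, B_i) pairs, which is what the dashboard's calibration step does for full 3-D transforms.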
================================================
FILE: grasp_tutorials/doc/overview.rst
================================================
Overview
========
.. _GraspPlanning: http://docs.ros.org/api/moveit_msgs/html/srv/GraspPlanning.html
.. _GPD: https://github.com/atenpas/gpd
.. _OpenVINO™: https://software.intel.com/en-us/openvino-toolkit
.. _Grasp: http://docs.ros.org/api/moveit_msgs/html/msg/Grasp.html
ROS2 Grasp Library consists of
.. image:: ../_static/images/ros2_grasp_library.png
:width: 523 px
:height: 311 px
:align: center
- A ROS2 Grasp Planner providing a grasp planning service as an extensible capability of MoveIt (moveit_msgs::srv::`GraspPlanning`_), translating grasp detection results into MoveIt interfaces (moveit_msgs::msg::`Grasp`_)
- A ROS2 Grasp Detector abstracting interfaces for grasp detection results
- A ROS2 hand-eye calibration module gen
SYMBOL INDEX (77 symbols across 30 files)
FILE: grasp_apps/draw_x/launch/draw_x.launch.py
function generate_launch_description (line 23) | def generate_launch_description():
FILE: grasp_apps/draw_x/src/draw_x.cpp
function main (line 13) | int main(int argc, char **argv)
FILE: grasp_apps/fixed_position_pick/launch/fixed_position_pick.launch.py
function generate_launch_description (line 23) | def generate_launch_description():
FILE: grasp_apps/fixed_position_pick/src/fixed_position_pick.cpp
function main (line 11) | int main(int argc, char **argv)
FILE: grasp_apps/random_pick/src/random_pick.cpp
function main (line 26) | int main(int argc, char **argv)
FILE: grasp_apps/recognize_pick/src/place_publisher.cpp
function main (line 19) | int main(int argc, char ** argv) {
FILE: grasp_apps/recognize_pick/src/recognize_pick.cpp
function place_callback (line 27) | static void place_callback(const moveit_msgs::msg::PlaceLocation::Shared...
function main (line 31) | int main(int argc, char **argv)
FILE: grasp_ros2/include/grasp_library/ros2/consts.hpp
type grasp_ros2 (line 20) | namespace grasp_ros2
class Consts (line 28) | class Consts
FILE: grasp_ros2/include/grasp_library/ros2/grasp_detector_base.hpp
type grasp_ros2 (line 21) | namespace grasp_ros2
class GraspCallback (line 30) | class GraspCallback
class GraspDetectorBase (line 48) | class GraspDetectorBase
method GraspDetectorBase (line 54) | GraspDetectorBase()
method start (line 71) | void start(std::string name = "")
method stop (line 81) | void stop()
method add_callback (line 91) | void add_callback(GraspCallback * cb)
FILE: grasp_ros2/include/grasp_library/ros2/grasp_detector_gpd.hpp
type grasp_ros2 (line 56) | namespace grasp_ros2
class GraspDetectorGPD (line 70) | class GraspDetectorGPD : public rclcpp::Node, public GraspDetectorBase
method pointEigenToMsg (line 152) | void pointEigenToMsg(const Eigen::Vector3d & e, geometry_msgs::msg::...
method vectorEigenToMsg (line 160) | void vectorEigenToMsg(const Eigen::Vector3d & e, geometry_msgs::msg:...
FILE: grasp_ros2/include/grasp_library/ros2/grasp_planner.hpp
type grasp_ros2 (line 39) | namespace grasp_ros2
class GraspPlanner (line 49) | class GraspPlanner : public rclcpp::Node, public GraspCallback
type GraspPlanningParameters (line 52) | struct GraspPlanningParameters
FILE: grasp_ros2/include/grasp_library/ros2/ros_params.hpp
type grasp_ros2 (line 25) | namespace grasp_ros2
class ROSParameters (line 33) | class ROSParameters
FILE: grasp_ros2/src/consts.cpp
type grasp_ros2 (line 17) | namespace grasp_ros2
FILE: grasp_ros2/src/grasp_composition.cpp
function main (line 26) | int main(int argc, char ** argv)
FILE: grasp_ros2/src/grasp_detector_gpd.cpp
type grasp_ros2 (line 27) | namespace grasp_ros2
FILE: grasp_ros2/src/grasp_planner.cpp
type grasp_ros2 (line 31) | namespace grasp_ros2
FILE: grasp_ros2/src/ros_params.cpp
type grasp_ros2 (line 20) | namespace grasp_ros2
FILE: grasp_ros2/tests/tgrasp_ros2.cpp
function pcd_publisher (line 42) | static void pcd_publisher()
function topic_cb (line 63) | static void topic_cb(const grasp_msgs::msg::GraspConfigList::SharedPtr msg)
function TEST (line 70) | TEST(GraspLibraryTests, TestGraspService) {
function TEST (line 75) | TEST(GraspLibraryTests, TestGraspTopic) {
function main (line 81) | int main(int argc, char * argv[])
FILE: grasp_utils/handeye_dashboard/launch/handeye_dashboard.launch.py
function generate_launch_description (line 7) | def generate_launch_description():
FILE: grasp_utils/handeye_dashboard/src/handeye_dashboard/handeye_calibration.py
class bcolors (line 22) | class bcolors:
function save_samples_to_file (line 32) | def save_samples_to_file(samples, file_name='dataset.json', pkg='handeye...
class HandEyeCalibration (line 64) | class HandEyeCalibration(Plugin):
method __init__ (line 66) | def __init__(self, context):
method clear (line 212) | def clear(self):
method get_tf_transform (line 219) | def get_tf_transform(self, frame_id, child_frame_id):
method publish_tf_transform (line 238) | def publish_tf_transform(self, transform_to_publish):
method take_snapshot (line 254) | def take_snapshot(self):
method calibration (line 291) | def calibration(self):
method execution (line 315) | def execution(self):
method shutdown_plugin (line 353) | def shutdown_plugin(self):
method save_settings (line 359) | def save_settings(self, plugin_settings, instance_settings):
method restore_settings (line 363) | def restore_settings(self, plugin_settings, instance_settings):
FILE: grasp_utils/handeye_dashboard/src/handeye_dashboard/main.py
function main (line 7) | def main():
FILE: grasp_utils/handeye_target_detection/include/PoseEstimator.h
function namespace (line 42) | namespace tf2
function class (line 49) | class PoseEstimator
FILE: grasp_utils/handeye_target_detection/launch/pose_estimation.launch.py
function generate_launch_description (line 23) | def generate_launch_description():
FILE: grasp_utils/handeye_target_detection/src/pose_estimation_node.cpp
function main (line 22) | int main(int argc, char** argv)
FILE: grasp_utils/handeye_tf_service/src/handeye_tf_server.cpp
class ServerNode (line 27) | class ServerNode : public rclcpp::Node
method ServerNode (line 30) | explicit ServerNode(const rclcpp::NodeOptions & options)
method timer_callback (line 89) | void timer_callback()
function main (line 110) | int main(int argc, char ** argv)
FILE: grasp_utils/robot_interface/include/robot_interface/control_base.hpp
type TcpPose (line 44) | struct TcpPose
class ArmControlBase (line 57) | class ArmControlBase: public rclcpp::Node
method ArmControlBase (line 66) | ArmControlBase(const std::string node_name, const rclcpp::NodeOptions ...
FILE: grasp_utils/robot_interface/include/robot_interface/control_ur.hpp
type ProgArgs (line 42) | struct ProgArgs
class IgnorePipelineStoppedNotifier (line 50) | class IgnorePipelineStoppedNotifier : public INotifier
method started (line 53) | void started(std::string name)
method stopped (line 57) | void stopped(std::string name)
class ShutdownOnPipelineStoppedNotifier (line 63) | class ShutdownOnPipelineStoppedNotifier : public INotifier
method started (line 66) | void started(std::string name)
method stopped (line 70) | void stopped(std::string name)
class URControl (line 78) | class URControl: public ArmControlBase, public URRTPacketConsumer
method URControl (line 81) | URControl(const std::string node_name, const rclcpp::NodeOptions & opt...
method setupConsumer (line 122) | virtual void setupConsumer()
method teardownConsumer (line 125) | virtual void teardownConsumer()
method stopConsumer (line 128) | virtual void stopConsumer()
FILE: grasp_utils/robot_interface/launch/ur_test.launch.py
function generate_launch_description (line 23) | def generate_launch_description():
FILE: grasp_utils/robot_interface/test/ur_test_move_command.cpp
function main (line 21) | int main(int argc, char * argv[])
FILE: grasp_utils/robot_interface/test/ur_test_state_publish.cpp
function main (line 21) | int main(int argc, char * argv[])
Condensed preview — 139 files, each showing path, character count, and a content snippet.
[
{
"path": ".gitignore",
"chars": 77,
"preview": "# rviz file\n*.rviz\n\n# document buid file\nbuild\n_build\n\n# vscode file\n.vscode\n"
},
{
"path": "CHANGELOG.rst",
"chars": 1681,
"preview": "changelog for ros2_grasp_library\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n0.5.0 (2019-11-06)\n------------------\n* Added example"
},
{
"path": "LICENSE",
"chars": 11348,
"preview": "\n Apache License\n Version 2.0, January 2004\n "
},
{
"path": "README.md",
"chars": 4772,
"preview": "# DISCONTINUATION OF PROJECT # \nThis project will no longer be maintained by Intel. \nIntel has ceased development and "
},
{
"path": "docker/Dockerfile",
"chars": 486,
"preview": "########################################################\n# Based on Ubuntu 18.04\n#######################################"
},
{
"path": "docker/README.md",
"chars": 2287,
"preview": "# Precondition\n## add docker group\n```\nsudo groupadd docker\nsudo usermod -aG docker $USER\n```\n## Build docker image\n```\n"
},
{
"path": "docker/script/00_ros2_install.sh",
"chars": 967,
"preview": "#!/bin/bash\n\nDEPS_DIR=${DEPS_PATH}\nros2_version=dashing\nSUDO=$1\nif [ \"$SUDO\" == \"sudo\" ];then\n\tSUDO=\"sudo\"\nelse\n\tSUDO=\"\""
},
{
"path": "docker/script/10_eigen_install.sh",
"chars": 520,
"preview": "#!/bin/bash \n\nset -e\n\nDEPS_DIR=${DEPS_PATH}\neigen_version=https://gitlab.com/libeigen/eigen/-/archive/3.2/eigen-3.2.tar."
},
{
"path": "docker/script/11_libpcl_install.sh",
"chars": 444,
"preview": "#!/bin/bash \n\nset -e\n\nDEPS_DIR=${DEPS_PATH}\npcl_version=https://github.com/PointCloudLibrary/pcl/archive/pcl-1.8.1.tar.g"
},
{
"path": "docker/script/12_opencv_install.sh",
"chars": 1086,
"preview": "#!/bin/bash\n\nDEPS_DIR=${DEPS_PATH}\nopencv_version=4.1.2\nSUDO=$1\nif [ \"$SUDO\" == \"sudo\" ];then\n SUDO=\"sudo\"\nelse\n "
},
{
"path": "docker/script/13_openvino_install.sh",
"chars": 2762,
"preview": "#!/bin/bash\n\nDEPS_DIR=${DEPS_PATH}\nMKL_URL=https://github.com/intel/mkl-dnn/releases/download/v0.19/mklml_lnx_2019.0.5.2"
},
{
"path": "docker/script/20_librealsense_install.sh",
"chars": 1101,
"preview": "#!/bin/bash\n\nDEPS_DIR=${DEPS_PATH}\nlibrealsense_version=2.31.0-0~realsense0.1791\nSUDO=$1\nif [ \"$SUDO\" == \"sudo\" ];then\n "
},
{
"path": "docker/script/30_gpg_install.sh",
"chars": 456,
"preview": "#!/bin/bash\n\nDEPS_DIR=${DEPS_PATH}\nSUDO=$1\nif [ \"$SUDO\" == \"sudo\" ];then\n SUDO=\"sudo\"\nelse\n SUDO=\"\"\nfi\n\n# "
},
{
"path": "docker/script/31_gpd_install.sh",
"chars": 309,
"preview": "#!/bin/bash\n\nDEPS_DIR=${DEPS_PATH}\nSUDO=$1\nif [ \"$SUDO\" == \"sudo\" ];then\n SUDO=\"sudo\"\nelse\n SUDO=\"\"\nfi\n\n# "
},
{
"path": "docker/script/32_ur_modern_driver_install.sh",
"chars": 382,
"preview": "#!/bin/bash\n\nDEPS_DIR=${DEPS_PATH}\nSUDO=$1\nif [ \"$SUDO\" == \"sudo\" ];then\n SUDO=\"sudo\"\nelse\n SUDO=\"\"\nfi\n\n# "
},
{
"path": "docker/script/50_ros2_deps.sh",
"chars": 1204,
"preview": "#!/bin/bash\n\nSUDO=$1\nif [ \"$SUDO\" == \"sudo\" ];then\n SUDO=\"sudo\"\nelse\n SUDO=\"\"\nfi\n\n$SUDO apt-get install -y"
},
{
"path": "docker/script/install_ros2_grasp_library.sh",
"chars": 1145,
"preview": "#!/bin/bash\n\nset -e\n\ndeps_path=$1\nif [ -z \"$deps_path\" ]; then\n echo -e \"warring:\\n install_ros2_grasp_library_deps."
},
{
"path": "docker/script/ros_entrypoint.sh",
"chars": 133,
"preview": "#!/bin/bash\n\nset -e\n\n# setup ros2 environment\nsource /opt/ros/dashing/setup.bash\n\nsource /root/ros2_ws/install/setup.bas"
},
{
"path": "docker/script/ros_env.sh",
"chars": 603,
"preview": "#!/bin/bash\n\nROS_PATH=$(pwd)\n\n# setup ros2 environment\nsource /opt/ros/dashing/setup.bash\nsource ${ROS_PATH}/install/set"
},
{
"path": "docker/setup_docker_display.sh",
"chars": 175,
"preview": "#!/bin/bash\nset -e\n\n# setup docker display\nXSOCK=/tmp/.X11-unix\nXAUTH=/tmp/.docker.xauth\ntouch $XAUTH\nxauth nlist $DISPL"
},
{
"path": "grasp_apps/draw_x/CMakeLists.txt",
"chars": 3713,
"preview": "# Copyright (c) 2019 Intel Corporation\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not"
},
{
"path": "grasp_apps/draw_x/launch/draw_x.launch.py",
"chars": 1190,
"preview": "# Copyright (c) 2019 Intel Corporation. All Rights Reserved\n# \n# Licensed under the Apache License, Version 2.0 (the \"Li"
},
{
"path": "grasp_apps/draw_x/launch/draw_x.yaml",
"chars": 234,
"preview": "robot_control:\n ros__parameters:\n host: \"192.168.0.5\"\n shutdown_on_disconnect: true\n joint_names"
},
{
"path": "grasp_apps/draw_x/package.xml",
"chars": 801,
"preview": "<?xml version=\"1.0\"?>\n<?xml-model href=\"http://download.ros.org/schema/package_format2.xsd\" schematypens=\"http://www.w3."
},
{
"path": "grasp_apps/draw_x/src/draw_x.cpp",
"chars": 2505,
"preview": "#include <geometry_msgs/msg/pose_stamped.hpp>\n#include <rclcpp/rclcpp.hpp>\n#include <robot_interface/control_ur.hpp>\n\n/*"
},
{
"path": "grasp_apps/fixed_position_pick/CMakeLists.txt",
"chars": 3738,
"preview": "# Copyright (c) 2019 Intel Corporation\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not"
},
{
"path": "grasp_apps/fixed_position_pick/launch/fixed_position_pick.launch.py",
"chars": 1242,
"preview": "# Copyright (c) 2019 Intel Corporation. All Rights Reserved\n# \n# Licensed under the Apache License, Version 2.0 (the \"Li"
},
{
"path": "grasp_apps/fixed_position_pick/launch/fixed_position_pick.yaml",
"chars": 234,
"preview": "robot_control:\n ros__parameters:\n host: \"192.168.0.5\"\n shutdown_on_disconnect: true\n joint_names"
},
{
"path": "grasp_apps/fixed_position_pick/package.xml",
"chars": 813,
"preview": "<?xml version=\"1.0\"?>\n<?xml-model href=\"http://download.ros.org/schema/package_format2.xsd\" schematypens=\"http://www.w3."
},
{
"path": "grasp_apps/fixed_position_pick/src/fixed_position_pick.cpp",
"chars": 1992,
"preview": "#include <geometry_msgs/msg/pose_stamped.hpp>\n#include <rclcpp/rclcpp.hpp>\n#include <robot_interface/control_ur.hpp>\n\n/*"
},
{
"path": "grasp_apps/random_pick/CMakeLists.txt",
"chars": 3900,
"preview": "# Copyright (c) 2019 Intel Corporation\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not"
},
{
"path": "grasp_apps/random_pick/cfg/random_pick.yaml",
"chars": 64,
"preview": "robot_control:\n ros__parameters:\n host: \"192.168.1.5\"\n"
},
{
"path": "grasp_apps/random_pick/package.xml",
"chars": 1099,
"preview": "<?xml version=\"1.0\"?>\n<?xml-model href=\"http://download.ros.org/schema/package_format2.xsd\" schematypens=\"http://www.w3."
},
{
"path": "grasp_apps/random_pick/src/random_pick.cpp",
"chars": 4056,
"preview": "#include <geometry_msgs/msg/pose_stamped.hpp>\n#include <moveit_msgs/msg/move_it_error_codes.hpp>\n#include <moveit_msgs/m"
},
{
"path": "grasp_apps/recognize_pick/CMakeLists.txt",
"chars": 4300,
"preview": "# Copyright (c) 2019 Intel Corporation\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not"
},
{
"path": "grasp_apps/recognize_pick/cfg/recognize_pick.yaml",
"chars": 64,
"preview": "robot_control:\n ros__parameters:\n host: \"192.168.1.5\"\n"
},
{
"path": "grasp_apps/recognize_pick/package.xml",
"chars": 1116,
"preview": "<?xml version=\"1.0\"?>\n<?xml-model href=\"http://download.ros.org/schema/package_format2.xsd\" schematypens=\"http://www.w3."
},
{
"path": "grasp_apps/recognize_pick/src/place_publisher.cpp",
"chars": 2255,
"preview": "// Copyright (c) 2019 Intel Corporation\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may"
},
{
"path": "grasp_apps/recognize_pick/src/recognize_pick.cpp",
"chars": 4839,
"preview": "#include <geometry_msgs/msg/pose_stamped.hpp>\n#include <moveit_msgs/msg/move_it_error_codes.hpp>\n#include <moveit_msgs/m"
},
{
"path": "grasp_msgs/CMakeLists.txt",
"chars": 741,
"preview": "cmake_minimum_required(VERSION 3.5)\n\nproject(grasp_msgs)\n\nif(NOT CMAKE_CXX_STANDARD)\n set(CMAKE_CXX_STANDARD 14)\nendif("
},
{
"path": "grasp_msgs/msg/CloudIndexed.msg",
"chars": 267,
"preview": "# This message holds a point cloud and a list of indices into the point cloud \n# at which to sample grasp candidates.\n\n#"
},
{
"path": "grasp_msgs/msg/CloudSamples.msg",
"chars": 282,
"preview": "# This message holds a point cloud and a list of samples at which the grasp \n# detector should search for grasp candidat"
},
{
"path": "grasp_msgs/msg/CloudSources.msg",
"chars": 496,
"preview": "# This message holds a point cloud that can be a combination of point clouds \n# from different camera sources (at least "
},
{
"path": "grasp_msgs/msg/GraspConfig.msg",
"chars": 830,
"preview": "# This message describes a 2-finger grasp configuration by its 6-DOF pose, \n# consisting of a 3-DOF position and 3-DOF o"
},
{
"path": "grasp_msgs/msg/GraspConfigList.msg",
"chars": 278,
"preview": "# This message stores a list of grasp configurations.\n\n# The time of acquisition, and the coordinate frame ID.\nstd_msgs/"
},
{
"path": "grasp_msgs/msg/SamplesMsg.msg",
"chars": 215,
"preview": "# This message describes a set of point samples at which to detect grasps.\n\n# Header\nstd_msgs/Header header\n\n# The sampl"
},
{
"path": "grasp_msgs/package.xml",
"chars": 1020,
"preview": "<?xml version=\"1.0\"?>\n<?xml-model href=\"http://download.ros.org/schema/package_format2.xsd\" schematypens=\"http://www.w3."
},
{
"path": "grasp_ros2/CMakeLists.txt",
"chars": 6949,
"preview": "# Copyright (c) 2018 Intel Corporation\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not"
},
{
"path": "grasp_ros2/cfg/grasp_ros2_params.yaml",
"chars": 1056,
"preview": "GraspDetectorGPD:\n ros__parameters:\n cloud_topic: /camera/pointcloud\n #cloud_topic: /mechmind/color_point_cloud\n "
},
{
"path": "grasp_ros2/cfg/random_pick.yaml",
"chars": 1242,
"preview": "GraspDetectorGPD:\n ros__parameters:\n #cloud_topic: /camera/pointcloud\n cloud_topic: /mechmind/color_point_cloud\n "
},
{
"path": "grasp_ros2/cfg/recognize_pick.yaml",
"chars": 1280,
"preview": "GraspDetectorGPD:\n ros__parameters:\n cloud_topic: /camera/pointcloud\n # cloud_topic: \"/camera/aligned_depth_to_co"
},
{
"path": "grasp_ros2/cfg/test_grasp_ros2.yaml",
"chars": 235,
"preview": "GraspDetectorGPD:\n ros__parameters:\n cloud_topic: /camera/pointcloud\n rviz: false\n device: 0\n auto_mode: fa"
},
{
"path": "grasp_ros2/include/grasp_library/ros2/consts.hpp",
"chars": 1555,
"preview": "// Copyright (c) 2019 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_ros2/include/grasp_library/ros2/grasp_detector_base.hpp",
"chars": 2493,
"preview": "// Copyright (c) 2019 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_ros2/include/grasp_library/ros2/grasp_detector_gpd.hpp",
"chars": 6771,
"preview": "// Copyright (c) 2018 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_ros2/include/grasp_library/ros2/grasp_planner.hpp",
"chars": 6545,
"preview": "// Copyright (c) 2018 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_ros2/include/grasp_library/ros2/ros_params.hpp",
"chars": 1289,
"preview": "// Copyright (c) 2018 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_ros2/package.xml",
"chars": 1932,
"preview": "<?xml version=\"1.0\"?>\n<?xml-model href=\"http://download.ros.org/schema/package_format2.xsd\" schematypens=\"http://www.w3."
},
{
"path": "grasp_ros2/src/consts.cpp",
"chars": 1089,
"preview": "// Copyright (c) 2019 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_ros2/src/grasp_composition.cpp",
"chars": 1562,
"preview": "// Copyright (c) 2019 Intel Corporation\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may"
},
{
"path": "grasp_ros2/src/grasp_detector_gpd.cpp",
"chars": 17296,
"preview": "// Copyright (c) 2019 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_ros2/src/grasp_planner.cpp",
"chars": 9506,
"preview": "// Copyright (c) 2019 Intel Corporation\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may"
},
{
"path": "grasp_ros2/src/ros_params.cpp",
"chars": 6701,
"preview": "// Copyright (c) 2018 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_ros2/tests/CMakeLists.txt",
"chars": 1028,
"preview": "find_package(ament_cmake REQUIRED)\nfind_package(ament_cmake_gtest REQUIRED)\nfind_package(rclcpp REQUIRED)\nfind_package(g"
},
{
"path": "grasp_ros2/tests/tgrasp_ros2.cpp",
"chars": 3858,
"preview": "// Copyright (c) 2019 Intel Corporation\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may"
},
{
"path": "grasp_ros2/tests/tgrasp_ros2.h.in",
"chars": 843,
"preview": "// Copyright (c) 2018 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_tutorials/CMakeLists.txt",
"chars": 700,
"preview": "# Copyright (c) 2019 Intel Corporation\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not"
},
{
"path": "grasp_tutorials/README.md",
"chars": 594,
"preview": "# ROS2 Grasp Library Tutorials\n\nThis tutorials aim to introduce how to\n* Install, build, and launch the ROS2 Grasp Plann"
},
{
"path": "grasp_tutorials/conf.py",
"chars": 9690,
"preview": "# -*- coding: utf-8 -*-\n#\n# app_tutorials documentation build configuration file, created by\n# sphinx-quickstart on Thu "
},
{
"path": "grasp_tutorials/doc/bringup_robot.rst",
"chars": 2745,
"preview": "Bring up a New Robot\n====================\n\nThis tutorial explains what is expected to do when bringing up this ROS2 Gras"
},
{
"path": "grasp_tutorials/doc/draw_x.rst",
"chars": 1864,
"preview": "Draw X\n======\n\nOverview\n--------------\n\nThis demo shows how to use the robot interface to draw letter ``X``\nat the fixed"
},
{
"path": "grasp_tutorials/doc/fixed_position_pick.rst",
"chars": 1642,
"preview": "Fixed Position Pick\n====================\n\nOverview\n--------------\n\nThis demo shows how to use the robot interface to pic"
},
{
"path": "grasp_tutorials/doc/getting_start.rst",
"chars": 3339,
"preview": "Getting Start\n=============\n\nThis tutorial introduces getting start to use this ROS2 Grasp Library.\n\n.. _GraspPlanning: "
},
{
"path": "grasp_tutorials/doc/grasp_api.rst",
"chars": 1367,
"preview": "ROS2 Grasp Library APIs\n=======================\n\n.. _GraspPlanning: http://docs.ros.org/api/moveit_msgs/html/srv/GraspPl"
},
{
"path": "grasp_tutorials/doc/grasp_planner.rst",
"chars": 770,
"preview": "Grasp Planner\n=============\n\nTutorials\n---------\n\n- `Install OpenVINO™ toolkit`_\n\n.. _Install OpenVINO™ toolkit: https:/"
},
{
"path": "grasp_tutorials/doc/grasp_ros2/install_gpd.md",
"chars": 836,
"preview": "Installation guide for Grasp Pose Detection\n\n### Install [GPG](https://github.com/atenpas/gpg)\n1. Get the code\n```bash\ng"
},
{
"path": "grasp_tutorials/doc/grasp_ros2/install_openvino.md",
"chars": 3297,
"preview": "# Intel® DLDT toolkit and Intel® OpenVINO™ toolkit\n\nThis tutorial introduces the DLDT toolkit and OpenVINO toolkit.\n\nInt"
},
{
"path": "grasp_tutorials/doc/grasp_ros2/tutorials_1_grasp_ros2_with_camera.md",
"chars": 2096,
"preview": "# OpenVINO Grasp Library with RGBD Camera\n\nThis tutorial introduce the OpenVINO environment setup, how to build and laun"
},
{
"path": "grasp_tutorials/doc/grasp_ros2/tutorials_2_grasp_ros2_test.md",
"chars": 4179,
"preview": "# Grasp Library Tests and Exampels\n\nThis tutorial documents Grasp Library tests which also serve as example codes for th"
},
{
"path": "grasp_tutorials/doc/grasp_ros2/tutorials_3_grasp_ros2_launch_options.md",
"chars": 6383,
"preview": "# Grasp Library Launch Options and Customization Notes\nThis tutorial documents the launch options which are used for cus"
},
{
"path": "grasp_tutorials/doc/handeye_calibration.rst",
"chars": 362,
"preview": "Hand-eye Calibration\n=====================\n\nHand-eye calibration is used to get the camera pose with respect to the robo"
},
{
"path": "grasp_tutorials/doc/overview.rst",
"chars": 1051,
"preview": "Overview\n========\n\n.. _GraspPlanning: http://docs.ros.org/api/moveit_msgs/html/srv/GraspPlanning.html\n.. _GPD: https://g"
},
{
"path": "grasp_tutorials/doc/random_pick.rst",
"chars": 2624,
"preview": "Random Pick (OpenVINO Grasp Detection)\n======================================\n\nOverview\n--------\n\nA simple application d"
},
{
"path": "grasp_tutorials/doc/recognize_pick.rst",
"chars": 3146,
"preview": "Recognize Pick (OpenVINO Grasp Detection + OpenVINO Object Segmentation)\n==============================================="
},
{
"path": "grasp_tutorials/doc/robot_interface.rst",
"chars": 248,
"preview": "Robot Interface\n===============\n\n- `Robot Interface <https://github.com/intel/ros2_grasp_library/tree/master/grasp_utils"
},
{
"path": "grasp_tutorials/doc/template.rst",
"chars": 2537,
"preview": "[App Tutorial Template]\n=======================\n\nOverview\n--------\n*(Describe what this application is in one topic sent"
},
{
"path": "grasp_tutorials/index.rst",
"chars": 886,
"preview": "Welcome to ROS2 Grasp Library Tutorials\n=======================================\n\nROS2 Grasp Library is a ROS2 intelligen"
},
{
"path": "grasp_tutorials/package.xml",
"chars": 925,
"preview": "<?xml version=\"1.0\"?>\n<?xml-model href=\"http://download.ros.org/schema/package_format2.xsd\" schematypens=\"http://www.w3."
},
{
"path": "grasp_utils/handeye_dashboard/README.md",
"chars": 4560,
"preview": "# handeye_dashboard\n\n## 1.Prerequisite\n\n* System install\n * Install [ROS2 Dashing](https://index.ros.org/doc/ros2/Insta"
},
{
"path": "grasp_utils/handeye_dashboard/config/Default.perspective",
"chars": 14325,
"preview": "{\n \"keys\": {},\n \"groups\": {\n \"mainwindow\": {\n \"keys\": {\n \"geometry\": {\n \"repr(QByteArray.hex)\""
},
{
"path": "grasp_utils/handeye_dashboard/data/camera-robot.json",
"chars": 283,
"preview": "[[-0.036301454848076786, 0.9992976851533406, -0.009291976274627967, 0.04561821843056508], [-0.6780505333941169, -0.03145"
},
{
"path": "grasp_utils/handeye_dashboard/data/dataset.json",
"chars": 27452,
"preview": "[[[[-0.010438622230043895, -0.9979323269914826, -0.06342007498657491, 0.0033080439705093223], [-0.9998615024531576, 0.00"
},
{
"path": "grasp_utils/handeye_dashboard/launch/handeye_dashboard.launch.py",
"chars": 722,
"preview": "import os\n\nfrom ament_index_python.packages import get_package_share_directory\nfrom launch import LaunchDescription\nfrom"
},
{
"path": "grasp_utils/handeye_dashboard/package.xml",
"chars": 821,
"preview": "<?xml version=\"1.0\"?>\n<?xml-model href=\"http://download.ros.org/schema/package_format3.xsd\" schematypens=\"http://www.w3."
},
{
"path": "grasp_utils/handeye_dashboard/plugin.xml",
"chars": 620,
"preview": "<library path=\"src\">\n <class name=\"HandEyeCalibration\" type=\"handeye_dashboard.plugin.HandEyeCalibration\" base_class_ty"
},
{
"path": "grasp_utils/handeye_dashboard/resource/handeye_dashboard",
"chars": 0,
"preview": ""
},
{
"path": "grasp_utils/handeye_dashboard/setup.py",
"chars": 1441,
"preview": "from glob import glob\nfrom setuptools import setup\nfrom setuptools import find_packages\n\npackage_name = 'handeye_dashboa"
},
{
"path": "grasp_utils/handeye_dashboard/src/handeye_dashboard/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "grasp_utils/handeye_dashboard/src/handeye_dashboard/handeye_calibration.py",
"chars": 13734,
"preview": "#!/usr/bin/env python\n\nimport json\nimport rclpy\nimport numpy as np\nimport baldor as br\nfrom rclpy.clock import ROSClock\n"
},
{
"path": "grasp_utils/handeye_dashboard/src/handeye_dashboard/main.py",
"chars": 211,
"preview": "#!/usr/bin/env python3\n\nimport sys\n\nfrom rqt_gui.main import Main\n\ndef main():\n sys.exit(Main().main(sys.argv, standa"
},
{
"path": "grasp_utils/handeye_dashboard/src/handeye_dashboard/plugin.py",
"chars": 74,
"preview": "#!/usr/bin/env python\nfrom .handeye_calibration import HandEyeCalibration\n"
},
{
"path": "grasp_utils/handeye_target_detection/.clang-format",
"chars": 1883,
"preview": "---\nBasedOnStyle: Google\nAccessModifierOffset: -2\nConstructorInitializerIndentWidth: 2\nAlignEscapedNewlinesLeft: false\n"
},
{
"path": "grasp_utils/handeye_target_detection/CMakeLists.txt",
"chars": 2278,
"preview": "# Copyright (c) 2019 Intel Corporation\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not"
},
{
"path": "grasp_utils/handeye_target_detection/README.md",
"chars": 7729,
"preview": "# handeye_target_detection\n\n## 1. Introduction\n\nThis package is used to estimate the pose of\ncalibration patterns. Curre"
},
{
"path": "grasp_utils/handeye_target_detection/include/PoseEstimator.h",
"chars": 3711,
"preview": "/** Copyright (c) 2019 Intel Corporation\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you ma"
},
{
"path": "grasp_utils/handeye_target_detection/launch/pose_estimation.launch.py",
"chars": 1505,
"preview": "# Copyright (c) 2019 Intel Corporation. All Rights Reserved\n# \n# Licensed under the Apache License, Version 2.0 (the \"Li"
},
{
"path": "grasp_utils/handeye_target_detection/launch/pose_estimation.yaml",
"chars": 523,
"preview": "pose_estimation:\n ros__parameters:\n pattern: \"ARUCO\"\n image_topic: \"/camera/color/image_raw\"\n ca"
},
{
"path": "grasp_utils/handeye_target_detection/package.xml",
"chars": 965,
"preview": "<?xml version=\"1.0\"?>\n<?xml-model href=\"http://download.ros.org/schema/package_format3.xsd\" schematypens=\"http://www.w3."
},
{
"path": "grasp_utils/handeye_target_detection/src/pose_estimation_node.cpp",
"chars": 3195,
"preview": "/** Copyright (c) 2019 Intel Corporation\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you ma"
},
{
"path": "grasp_utils/handeye_target_detection/src/pose_estimator.cpp",
"chars": 18123,
"preview": "/** Copyright (c) 2019 Intel Corporation\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you ma"
},
{
"path": "grasp_utils/handeye_tf_service/CMakeLists.txt",
"chars": 1284,
"preview": "cmake_minimum_required(VERSION 3.5)\n\nproject(handeye_tf_service)\n\n# Default to C++14\nif(NOT CMAKE_CXX_STANDARD)\n set(CM"
},
{
"path": "grasp_utils/handeye_tf_service/README.md",
"chars": 20,
"preview": "# handeye_tf_service"
},
{
"path": "grasp_utils/handeye_tf_service/package.xml",
"chars": 912,
"preview": "<?xml version=\"1.0\"?>\n<?xml-model href=\"http://download.ros.org/schema/package_format3.xsd\" schematypens=\"http://www.w3."
},
{
"path": "grasp_utils/handeye_tf_service/src/handeye_tf_server.cpp",
"chars": 4288,
"preview": "/** Copyright (c) 2019 Intel Corporation. All Rights Reserved\n *\n * Licensed under the Apache License, Version 2.0 (th"
},
{
"path": "grasp_utils/handeye_tf_service/srv/HandeyeTF.srv",
"chars": 114,
"preview": "geometry_msgs/TransformStamped transform\nstd_msgs/Bool publish\n---\ngeometry_msgs/TransformStamped tf_lookup_result"
},
{
"path": "grasp_utils/robot_interface/CMakeLists.txt",
"chars": 4210,
"preview": "# Copyright (c) 2018 Intel Corporation\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not"
},
{
"path": "grasp_utils/robot_interface/Doxyfile",
"chars": 108679,
"preview": "# Doxyfile 1.8.13\n\n# This file describes the settings to be used by the documentation system\n# doxygen (www.doxygen.org)"
},
{
"path": "grasp_utils/robot_interface/README.md",
"chars": 1053,
"preview": "# robot_interface\n\nROS2 package to use robot native interface\n\n## Install\n\nInstall dependency **ur_modern_driver**:\n\n```"
},
{
"path": "grasp_utils/robot_interface/include/robot_interface/control_base.hpp",
"chars": 14224,
"preview": "// Copyright (c) 2019 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_utils/robot_interface/include/robot_interface/control_ur.hpp",
"chars": 4843,
"preview": "// Copyright (c) 2019 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_utils/robot_interface/launch/ur_test.launch.py",
"chars": 1800,
"preview": "# Copyright (c) 2019 Intel Corporation. All Rights Reserved\n# \n# Licensed under the Apache License, Version 2.0 (the \"Li"
},
{
"path": "grasp_utils/robot_interface/launch/ur_test.yaml",
"chars": 227,
"preview": "ur_test:\n ros__parameters:\n host: \"192.168.0.5\"\n shutdown_on_disconnect: true\n joint_names: [\"sh"
},
{
"path": "grasp_utils/robot_interface/package.xml",
"chars": 982,
"preview": "<?xml version=\"1.0\"?>\n<?xml-model href=\"http://download.ros.org/schema/package_format3.xsd\" schematypens=\"http://www.w3."
},
{
"path": "grasp_utils/robot_interface/src/control_base.cpp",
"chars": 8950,
"preview": "// Copyright (c) 2019 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_utils/robot_interface/src/control_ur.cpp",
"chars": 7535,
"preview": "// Copyright (c) 2019 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_utils/robot_interface/test/ur_test_move_command.cpp",
"chars": 2703,
"preview": "// Copyright (c) 2019 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "grasp_utils/robot_interface/test/ur_test_state_publish.cpp",
"chars": 1216,
"preview": "// Copyright (c) 2019 Intel Corporation. All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \""
},
{
"path": "moveit_msgs_light/CMakeLists.txt",
"chars": 1023,
"preview": "cmake_minimum_required(VERSION 3.5)\n\nproject(moveit_msgs)\n\nif(NOT CMAKE_CXX_STANDARD)\n set(CMAKE_CXX_STANDARD 14)\nendif"
},
{
"path": "moveit_msgs_light/README.md",
"chars": 79,
"preview": "This's a temporary solution for ROS2 interface, till MoveIt is ported to ROS2.\n"
},
{
"path": "moveit_msgs_light/msg/CollisionObject.msg",
"chars": 1180,
"preview": "# a header, used for interpreting the poses\nstd_msgs/Header header\n\n# the id of the object (name used in MoveIt)\nstring "
},
{
"path": "moveit_msgs_light/msg/Grasp.msg",
"chars": 2031,
"preview": "# This message contains a description of a grasp that would be used\n# with a particular end-effector to grasp an object,"
},
{
"path": "moveit_msgs_light/msg/GripperTranslation.msg",
"chars": 403,
"preview": "# defines a translation for the gripper, used in pickup or place tasks\n# for example for lifting an object off a table o"
},
{
"path": "moveit_msgs_light/msg/MoveItErrorCodes.msg",
"chars": 852,
"preview": "int32 val\n\n# overall behavior\nint32 SUCCESS=1\nint32 FAILURE=99999\n\nint32 PLANNING_FAILED=-1\nint32 INVALID_MOTION_PLAN=-2"
},
{
"path": "moveit_msgs_light/msg/ObjectType.msg",
"chars": 1044,
"preview": "################################################## OBJECT ID #########################################################\n\n"
},
{
"path": "moveit_msgs_light/msg/PlaceLocation.msg",
"chars": 636,
"preview": "# A name for this grasp\nstring id\n\n# The internal posture of the hand for the grasp\n# positions and efforts are used\ntra"
},
{
"path": "moveit_msgs_light/package.xml",
"chars": 1188,
"preview": "<?xml version=\"1.0\"?>\n<?xml-model href=\"http://download.ros.org/schema/package_format2.xsd\" schematypens=\"http://www.w3."
},
{
"path": "moveit_msgs_light/srv/GraspPlanning.srv",
"chars": 730,
"preview": "# Requests that grasp planning be performed for the target object\n# returns a list of candidate grasps to be tested and "
}
]
// ... and 2 more files
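Each entry in the manifest above carries the same three keys: `path`, `chars`, and `preview`. A minimal sketch of consuming such a manifest programmatically, using a small hand-copied excerpt of the entries above (the inline JSON here is illustrative, not the full array):

```python
import json

# A small excerpt of the manifest, with the same "path"/"chars"/"preview"
# keys as the full array above (previews shortened for brevity).
manifest = json.loads("""
[
  {"path": "grasp_utils/handeye_tf_service/srv/HandeyeTF.srv", "chars": 114,
   "preview": "geometry_msgs/TransformStamped transform"},
  {"path": "grasp_utils/robot_interface/Doxyfile", "chars": 108679,
   "preview": "# Doxyfile 1.8.13"},
  {"path": "moveit_msgs_light/README.md", "chars": 79,
   "preview": "This's a temporary solution for ROS2 interface"}
]
""")

# Sort entries by size, largest first, to surface the bulkiest files.
largest = sorted(manifest, key=lambda e: e["chars"], reverse=True)
for entry in largest:
    print(f'{entry["chars"]:>8}  {entry["path"]}')
```

The same sort over the full 139-entry array would identify, for example, the 108,679-character Doxyfile as one of the largest files in the extraction.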
About this extraction
This page contains the full source code of the intel/ros2_grasp_library GitHub repository, extracted and formatted as plain text. The extraction includes 139 files (13.7 MB), approximately 132.1k tokens, and a symbol index with 77 extracted functions, classes, methods, constants, and types.