
Remote communication over ROS between a RealSense LiDAR (L515) on a Jetson Nano and a host PC

This post is a memo of the steps from getting the RealSense LiDAR (L515) running on a Jetson Nano to sending its data to a host PC over the network.

Environment

Ubuntu 18.04.5 LTS
CMake 3.10.2
Python 3.6.9
OpenCV 4.1.1
JetPack 4.3

Overall workflow

(Figure: overall workflow diagram)


Table of contents
1. Installing librealsense
2. Installing realsense-ros on the Jetson Nano
3. Checking LiDAR operation on the Jetson
4. rostopic list
5. Remote communication between the host PC and the Jetson




1. Installing librealsense

Reference: installation_jetson.md

Installed on 2022/01/17

# Register the public key
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-key F6E65AC044F831AC80A06380C8B3A55A6F3EFCDE || sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-key F6E65AC044F831AC80A06380C8B3A55A6F3EFCDE

# Add the server repository to the apt sources list
sudo add-apt-repository "deb https://librealsense.intel.com/Debian/apt-repo $(lsb_release -cs) main" -u

# Install the SDK
sudo apt-get install librealsense2-utils
sudo apt-get install librealsense2-dev


2. Installing realsense-ros on the Jetson Nano

This was by far the most troublesome part. Note that the Jetson's architecture is ARM (aarch64), so the installation procedure differs from the host PC.
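
If you want to double-check which machine you are on before following the architecture-specific steps, the architecture can be printed from Python (a minimal sketch; it only assumes Python 3 is available):

# print the CPU architecture: 'aarch64' on the Jetson Nano, typically 'x86_64' on the host PC
import platform
print(platform.machine())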

Installed on 2022/01/23

# Install the required packages
sudo apt-get install ros-melodic-realsense2-camera
sudo apt-get install ros-melodic-realsense2-description

cd ~/catkin_ws/src
git clone https://github.com/IntelRealSense/realsense-ros.git
cd realsense-ros/
git checkout `git tag | sort -V | grep -P "^2.\d+\.\d+" | tail -1`
#### Note: checking out '2.3.2' 
#### HEAD is now at f400d68 2.3.2
# catkin make
cd ~/catkin_ws
catkin_make clean
>>>
### -- Build files have been written to: /home/jetson/catkin_ws/build
####
#### Running command: "make clean -j4 -l4" in "/home/jetson/catkin_ws/build"
####

catkin_make -DCATKIN_ENABLE_TESTING=False -DCMAKE_BUILD_TYPE=Release
>>>
〜〜
#### [100%] Linking CXX shared library /home/jetson/catkin_ws/devel/lib/librealsense2_camera.so
#### [100%] Built target realsense2_camera

catkin_make install
>>>
〜〜
#### -- Installing: /home/jetson/catkin_ws/install/share/realsense2_camera/rviz/pointcloud.rviz
#### -- Installing: /home/jetson/catkin_ws/install/share/realsense2_camera/nodelet_plugins.xml
source ~/.bashrc
# Reboot
sudo reboot


Installation complete.
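
As a quick sanity check that the freshly built package is visible to ROS after the reboot, the package path can be resolved from Python (a sketch; assumes rospkg from ROS Melodic and a sourced workspace):

# resolve the realsense2_camera package path; raises ResourceNotFound if the workspace is not visible
import rospkg
print(rospkg.RosPack().get_path('realsense2_camera'))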





3. Checking LiDAR operation on the Jetson

First, check that the LiDAR works locally on the Jetson.

Running with realsense-viewer

Launch the viewer to try the RealSense LiDAR directly on the Jetson:

realsense-viewer

# Check that the device is recognized
ls /dev/video*
### /dev/video0  /dev/video1  /dev/video2  /dev/video3

Once the device nodes show up, the LiDAR stream can be viewed in realsense-viewer.

(Figure: L515 stream displayed in realsense-viewer)

Running with roslaunch

# In the first terminal
$ roslaunch realsense2_camera rs_camera.launch
# In the second terminal
$ rosrun image_view image_view image:=/camera/color/image_raw

(Figure: /camera/color/image_raw displayed in image_view)
The image came through. (The photo has been pixelated.)



4. rostopic list

The topic list is the same as on the host PC. Note that if you forget the OpenCV symbolic link, you not only get errors but also become unable to read the RGB images.
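
A quick way to confirm that the Python-side OpenCV binding is actually visible (a minimal sketch; assumes the OpenCV 4.1.1 build from JetPack listed in the environment above):

# confirm that cv2 imports and report its version; 4.1.1 is expected in this environment
import cv2
print(cv2.__version__)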

$ rostopic list
#### /camera/color/camera_info
#### /camera/color/image_raw
#### /camera/color/image_raw/compressed
#### /camera/color/image_raw/compressed/parameter_descriptions
#### /camera/color/image_raw/compressed/parameter_updates
#### /camera/color/image_raw/compressedDepth
#### /camera/color/image_raw/compressedDepth/parameter_descriptions
#### /camera/color/image_raw/compressedDepth/parameter_updates
#### /camera/color/image_raw/theora
#### /camera/color/image_raw/theora/parameter_descriptions
#### /camera/color/image_raw/theora/parameter_updates
#### /camera/color/metadata
#### /camera/depth/camera_info
#### /camera/depth/image_rect_raw
#### /camera/depth/image_rect_raw/compressed
#### /camera/depth/image_rect_raw/compressed/parameter_descriptions
#### /camera/depth/image_rect_raw/compressed/parameter_updates
#### /camera/depth/image_rect_raw/compressedDepth
#### /camera/depth/image_rect_raw/compressedDepth/parameter_descriptions
#### /camera/depth/image_rect_raw/compressedDepth/parameter_updates
#### /camera/depth/image_rect_raw/theora
#### /camera/depth/image_rect_raw/theora/parameter_descriptions
#### /camera/depth/image_rect_raw/theora/parameter_updates
#### /camera/depth/metadata
#### /camera/extrinsics/depth_to_color
#### /camera/l500_depth_sensor/parameter_descriptions
#### /camera/l500_depth_sensor/parameter_updates
#### /camera/motion_module/parameter_descriptions
#### /camera/motion_module/parameter_updates
#### /camera/realsense2_camera_manager/bond
#### /camera/rgb_camera/parameter_descriptions
#### /camera/rgb_camera/parameter_updates
#### /diagnostics
#### /rosout
#### /rosout_agg
#### /tf
#### /tf_static

5. Remote communication between the host PC and the Jetson

Since this is just a test, the script below simply prints the pixel counts of the blue and red regions in the image captured by the LiDAR.

Python subscriber script (color_vel.py)

#!/usr/bin/env python3

import rospy
import cv2
import numpy as np
from sensor_msgs.msg import Image
from geometry_msgs.msg import Twist
from cv_bridge import CvBridge, CvBridgeError

LIDAR_NODE_NAME = '/camera/color/image_raw'
class ColorExtract(object):
    def __init__(self):
        self._vel_pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
        self._blue_pub = rospy.Publisher('blue_image', Image, queue_size=1)
        self._red_pub = rospy.Publisher('red_image', Image, queue_size=1)
        self._image_sub = rospy.Subscriber(LIDAR_NODE_NAME, Image, self.callback)
        self._bridge = CvBridge()
        self._vel = Twist()

    def get_colored_area(self, cv_image, lower, upper):
        hsv_image = cv2.cvtColor(cv_image, cv2.COLOR_BGR2HSV)
        mask_image = cv2.inRange(hsv_image, lower, upper)
        extracted_image = cv2.bitwise_and(cv_image, cv_image, mask=mask_image)
        area = cv2.countNonZero(mask_image)
        return (area, extracted_image)
        
    def callback(self, data):
        try:
            cv_image = self._bridge.imgmsg_to_cv2(data, 'bgr8')
        except CvBridgeError as e:
            print(e)
            return  # skip this frame; cv_image would be undefined below
        blue_area, blue_image = self.get_colored_area(
            cv_image, np.array([50,100,100]), np.array([150,255,255]))
        red_area, red_image = self.get_colored_area(
            cv_image, np.array([150,100,150]), np.array([180,255,255]))
            
        try:
            self._blue_pub.publish(self._bridge.cv2_to_imgmsg(blue_image, 'bgr8'))
            self._red_pub.publish(self._bridge.cv2_to_imgmsg(red_image, 'bgr8'))
        except CvBridgeError as e:
            print(e)
        rospy.loginfo('blue=%d, red=%d' % (blue_area, red_area))
        if blue_area > 10:
            self._vel.linear.x = 0.5
            self._vel_pub.publish(self._vel)
        if red_area > 10:
            self._vel.linear.x = -0.5
            self._vel_pub.publish(self._vel)


if __name__ == '__main__':
    rospy.init_node('color_extract')
    color = ColorExtract()
    try:
        rospy.spin()
    except KeyboardInterrupt:
        pass
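
If you want to tune or verify the HSV thresholds without ROS running, the same masking can be tried offline on a saved image (a sketch; sample.png is a hypothetical test image, not part of the original setup):

# standalone version of the masking used in ColorExtract.get_colored_area
import cv2
import numpy as np

img = cv2.imread('sample.png')  # hypothetical test image
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
blue_mask = cv2.inRange(hsv, np.array([50, 100, 100]), np.array([150, 255, 255]))
red_mask = cv2.inRange(hsv, np.array([150, 100, 150]), np.array([180, 255, 255]))
print('blue pixels:', cv2.countNonZero(blue_mask))
print('red pixels:', cv2.countNonZero(red_mask))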

Running the remote communication between the host PC and the Jetson

# Environment settings on the host PC side
export ROS_MASTER_URI=http://192.168.10.111:11311  # address of the machine running the ROS master
export ROS_IP=192.168.10.111                        # this machine's own IP address


# Environment settings on the Jetson side
export ROS_MASTER_URI=http://192.168.10.111:11311
export ROS_IP=192.168.10.102
# On the host PC, start the ROS master first
$ roscore

# On the Jetson, launch the camera node
$ roslaunch realsense2_camera rs_camera.launch

# Back on the host PC, run the subscriber
$ rosrun ros_start color_vel.py
>>>
#### [INFO] [1642672394.648472]: blue=0, red=0
#### [INFO] [1642672395.452979]: blue=3741, red=288
#### [INFO] [1642672396.209016]: blue=6081, red=717
#### [INFO] [1642672396.993956]: blue=6182, red=708
#### [INFO] [1642672397.801500]: blue=6095, red=689
#### [INFO] [1642672398.609366]: blue=6338, red=716
〜〜〜
〜〜〜


You can see that the blue and red region values from the image read by the LiDAR on the Jetson are delivered to the host PC.

(Figure: blue/red area logs arriving on the host PC)
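
If you only want to confirm that image topics reach the host PC over the network, a much smaller subscriber is also enough (a sketch; it assumes the same ROS_MASTER_URI/ROS_IP settings and the /camera/color/image_raw topic shown above):

#!/usr/bin/env python3
# minimal host-side check: log the size of every color frame received from the Jetson
import rospy
from sensor_msgs.msg import Image

def on_image(msg):
    rospy.loginfo('received %dx%d image from the Jetson' % (msg.width, msg.height))

rospy.init_node('network_check')
rospy.Subscriber('/camera/color/image_raw', Image, on_image, queue_size=1)
rospy.spin()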

In the end, I managed both to set up the LiDAR on the Jetson and to send its data from the Jetson to the host PC over ROS topics.

References

How to install: installation_jetson.md
GitHub: realsense-ros