1. Introduction
This is the second half of a two-part blog series. See part 1 if you are interested in a Python implementation of the same project that does not involve the Robot Operating System (ROS).
Synopsis
Utilizing the Intel RealSense Tracking Camera T265 for multiple purposes:
- Stable, accurate GPS-less flight with its standalone 6-DoF pose output (200 Hz): the first and foremost usage of the VIO tracking camera, which was realized in this GSoC 2019 project. Wikis are available for Python and ROS support.
- Precision landing: without (part 1) and with (part 2, this blog post) ROS.
In the sections below I will first briefly outline the system, then give a step-by-step guide to getting it to work along with the actual experimental process, and finally offer some discussion of the many implementation choices. Note that details on some parts are omitted where they are already covered in the linked wikis or previous blog posts.
Let’s dive in.
2. Prerequisites
Hardware
- Vehicle: a copter of your liking. If you are new to the DIY world, the Copter wiki is your one-stop shop for all the basics you need to learn.
- Onboard computer: a Linux system with USB3 port(s) is required. I am using the Up board, but the Jetson Nano or Raspberry Pi 4 are suitable alternatives for a low-budget setup.
- Sensor: the RealSense T265 camera in a downfacing configuration (USB port pointing to the right of the vehicle). Different camera orientations require some parameter modifications.
- Landing target: one AprilTag fiducial marker, printed, measured, and fixed to the landing surface.
Software
Prepare the landing target
- First, select (or create) the tag that you want to use as a landing target. Say you decide to use tag id 00 of the 36h11 family. Print it, then measure and note down the actual tag size (the black part of the tag), which should be 0.144 m if printed on A4 paper.
- Fix the tag to a landing pad or to the ground, and make sure that it won't get blown away when the copter is descending.
The PDF for all tags of the 36h11 family can be found here. Other tag options can be found here.
3. How does it work?
How to perform accurate localization and navigation in ROS?
You can take a look at our past GSoC 2019 project for a complete explanation of the VIO integration part.
How to perform precision landing in ROS?
In a nutshell, the process includes:
- Obtaining the raw images from the downfacing camera.
- Applying the computer vision algorithm of choice to acquire the target.
- Performing the necessary calculations to produce the right kind of data for the flight stack, then sending it to the FCU.
- Experimenting to figure out the limits of the system: the min/max target detection range for a given target size, the min/max data rate for a stable landing and how it affects other parts of the system, tuning parameters to achieve the desired results, etc.
The bulk of this blog post is about how to realize steps 1-3 for one specific setup: the Intel RealSense T265 as the sensor of choice, AprilTag as the target detection algorithm, and ArduPilot as our flight stack.
More specifically, here’s the data flow for our ROS-based precision landing project:
| ID | Launch command | Description |
|----|----------------|-------------|
| 1 | `roslaunch realsense2_camera rs_t265.launch` | The camera driver node `realsense-ros`, which publishes raw images on the topics `/camera/fisheye1/image_raw` and `/camera/fisheye2/image_raw`. |
| 2 | `roslaunch vision_to_mavros t265_fisheye_undistort.launch` | A node that subscribes to the raw stereo images, undistorts and rectifies them, then publishes the rectified images as well as the corresponding `camera_info` topics. |
| 3 | `roslaunch apriltag_ros continuous_detection.launch` | ROS wrapper node for AprilTag; requires rectified image and `camera_info` topics. Detects the tag and publishes the tag's pose in the camera frame, given the correct settings in the `tags.yaml` file. |
| 4 | `roslaunch vision_to_mavros t265_downfacing_tf_to_mavros.launch` | 1. Transforms 6-DoF pose data from the T265 into `VISION_POSITION_ESTIMATE` data for MAVROS. 2. Transforms the landing tag's pose in the camera frame (output from `apriltag_ros`) into `LANDING_TARGET` data for MAVROS. |
| 5 | `roslaunch mavros apm.launch` | MAVROS node for communication between ROS and ArduPilot. |
4. Installation
First, a quick clarification: `apriltag_ros` (and `mavros`, if built from source) is built with `catkin build`, whereas `vision_to_mavros` (and `realsense-ros`, if built from source) is built with `catkin_make`. Thus, there are two separate catkin workspaces in the sections below: `catkin_ws` for `catkin_make` and `catkin_ws_build` for `catkin build`.
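Since nodes from both workspaces are launched together, both need to be on the ROS package path. A minimal sketch of one way to chain them in `~/.bashrc` (the `--extend` flag keeps the second setup file from resetting the environment; verify the behavior with your catkin version):

```bash
source ~/catkin_ws/devel/setup.bash
source ~/catkin_ws_build/devel/setup.bash --extend
```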
Install AprilTag 3 library and ROS wrapper
- Note that `apriltag_ros` requires a `catkin build` workspace, which must be kept separate from the `catkin_make` workspace used for the `realsense-ros` (if built from source) and `vision_to_mavros` packages.
- The commands below are adapted from the `apriltag_ros` repo. Always check the repo for the most up-to-date instructions. Currently, ROS `Kinetic` and `Melodic` are supported.
export ROS_DISTRO=kinetic # Set this to your distro, e.g. kinetic or melodic
source /opt/ros/$ROS_DISTRO/setup.bash # Source your ROS distro
mkdir -p ~/catkin_ws_build/src # Make a new workspace, change the name if you have a catkin_ws folder created with catkin_make
cd ~/catkin_ws_build/src # Navigate to the source space
git clone https://github.com/AprilRobotics/apriltag.git # Clone Apriltag library
git clone https://github.com/AprilRobotics/apriltag_ros.git # Clone Apriltag ROS wrapper
cd ~/catkin_ws_build # Navigate to the workspace
rosdep install --from-paths src --ignore-src -r -y # Install any missing packages
catkin build # Build all packages in the workspace (catkin_make_isolated will work also)
source devel/setup.bash # Make sure to source otherwise ros cannot find the nodes from this workspace
- The default installation will place the headers in `/usr/local/include` and the shared library in `/usr/local/lib`. The Python wrapper will also be installed if Python 3 is installed.
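As a quick sanity check that ROS can now see the wrapper (assuming the workspace has been sourced as above):

```bash
rospack find apriltag_ros   # should print the path to the cloned package
```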
Install the `vision_to_mavros` package
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
git clone https://github.com/hoangthien94/vision_to_mavros.git
cd ..
catkin_make
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc # Add source command to .bashrc, if you have not done so
source ~/.bashrc
Install `mavros` and establish a connection between the companion computer and ArduPilot
Follow the wiki page; things will be simpler if all necessary changes are saved in `apm.launch`. Otherwise, remember to append the exact params (such as `fcu_url`) to all subsequent launch commands for MAVROS.
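For example, if the FCU is connected over a serial telemetry port, the connection string can be passed like this (the device path and baud rate are illustrative; adjust them to your wiring):

```bash
roslaunch mavros apm.launch fcu_url:=/dev/ttyUSB0:921600
```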
5. Modification of parameters
After all necessary packages are installed, certain parameters need to be changed for things to work properly.
Change the camera calibration file of the `vision_to_mavros` package according to your specific device
These parameters are required to perform image rectification with the Kannala-Brandt distortion model, which is implemented in OpenCV as the "fisheye" module.
- Obtain the real params: with the T265 plugged into a computer that has `librealsense` installed, run `rs-enumerate-devices -c`, which prints a host of information under the heading `Stream Profiles supported by Tracking Module`. Example output of the command can be seen here.
- Then, modify the params `K1`, `D1`, `K2`, `D2`, `R`, `T` in the calibration file `catkin_ws/src/vision_to_mavros/cfg/t265.yaml` as follows:
| Params in `t265.yaml` | Description | Output obtained from `rs-enumerate-devices -c` |
|---|---|---|
| `K1` | Intrinsic matrix of left camera | Intrinsic of "Fisheye 1": `PPX`, `PPY`, `Fx`, `Fy` |
| `K2` | Intrinsic matrix of right camera | Intrinsic of "Fisheye 2": `PPX`, `PPY`, `Fx`, `Fy` |
| `D1` | Distortion coefficients of left camera | Intrinsic of "Fisheye 1": first 4 numbers of `Coeffs` |
| `D2` | Distortion coefficients of right camera | Intrinsic of "Fisheye 2": first 4 numbers of `Coeffs` |
| `R` | Rotation matrix from left camera to right camera | Extrinsic from "Fisheye 1" to "Fisheye 2": `Rotation Matrix` |
| `T` | Translation vector from left camera to right camera | Extrinsic from "Fisheye 1" to "Fisheye 2": `Translation Vector` |
with the camera matrix `K` composed from the intrinsics as `K = [Fx, 0, PPX; 0, Fy, PPY; 0, 0, 1]` (row-major).
Change the configuration of the `apriltag_ros` package according to the landing tag
- Find the file `apriltag_ros/apriltag_ros/config/tags.yaml`, add a standalone tag with the actual `id` and `size`, and set `name` to `landing_target`. For example, with the example A4 tag above we would have:
standalone_tags:
[
{id: 0, size: 0.144, name: landing_target}
]
The id and size are required by the AprilTag algorithm, and the name is the frame_id under which the tag's pose in the camera frame will be published on the `/tf` ROS topic.
- Next, modify the `camera_name` and `image_topic` params in the launch file `apriltag_ros/apriltag_ros/launch/continuous_detection.launch` as follows:
<arg name="camera_name" default="/camera/fisheye2/rect" />
<arg name="camera_frame" default="camera_fisheye2_optical_frame" />
<arg name="image_topic" default="image" />
Where do these values come from? Note that the outputs of the `t265_fisheye_undistort` node include:
- Rectified image topics: `/camera/fisheye1/rect/image` (left) and `/camera/fisheye2/rect/image` (right).
- Camera info topics: `/camera/fisheye1/rect/camera_info` (left) and `/camera/fisheye2/rect/camera_info` (right).
Since `apriltag_ros` subscribes to `$(arg camera_name)/$(arg image_topic)`, we split the image topic's name into two parts, and because we use the right camera (`fisheye2`), we end up with the params above. `camera_frame` can be anything, but the camera frame published by `realsense-ros` is named `camera_fisheye2_optical_frame` (the default value for the right camera), so we use it for consistency. This comes in handy for calculations involving `/tf` as well as for visualization in RViz.
Note: if you wish to use the left camera, simply find all instances of `fisheye2` and change them to `fisheye1` in the two launch files: `continuous_detection.launch` and `t265_downfacing_tf_to_mavros.launch`.
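A hypothetical one-liner for that swap (the `vision_to_mavros` launch file path is my assumption about the repo layout; review the changes before flying):

```bash
sed -i 's/fisheye2/fisheye1/g' \
    ~/catkin_ws_build/src/apriltag_ros/apriltag_ros/launch/continuous_detection.launch \
    ~/catkin_ws/src/vision_to_mavros/launch/t265_downfacing_tf_to_mavros.launch
```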
- Finally, make sure that:
- `precland_target_frame_id` in `t265_downfacing_tf_to_mavros.launch` matches the tag's name in `apriltag_ros/apriltag_ros/config/tags.yaml`, and
- `precland_camera_frame_id` in `t265_downfacing_tf_to_mavros.launch` matches `camera_frame` in `apriltag_ros/apriltag_ros/launch/continuous_detection.launch`.
Change the MAVROS settings to allow relaying `LANDING_TARGET` messages
Navigate to the mavros launch folder, edit the `apm_config.yaml` file, and change `listen_lt` from `false` to `true`.
roscd mavros/launch
sudo nano apm_config.yaml
# Find landing_target, change listen_lt to true
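The relevant section should then look roughly like this (an illustrative excerpt; the other keys in the `landing_target` block keep their defaults):

```yaml
landing_target:
  listen_lt: true
```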
ArduPilot params
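The linked wikis cover the VIO-related parameters. For the precision landing part specifically, the essential settings, per the ArduPilot precision landing wiki, are (a minimal set; tune the rest to your vehicle):

```
PLND_ENABLED = 1    # enable precision landing (requires a reboot)
PLND_TYPE    = 1    # the target is provided by a companion computer over MAVLink
```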
Once all of the params are set correctly, it's time for some ground tests.
6. Ground test
Let's take the system apart and test the components one by one before any real flights. You might want to run the following commands one at a time, making sure each produces the right results before moving on:
- `roslaunch realsense2_camera rs_t265.launch`: all the data from the tracking camera should be coming through, including the raw fisheye images.
- `roslaunch vision_to_mavros t265_fisheye_undistort.launch`: view the rectified image `/camera/fisheye2/rect/image` with `rqt_image_view` or RViz. Additionally, check that `/camera/fisheye2/rect/camera_info` is being published.
- `roslaunch apriltag_ros continuous_detection.launch`: view the detection image `tag_detections_image` with `rqt_image_view` or RViz. Move the copter so that the camera points at the landing tag; `/tag_detections` and `/tf` should then publish the detected tag's pose in the camera frame.
- `roslaunch vision_to_mavros t265_downfacing_tf_to_mavros.launch`: the ROS topics `/mavros/vision_pose/pose` and `/mavros/landing_target/raw` should be populated. The latter only receives data when the landing target is in the camera's field of view (and distinguishable, i.e. not too far away or too close).
- `roslaunch mavros apm.launch`: there should be a connection to the FCU, with messages popping up in the terminal.
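If you prefer verifying from the terminal, a few illustrative commands (topic and frame names as configured in section 5):

```bash
rostopic hz /camera/fisheye1/image_raw       # raw image rate from the driver
rostopic hz /camera/fisheye2/rect/image      # rectified image rate
rostopic echo /tag_detections                # tag pose whenever the tag is visible
rosrun tf tf_echo camera_fisheye2_optical_frame landing_target   # tag pose via /tf
rostopic echo /mavros/landing_target/raw     # data handed to the MAVROS plugin
```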
After launching all 5 nodes, hold the vehicle above the landing tag so that the camera can see the tag clearly. Then, check on the GCS:
Verify that ArduPilot is receiving `LANDING_TARGET` data
- Whenever the landing tag is in the camera's view and MAVROS is running correctly, view the MAVLink messages on the GCS (Mission Planner: `Ctrl+F` > `MAVLink Inspector`; QGC: `Widgets` > `MAVLink Inspector`). `LANDING_TARGET` should have updated values in the `angle_x`, `angle_y`, and `distance` fields (the rest are mostly ignored).
- Move the camera around and check that the values in all xyz directions are correct. In this project's default configuration (USB port to the right), the camera's x-axis points to the right, the y-axis points to the back, and the z-axis points downward (to the ground).
Verify that ArduPilot is receiving `VISION_POSE` data
- Similarly, check that the `VISION_POSITION_ESTIMATE` message is coming through to the flight controller.
Improve detection rate of AprilTag if it is too low
Ideally, the `LANDING_TARGET` message should arrive at 10 Hz or above for a smooth landing. If the output rate of AprilTag is lower than 10 Hz (check the rate with `rostopic hz /tag_detections`, or on the GCS over a USB connection), you can improve the detection rate by consulting the corresponding ROS wiki and adjusting the settings in `settings.yaml`, for example as sketched below.
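For instance, raising `tag_decimate` runs detection on a downscaled image, trading maximum detection range for speed. A sketch of the relevant knobs (parameter names as in `apriltag_ros`'s `config/settings.yaml`; defaults may differ between versions):

```yaml
tag_family:       'tag36h11'   # must match the printed tag family
tag_threads:      4            # use more CPU cores if available
tag_decimate:     2.0          # >1.0 detects on a downscaled image: faster, shorter range
tag_blur:         0.0
tag_refine_edges: 1
```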
7. Flight test
Once everything is confirmed to work properly, all of the individual launch files are grouped into `t265_precland.launch`. Thus, for the flight tests the only command needed is:
roslaunch vision_to_mavros t265_precland.launch
Thereafter, we can simply follow the ArduPilot precision landing with IR-Lock wiki for instructions on how to actuate the landing in-flight. Below is a short summary:
- For the downfacing configuration of the RealSense T265, there is a known problem of inconsistent yaw angles between runs. The temporary fix is to tilt the nose up a little when the script starts, so that the camera is not facing flat down. Once the FCU starts receiving vision messages, the vehicle can be put flat on the ground.
- Take off and hover above the target.
- When the target is detected (see the section above on verifying that ArduPilot is receiving `LANDING_TARGET` data), you can switch the vehicle to `LAND`.
- If everything is working properly, the copter should move towards and then land on the tag.
- As usual, be prepared to retake control if there are sudden unexpected movements (i.e. switch the mode back to Stabilize, AltHold, or Loiter).
- If all is well, some successful flight tests can be seen below:
8. Discussion and alternative approaches
For this blog post, almost every step has alternatives that should be considered for your particular situation.
Fiducial marker system
There are many other options out there, such as ArUco, which is already supported in this wiki. Note that ArUco is GPL-licensed, while AprilTag uses the BSD license. The latest version of AprilTag (3) also supports new and flexible layouts (circle, nested, etc.). You can find some comparisons in this paper.
Camera orientation
- It is implicitly assumed that the camera's x-axis is aligned with the vehicle's y-axis (to the right), while the camera's y-axis is aligned with the vehicle's -x axis (to the back). This is the same as pitching the camera down 90 degrees from facing forward; see the sketch after this list.
- If your camera is oriented differently, then a transformation is mandatory. For reference, here are the T265's coordinates (note the fisheye axes), whereas the FCU's local frame is X forward, Y right, Z down.
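In code, the default mapping above amounts to a simple axis swap. A minimal sketch (the struct and function names are mine, not from `vision_to_mavros`):

```cpp
// Default downfacing orientation (USB port to the vehicle's right):
// camera x -> body y (right), camera y -> body -x (back), camera z -> body z (down).
struct Vec3 { double x, y, z; };

Vec3 cameraToBody(const Vec3& c)
{
  return Vec3{ -c.y,   // body x (forward) = -camera y
                c.x,   // body y (right)   =  camera x
                c.z }; // body z (down)    =  camera z
}
```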
Image rectification for AprilTag detection
- The AprilTag algorithm assumes rectified image and `camera_info` topics as input. Since the T265's Kannala-Brandt distortion model (the so-called `fisheye` model in OpenCV) is not yet supported by the `image_proc` pipeline, we need to perform the rectification some other way. In this project, I wrote a node that takes inspiration from the sample script t265_stereo.py, but in C++ and ROS; a sketch of the core idea follows this list.
- Additionally, the left and right images are synced with an ApproximateTime policy to produce time-synchronized rectified stereo images, which might become important for other purposes, such as generating disparity maps.
- I would highly recommend checking out the `image_undistort` repo, which supports a number of other less popular models and a bunch of useful computer vision functions, though not the ones we specifically need for this project.
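For reference, here is a condensed sketch of that rectification step using OpenCV's fisheye module, with `K1`/`D1`/`K2`/`D2`/`R`/`T` as loaded from `t265.yaml` (section 5). This is my illustration of the idea, not the actual `t265_fisheye_undistort` node, which adds the ROS plumbing and time synchronization:

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/imgproc.hpp>

// Undistort and rectify one stereo pair. K1, D1, K2, D2, R, T are the
// calibration values from t265.yaml.
void rectifyPair(const cv::Mat& rawLeft, const cv::Mat& rawRight,
                 const cv::Mat& K1, const cv::Mat& D1,
                 const cv::Mat& K2, const cv::Mat& D2,
                 const cv::Mat& R,  const cv::Mat& T,
                 cv::Mat& rectLeft, cv::Mat& rectRight)
{
  const cv::Size size = rawLeft.size();  // 848x800 on the T265
  cv::Mat R1, R2, P1, P2, Q;
  // Rectification rotations and projection matrices for the
  // Kannala-Brandt ("fisheye") model.
  cv::fisheye::stereoRectify(K1, D1, K2, D2, size, R, T,
                             R1, R2, P1, P2, Q, cv::CALIB_ZERO_DISPARITY);
  // Per-pixel remap tables; a real node computes these once and
  // reuses them for every frame.
  cv::Mat lmap1, lmap2, rmap1, rmap2;
  cv::fisheye::initUndistortRectifyMap(K1, D1, R1, P1, size, CV_16SC2, lmap1, lmap2);
  cv::fisheye::initUndistortRectifyMap(K2, D2, R2, P2, size, CV_16SC2, rmap1, rmap2);
  cv::remap(rawLeft,  rectLeft,  lmap1, lmap2, cv::INTER_LINEAR);
  cv::remap(rawRight, rectRight, rmap1, rmap2, cv::INTER_LINEAR);
}
```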
How to populate the `LANDING_TARGET` data messages
- If the landing tag is detected in the image, precision landing can be performed following MAVLink's landing target protocol.
- As of this writing, the different flight stacks (PX4 and ArduPilot) support different kinds of input; see the MAVLink wiki for details. For ArduPilot, only the `LANDING_TARGET` fields relative to the captured image are supported (hence the ignored fields in the `LANDING_TARGET` message). A sketch of the conversion follows.
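One common way to derive those image-relative fields from the tag's position (x, y, z) in the camera frame, as output by `apriltag_ros`, is sketched below (a minimal illustration; the struct and function names are mine):

```cpp
#include <cmath>

// Tag position in the downfacing camera's frame, in meters.
struct TagPosition { double x, y, z; };

// The LANDING_TARGET fields ArduPilot consumes: angular offsets of the
// target from the image center (radians) and line-of-sight distance (meters).
struct LandingTargetFields { float angle_x, angle_y, distance; };

LandingTargetFields toLandingTarget(const TagPosition& p)
{
  LandingTargetFields lt;
  lt.angle_x  = static_cast<float>(std::atan2(p.x, p.z));  // offset along image x
  lt.angle_y  = static_cast<float>(std::atan2(p.y, p.z));  // offset along image y
  lt.distance = static_cast<float>(std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z));
  return lt;
}
```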
How to run the MAVROS landing target plugin?
There seems to be no documentation anywhere on how to use this plugin correctly, at least none that I could find. Hence, to the source code! Below is a brief rundown from my understanding of the code.
- First, take a look at the main source code of the `landing_target` plugin and the definition of the `LANDING_TARGET` message.
- All input params for this plugin can be set in this section of `apm_config.yaml` before launching MAVROS.
- The data input for the `landing_target` plugin can come in 1 of 3 forms, as seen in this part:
  i. Through listening to a `/tf` transform. Enabled by setting `listen_tf` to `true`.
  ii. Through subscribing to `/mavros/landing_target/pose` messages (of type `PoseStamped`). Enabled by setting both `listen_tf` and `listen_lt` to `false`.
  iii. Through subscribing to `/mavros/landing_target/raw` messages (of type `LANDING_TARGET`). Enabled by setting `listen_tf` to `false` and `listen_lt` to `true`.
- You can choose whichever input method works for you. However, note that for options i and ii (as of this writing) there are several ENU-NED frame transformation functions in the code that one needs to decipher in order to get the correct output for a given (presumably well-understood) input. Specifically, read this function to learn about the conversions, the related parameters, and what goes where.
- In this project, option iii was selected because it provides the most control over the final MAVLink messages to be sent, and the data we obtain from AprilTag is already in the required NED frame. Looky here. If you are working with PX4, however, the results might not be the same (see the point above).
9. Useful links
Happy flying!