Indoor Autonomous Navigation
UAV Based AprilTag Recognition
01
Platform

To operate effectively indoors, we need a small platform. We use a common 5" drone frame size with tri-blade propellers, which deliver the needed thrust from a smaller propeller area. To operate safely in an enclosed environment, we added propeller guards.

02
Optical Flow

Optical flow is a great way to get fast odometry estimates indoors, but it doesn't provide precise localization over longer time periods because the estimate drifts. It also only works well on textured floors.

03
AprilTags

AprilTags are the perfect pairing for optical flow: they provide very accurate localization over long time spans, although they are only available while the camera actually has a tag in view.

The drone is based on the ArduPilot flight control software running on a Pixracer flight controller and autopilot. The Pixracer uses a Kalman filter to fuse data from its onboard sensors. Alongside the common sensors (IMU and barometer), we add a Raspberry Pi that acts as another sensor, sending location data based on visual AprilTag positioning.

The Raspberry Pi connects to a Pi camera mounted on the bottom of the drone, and to the Pixracer via a serial port. The Raspberry Pi has Wi-Fi so that it can be controlled and programmed over SSH from a nearby computer.

We also experimented with GPS and optical flow systems to use in conjunction with the AprilTag localization. GPS doesn't work indoors, and the optical flow proved unreliable on untextured floors.

The core component of this project is the software on the Raspberry Pi, which collectively takes an image as input and outputs a MAVLink message providing position to the drone. The first part of this is a node that outputs a calibrated image from the camera, based on a pre-existing Ubiquity Robotics node.
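
Bringing that camera node up is a single command, along the lines of roslaunch raspicam_node camerav2_1280x960.launch (the exact launch file name varies between versions of the package); it publishes the image stream together with a camera_info topic carrying the calibration.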

Then, the ROS AprilTag library identifies any AprilTags the camera can see and establishes their position relative to the drone. We predetermine the positions of all AprilTags relative to the world using a tag 'bundle', and then apply a series of transformations to get the drone's position relative to the world.
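
To make the transform chain concrete, here is a minimal sketch of that composition with homogeneous 4x4 matrices (the function and frame names are illustrative, not our actual node code):

    import numpy as np

    def invert(T):
        # Invert a rigid-body transform (rotation R, translation t).
        R, t = T[:3, :3], T[:3, 3]
        Ti = np.eye(4)
        Ti[:3, :3] = R.T
        Ti[:3, 3] = -R.T @ t
        return Ti

    def drone_pose_in_world(T_world_tag, T_cam_tag, T_body_cam):
        # T_cam_tag: tag pose in the camera frame (from the AprilTag detector).
        # T_world_tag: tag pose in the world frame (known from the bundle layout).
        # T_body_cam: fixed mounting transform from the drone body to the camera.
        T_world_cam = T_world_tag @ invert(T_cam_tag)
        return T_world_cam @ invert(T_body_cam)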

Finally, this pose needs to be converted into a MAVLink vision position message. Another node takes care of this process, with the help of the MAVROS package, which translates the resulting ROS topic into a format the autopilot can read.
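
A minimal sketch of that bridge node, assuming the standard MAVROS vision_pose plugin topic (the input topic name here is hypothetical):

    import rospy
    from geometry_msgs.msg import PoseStamped

    if __name__ == '__main__':
        rospy.init_node('vision_pose_bridge')
        pub = rospy.Publisher('/mavros/vision_pose/pose', PoseStamped, queue_size=1)

        def forward(msg):
            # MAVROS transforms this ENU pose into the NED vision position
            # message the autopilot expects.
            pub.publish(msg)

        # '/drone_pose_world' is a placeholder name for our world-frame pose topic.
        rospy.Subscriber('/drone_pose_world', PoseStamped, forward)
        rospy.spin()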

The other major component of this project was setting up the autopilot to work with the vision position input. This is done by configuring the internal Kalman filter to fuse input from the AprilTag vision system with internal sensors such as the gyroscope and accelerometer. The vision system only provides an update rate of about 10 Hz. While this is enough for navigational positioning, the drone needs much faster sensor input to remain stable. Fortunately that is provided by the accelerometer and gyro, which maintain effective odometry between the vision system's slower absolute position updates.
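
On ArduPilot firmware that runs EKF3 (the default on recent versions), this fusion is selected through source parameters. A plausible sketch of the relevant settings, not necessarily our exact configuration:

    VISO_TYPE       1    (accept MAVLink vision position estimates)
    EK3_SRC1_POSXY  6    (horizontal position from ExternalNav)
    EK3_SRC1_POSZ   1    (altitude from the barometer)
    EK3_SRC1_YAW    1    (yaw from the compass)
    GPS_TYPE        0    (no GPS indoors)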

Blog 11.29.2022

We got AprilTag recognition working, a key feature for our project. As you can see in the video, when we show the camera one of the AprilTags we have configured the ROS node to recognize, it draws a boundary around the edges of the tag, labels the center with the tag ID number, and publishes the pose of the tag relative to the camera, which we can visualize as a vector in rviz.



We used the AprilTag ROS package to create the AprilTag-recognizing node. To set this up, we installed and built the AprilTag library, calibrated the camera, set up the camera node so it was compatible with the AprilTag ROS node, and supplied information about the AprilTags we want the system to recognize in the configuration file. We worked around an issue where the camera info was not published synchronously with the camera images, and parsed pose information from the /tag_detections topic to publish it to a new topic we could view in rviz.
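
The parsing step looks roughly like this, a sketch against the standard apriltag_ros message layout (the output topic name is made up for illustration):

    import rospy
    from geometry_msgs.msg import PoseStamped
    from apriltag_ros.msg import AprilTagDetectionArray

    def on_detections(msg):
        for det in msg.detections:
            # Each detection carries the tag (or bundle) pose in the camera
            # frame as a PoseWithCovarianceStamped.
            out = PoseStamped()
            out.header = det.pose.header
            out.pose = det.pose.pose.pose
            pub.publish(out)

    if __name__ == '__main__':
        rospy.init_node('tag_pose_extractor')
        pub = rospy.Publisher('/tag_pose', PoseStamped, queue_size=1)
        rospy.Subscriber('/tag_detections', AprilTagDetectionArray, on_detections)
        rospy.spin()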



In other news, we are still having difficulties flying the drone. The motor mapping was wrong, causing strange behaviour when we tried to fly. After fixing the motor mapping, we found that the optical flow sensor, which was meant to provide the drone's relative position, was not performing as we had hoped. We will have to rely on the other sensors (accelerometer and barometer), which do not provide as accurate a position estimate.

To combat the problem of position drift, we are going to use AprilTag localization to provide the drone's position when it is in guided mode rather than manual. To do this we will override the global positioning information with the position we compute relative to AprilTags placed around the room. Our plan going forward is to work more on the AprilTag localization and to figure out how to override the global position.

Blog 12.9.2022

We are successfully sending vision position data from the Raspberry Pi to the Pixracer flight controller! The Raspberry Pi camera publishes an image topic through the raspicam node, which the AprilTag ROS node can then interpret, and the result is finally translated into information that MAVROS can read. We can see this information being read by the flight controller, but we need to make sure the coordinate frame orientations are correct before flying. This is crucial: if the orientations are wrong, the drone will become unstable and crash.
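
The main pitfall here is that ROS and MAVROS work in ENU (east-north-up) coordinates while the autopilot works in NED (north-east-down). MAVROS performs the conversion, but every frame upstream has to be consistent for the result to make sense. For intuition, the position part of that conversion is just an axis swap:

    # Illustrative only: the ENU -> NED position remapping MAVROS applies.
    def enu_to_ned(east, north, up):
        return (north, east, -up)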

As we were checking frame orientations, we decided to re-orient the Pi camera to sit on the bottom of the drone and face down, allowing it to detect AprilTags on the floor. This decision was based on the fact that when the drone is flying, the camera will always point close to vertically downward. Changes in yaw will not cause the quadcopter to lose sight of an AprilTag, and small changes in position will keep the tag in view.

Our Team

This project is a collaboration of Olin College of Engineering students for Computational Robotics, Fall 2022.

Tigey

Robotics Engineering Student

Lilo

Computing Student

Address

1000 Olin Way, Needham, MA

Website

olin.edu
