A cascading, dual-frame, motion-tracking estimator for velocity-denied mobile robots

Case ID: 20/0031

Description


A novel state estimation algorithm that estimates the position and velocity of any dynamic system in motion using only low-cost GNSS and IMU sensors.

 


Background

To traverse an environment, autonomous mobile robots require a robust navigation system that computes the position and velocity of the robot in real time. A typical mobile robot navigation system employs sensors and a computer-implemented algorithm that combines the sensor data with known robot dynamics to compute the best available estimate of position and velocity; this algorithmic process is often called state estimation or sensor fusion. Position and acceleration can be measured cheaply using low-cost global navigation satellite system (GNSS) and inertial measurement unit (IMU) sensor modules; however, linear velocity sensors such as wheel odometers, airspeed sensors, optical flow sensors, or Doppler velocity logs are only situationally applicable and/or cost prohibitive for many robotic applications. Existing mobile robot state estimation algorithms require measurements of position, velocity, and acceleration for mathematical convergence, which is incompatible with this low-cost sensor system. The state estimation algorithm described in this invention can convergently fuse position and acceleration measurements provided by a low-cost GNSS+IMU sensor system without the need for a linear velocity sensor, and it provides a framework to elegantly fuse information from any other sensor sources present in the system.
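
For context, the sketch below is not the patented estimator; it is a generic, textbook constant-acceleration Kalman filter on a single axis in which only position (GNSS) and acceleration (IMU) are measured, so linear velocity is recovered purely as a hidden state. The time step, noise intensities, and initial covariance are illustrative assumptions, not parameters from the disclosure.

    import numpy as np

    # Single-axis state [position, velocity, acceleration]; velocity is never
    # measured directly and is recovered only through the filter's cross terms.
    dt = 0.01                                # assumed IMU update period (s)
    F = np.array([[1.0, dt, 0.5 * dt**2],    # constant-acceleration motion model
                  [0.0, 1.0, dt],
                  [0.0, 0.0, 1.0]])
    G = np.array([[dt**3 / 6.0], [dt**2 / 2.0], [dt]])
    Q = 1.0 * (G @ G.T)                      # process noise driven by assumed jerk
    H = np.array([[1.0, 0.0, 0.0],           # row 0: GNSS position
                  [0.0, 0.0, 1.0]])          # row 1: IMU acceleration
    R = np.diag([2.5**2, 0.05**2])           # assumed GNSS / IMU noise variances

    x = np.zeros((3, 1))                     # velocity starts unknown
    P = np.eye(3) * 10.0

    def step(x, P, z):
        # Predict with the motion model, then correct with [position, acceleration].
        x, P = F @ x, F @ P @ F.T + Q
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(3) - K @ H) @ P
        return x, P

    # Example update: GNSS reports 12.3 m, IMU reports 0.4 m/s^2.
    x, P = step(x, P, np.array([[12.3], [0.4]]))
    print("hidden velocity estimate:", float(x[1, 0]))

With only position and acceleration in the measurement model, the velocity estimate is pulled toward consistency through the correlations the motion model builds into the covariance P.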

 

Technology Overview

The goal of the novel state estimation algorithm described in this invention is to estimate robot position and velocity given only position and acceleration sensor information. To accomplish this, the algorithm introduces four novel techniques:

•       A discrete, delayed-time interpretation of the robot body frame. Enables the state estimator to fuse sensor data referenced to the map and/or robot body frames.

•       Adaptive covariance profiling. Enables the state estimator to accommodate the large difference in refresh rates between the GNSS and IMU sensors (an illustrative sketch follows this list).

•       Unobserved linear velocity estimation. Enables the state estimator to extract the hidden/unobserved linear velocity by analyzing data from the measured linear position and acceleration states.

•       The cascading, dual-frame, motion-tracking estimator. This framework enables the state estimator to minimize error due to unnecessary frame transformations. Additionally, the proposed motion-tracking estimation model can be applied to any moving system, independent of vehicle mass or environmental dynamics.
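
The exact covariance profile used by the invention is not disclosed here. As one plausible reading of the adaptive covariance profiling item above, the sketch below inflates the GNSS position measurement variance with the age of the last fix, so that high-rate IMU updates between the much slower GNSS fixes are weighted accordingly; the growth model and constants are assumptions for illustration only.

    # Illustrative only: grow the GNSS position variance with fix age so a filter
    # trusts stale position data less between ~1 Hz GNSS fixes and ~100 Hz IMU updates.
    def gnss_position_variance(base_var: float, fix_age_s: float,
                               drift_rate: float = 0.5) -> float:
        """Return an inflated position variance (m^2) for a fix that is fix_age_s old."""
        return base_var + drift_rate * fix_age_s**2

    # Example: a 2.5 m (1-sigma) receiver, 0.8 s after its last fix.
    print(gnss_position_variance(2.5**2, 0.8))   # ~6.57 m^2

The returned variance would simply replace the fixed GNSS entry in a filter's measurement covariance (for example, the R matrix in the earlier sketch).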

 

In real-world testing using the low-cost, off-the-shelf GNSS and IMU sensors shown in Figure 1, the proposed state estimation algorithm maintains a dead-reckoning pose accuracy within 1 m of the post-interpolated pose measurements and a hidden linear velocity accuracy within ±1 m/s.
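
As a hedged illustration of how an accuracy figure of this kind can be computed, the sketch below linearly interpolates the reference pose measurements onto the estimator's timestamps ("post-interpolated") and compares them against the dead-reckoned output; the numbers are placeholders, not the published test data.

    import numpy as np

    t_meas = np.array([0.0, 1.0, 2.0, 3.0])    # reference measurement times (s)
    p_meas = np.array([0.0, 1.1, 2.3, 3.2])    # measured positions (m)

    t_est = np.linspace(0.0, 3.0, 31)          # estimator timestamps (100 ms)
    p_est = 1.05 * t_est                       # placeholder estimator output

    p_ref = np.interp(t_est, t_meas, p_meas)   # post-interpolated reference
    print("max position error: %.2f m (target < 1 m)" % np.abs(p_est - p_ref).max())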

 

 

Figure 1: Example low-cost IMU (left) and GNSS (right) sensors used to evaluate the state estimator.

 

 

Features

This algorithm is applicable to cost-conscious robots, requiring only low-cost sensors.

The algorithm works for any moving system, regardless of vehicle dynamics (from smartphones to self-driving cars).

The algorithm was developed to elegantly integrate any additional sensor information.

 

Potential Benefits

This novel state estimation algorithm can benefit any system requiring positional navigation. This includes autonomous and unmanned vehicles, robotics, personal navigation systems, smartphone tracking applications, body-tracking devices, etc.

 

IP Status

Utility patent application filed

 

Seeking

Development partner

Commercial partner

Licensing

University spin out

Investment

 

Key Words

State estimation

Sensor fusion

Mobile robotics

Navigation

Kalman filter

Autonomous systems

Unmanned systems

 

Patent Information:
Inventors
Brennan Yamamoto
A. Zachary Trimble

For information, contact:
Rafael Gacel-Sinclair
Technology Licensing Associate
University of Hawaii
gacel@hawaii.edu