Automotive Research Center

Navigation for the Twenty-first Century

March 15, 2021
Coastal Navigation With Dead Reckoning by J.S. Bond, image licensed under Creative Commons.

In 1707, the British Royal Navy lost four ships and as many as 2,000 sailors when the flagship, HMS Association, smashed into the rocks off the Isles of Scilly. The disaster was a result of navigating by dead reckoning: estimating the fleet’s position from its last known fix using compass heading, estimated speed, and elapsed time. The scope of the catastrophe highlighted the need to determine a ship’s position at sea accurately, spurring the race to determine longitude. More than 300 years later, the Army is exploring new technologies to modernize robotic ‘dead reckoning’ and help autonomous vehicles navigate unknown terrain.

Maani Ghaffari and his team are developing the two components necessary to revolutionize autonomous vehicle navigation — a fail-safe ability to sense and track the vehicle’s motion (proprioceptive observer) and a model to accept this data and create three-dimensional maps that can be used to navigate the surrounding landscape.

While still in the early stages of the project, the team is developing the mathematical algorithms for an odometer-like filter that tracks the vehicle’s motion at any time of day and in any weather. By fusing these proprioceptive measurements, the filter detects how the vehicle is moving and estimates how far, and in which direction, it has traveled.
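In spirit, such a filter builds on classical dead reckoning: integrate speed and turn rate over time to keep track of where the vehicle has gone. The minimal sketch below illustrates that idea with a planar odometry loop driven by wheel-encoder speed and gyroscope yaw rate; the names and inputs are illustrative assumptions, not the team’s actual design.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose2D:
    x: float = 0.0        # meters east of the starting point
    y: float = 0.0        # meters north of the starting point
    heading: float = 0.0  # radians, counterclockwise from east

def dead_reckon(pose: Pose2D, speed: float, yaw_rate: float, dt: float) -> Pose2D:
    """Advance the pose by one time step using only proprioceptive inputs.

    speed:    forward speed from wheel encoders (m/s)
    yaw_rate: turn rate from a gyroscope (rad/s)
    dt:       length of the time step (s)
    """
    heading = pose.heading + yaw_rate * dt
    return Pose2D(
        x=pose.x + speed * dt * math.cos(heading),
        y=pose.y + speed * dt * math.sin(heading),
        heading=heading,
    )

# Example: drive at 2 m/s while turning gently for 10 seconds.
pose = Pose2D()
for _ in range(100):
    pose = dead_reckon(pose, speed=2.0, yaw_rate=0.05, dt=0.1)
print(f"estimated position: ({pose.x:.1f}, {pose.y:.1f}) m, heading {pose.heading:.2f} rad")
```

Because each step compounds sensor error, the estimate drifts over time, which is why a practical filter must also track its own uncertainty and stay reliable in darkness, rain, or dust.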

“Off-road vehicular navigation is not a problem for current self-driving cars, because they operate in a structured, urban environment, by navigating with prebuilt maps,” said Ghaffari, assistant professor in the Department of Naval Architecture and Marine Engineering and the Robotics Institute at the University of Michigan and lead investigator on the project. “If you remove the prior information, the problem becomes much harder. This project plans to develop onboard sensors and computers for [autonomous] vehicles so they can [navigate unstructured environments even if] disconnected from the network.”

In addition, the team will develop a multi-layer mapping model that packs several kinds of information into a single map, so the autonomous vehicle can multitask, from identifying its location to navigating the landscape to interacting with a human handler.

“Modeling traversability of the off-road environment is a great challenge,” said Ghaffari. “If we can track vehicles accurately, we can build a local map around where it travels to understand the dynamics of the environment. This provides the vehicle with situational awareness so it can make locally reliable decisions.”
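One common way to organize such a local map, offered here only as a generic sketch rather than a description of the team’s design, is a vehicle-centered grid whose cells carry several layers of information, such as elevation, a semantic label, and a traversability score. The layer names and parameters below are assumptions chosen for illustration.

```python
import numpy as np

class LayeredGridMap:
    """A small vehicle-centered grid map with named layers (illustrative only)."""

    def __init__(self, size_m: float = 40.0, resolution_m: float = 0.5):
        n = int(size_m / resolution_m)
        self.resolution = resolution_m
        self.origin = size_m / 2.0  # the map is centered on the vehicle
        # Each layer is an n-by-n array; NaN or -1 marks "not yet observed".
        self.layers = {
            "elevation": np.full((n, n), np.nan),        # meters
            "semantic_class": np.full((n, n), -1, int),  # e.g. 0 = grass, 1 = rock
            "traversability": np.full((n, n), np.nan),   # 0 (blocked) to 1 (easy)
        }

    def _index(self, x: float, y: float):
        """Convert vehicle-frame coordinates (meters) to grid indices."""
        return (int((x + self.origin) / self.resolution),
                int((y + self.origin) / self.resolution))

    def update(self, x: float, y: float, layer: str, value) -> None:
        i, j = self._index(x, y)
        self.layers[layer][i, j] = value

    def query(self, x: float, y: float, layer: str):
        i, j = self._index(x, y)
        return self.layers[layer][i, j]

# Example: record one observed patch of ground 3 meters ahead of the vehicle.
local_map = LayeredGridMap()
local_map.update(3.0, 0.0, "elevation", 0.2)
local_map.update(3.0, 0.0, "traversability", 0.8)
print(local_map.query(3.0, 0.0, "traversability"))
```

Keeping the layers in one structure is what lets the same map serve several tasks at once: localization can read the geometry, path planning can read the traversability scores, and a human handler can be shown the semantic labels.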

Finally, the team will develop a deep neural network to capture what is happening in a sequence, like frames in a movie, and pair these frames with the traversability measurements. They plan to evaluate the approach first on available datasets and simulated data; the final stages of the project will test it in a real-world setting.
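As a loose sketch of what pairing frames with traversability measurements might look like in code, the toy model below summarizes each frame with a small convolutional encoder, carries context across the sequence with a recurrent layer, and predicts a traversability score for the clip. The architecture and dimensions are assumptions for illustration, not the network the team is building.

```python
import torch
import torch.nn as nn

class FrameSequenceTraversability(nn.Module):
    """Toy model: per-frame CNN features -> GRU over time -> traversability score."""

    def __init__(self, hidden: int = 64):
        super().__init__()
        # Encode each RGB frame into a small feature vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),  # -> (batch * time, 32)
        )
        # Carry information across frames, like watching a short movie clip.
        self.temporal = nn.GRU(input_size=32, hidden_size=hidden, batch_first=True)
        # Regress a single traversability score in [0, 1] for the clip.
        self.head = nn.Sequential(nn.Linear(hidden, 1), nn.Sigmoid())

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, channels, height, width)
        b, t, c, h, w = frames.shape
        features = self.encoder(frames.reshape(b * t, c, h, w)).reshape(b, t, -1)
        _, last_hidden = self.temporal(features)
        return self.head(last_hidden[-1])  # shape: (batch, 1)

# Example: a batch of 2 clips, 8 frames each, 64x64 pixels.
model = FrameSequenceTraversability()
clips = torch.randn(2, 8, 3, 64, 64)
print(model(clips).shape)  # torch.Size([2, 1])
```

Trained against measured traversability, a model of this general shape could be evaluated on recorded datasets and simulated clips before moving to real-world tests.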

Although the project was funded to develop military applications, its products have civilian uses as well.

“The project is generic to any robot that wants to operate in an unstructured environment, ranging from search and rescue to missions on Mars,” said Ghaffari. “The nature of the platform may change but the algorithms and math are transferrable.”

Ghaffari was joined on the project by Kira Barton at the University of Michigan, Paramsothy Jayakumar at the U.S. Army Ground Vehicle Systems Center, and Andrew Capodieci at Neya Systems. The project, titled “A robust semantic-aware perception system using proprioception, geometry, and semantics in unstructured and unknown environments,” received funding from the Ground Vehicle Systems Center through the Automotive Research Center at the University of Michigan.

--

Stacy W. Kish