
Vehicle Controls & Behaviors

Annual Plan

A Robust Semantic-aware Perception System Using Proprioception, Geometry, and Semantics in Unstructured and Unknown Environments

Project Team

Principal Investigator

Maani Ghaffari, University of Michigan
Kira Barton, University of Michigan

Government

Paramsothy Jayakumar, U.S. Army GVSC

Industry

Andrew Capodieci, Neya Systems

Student

Joey Wilson, University of Michigan

Project Summary

Project began in 2021.

A real-time, robust perception system is a precondition for high-speed autonomous off-road mobility, real-world autonomy, and operation in unstructured and uncertain environments. No robust perception system capable of supporting complex dynamic off-road missions exists today. Without reliable proprioceptive dead reckoning, the vehicle can become lost and unrecoverable in perceptually degraded situations, e.g., completely dark, bright, uniform, or foggy scenes. Without a dense, dynamic semantic map, the vehicle’s scene understanding is insufficient for informed decision-making. Furthermore, autonomous off-road mobility at high speed requires high-frequency, resource-constrained state estimation algorithms that run onboard. The investigators treat proprioception, geometry, and semantics as independent bases that must be considered simultaneously within perception algorithms.

Figure: The multimodal nature of proprioception and exteroception.

An autonomous off-road vehicle cannot rely on high-definition maps and structured road networks as commercial self-driving vehicles do; in an unknown environment, its behavior is dictated by its perception capabilities. This work addresses two core necessities by developing:

  1. a fail-safe, proprioceptive, high-frequency state estimator using invariant observer design theory for dead reckoning over long trajectories (i.e., 1 km and above; see the sketch following this list);
  2. a multilayer semantic mapping framework that models both the geometry and semantics of a complex dynamic scene in 3D and runs onboard.
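To make the first item concrete, the sketch below shows plain IMU strapdown dead reckoning on the state the observer propagates between exteroceptive updates: orientation, velocity, and position. This is a minimal illustration under our own assumptions, not the project's implementation; all function and variable names are ours, and a full invariant observer would additionally carry IMU bias states and propagate covariance on the matrix Lie group SE_2(3).

```python
# Minimal IMU dead-reckoning sketch (illustrative names, not project code).
import numpy as np

def skew(w):
    """Hat operator: map a 3-vector to its skew-symmetric matrix."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def so3_exp(phi):
    """Exponential map so(3) -> SO(3) via Rodrigues' formula."""
    theta = np.linalg.norm(phi)
    if theta < 1e-9:
        return np.eye(3) + skew(phi)  # first-order approximation near zero
    a = skew(phi / theta)
    return np.eye(3) + np.sin(theta) * a + (1.0 - np.cos(theta)) * (a @ a)

def imu_propagate(R, v, p, gyro, accel, dt, g=np.array([0.0, 0.0, -9.81])):
    """One strapdown step: integrate angular rate, specific force, and gravity."""
    world_accel = R @ accel + g          # rotate body-frame specific force
    R_next = R @ so3_exp(gyro * dt)      # orientation update on SO(3)
    v_next = v + world_accel * dt        # velocity update
    p_next = p + v * dt + 0.5 * world_accel * dt**2  # position update
    return R_next, v_next, p_next
```

Because the estimator is purely proprioceptive, small gyro and accelerometer errors integrate into drift; bounding that drift over kilometer-scale trajectories is exactly the performance limit probed by Q1 below.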

This work addresses the following fundamental research questions:

Q1: What is the performance limit of onboard proprioceptive observers in handling drift in the absence of exteroceptive measurements or in perceptually degraded situations?

Q2: How can 3D scene flow estimated from stereo camera or LiDAR data be consistently incorporated into a dense semantic map in real time?
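One way to realize the dense semantic map in Q2, sketched here in the spirit of the closed-form Bayesian inference approach in the publications below: each voxel stores Dirichlet concentration parameters over semantic classes, and every labeled 3D point adds kernel-weighted pseudo-counts to nearby voxels, so the class posterior stays available in closed form. The kernel choice, parameters, and all names here are our assumptions, not the project's code.

```python
# Hedged sketch of closed-form Bayesian semantic voxel mapping (illustrative).
import numpy as np

def sparse_kernel(d, ell):
    """Compactly supported kernel; zero for distances at or beyond ell."""
    if d >= ell:
        return 0.0
    r = d / ell
    return ((2.0 + np.cos(2 * np.pi * r)) / 3.0) * (1.0 - r) \
        + np.sin(2 * np.pi * r) / (2 * np.pi)

class SemanticVoxelMap:
    def __init__(self, num_classes, voxel_size=0.2, ell=0.5, prior=1e-3):
        self.C = num_classes
        self.vs = voxel_size
        self.ell = ell
        self.prior = prior
        self.alpha = {}  # voxel index -> Dirichlet concentrations, shape (C,)

    def update(self, points, labels):
        """Add kernel-weighted pseudo-counts from labeled 3D points."""
        for pt, c in zip(points, labels):
            base = np.floor(pt / self.vs).astype(int)
            for off in np.ndindex(3, 3, 3):  # 3x3x3 voxel neighborhood
                idx = tuple(int(i) for i in base + np.array(off) - 1)
                center = (np.array(idx) + 0.5) * self.vs
                w = sparse_kernel(np.linalg.norm(pt - center), self.ell)
                if w > 0.0:
                    a = self.alpha.setdefault(idx, np.full(self.C, self.prior))
                    a[c] += w  # closed-form Dirichlet update

    def posterior(self, idx):
        """Posterior mean over classes for one voxel, or None if unseen."""
        a = self.alpha.get(idx)
        return None if a is None else a / a.sum()
```

Handling dynamics would additionally propagate or decay counts for moving objects using the estimated 3D scene flow; the sketch above covers only the static update.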

Publications:

  • Wilson, J., Song, J., Fu, Y., Zhang, A., Capodieci, A., Jayakumar, P., Barton, K., and Ghaffari, M., “MotionSC: Data Set and Network for Real-Time Semantic Mapping in Dynamic Environments,” IEEE Robotics and Automation Letters, vol. 7, no. 3, pp. 8439-8446, July 2022, doi: 10.1109/LRA.2022.3188435.
  • Unnikrishnan, A., Wilson, J., Gan, L., Capodieci, A., Jayakumar, P., Barton, K., and Ghaffari, M., “Dynamic Semantic Occupancy Mapping Using 3D Scene Flow and Closed-Form Bayesian Inference,” IEEE Access, vol. 10, pp. 97954-97970, 2022, doi: 10.1109/ACCESS.2022.3205329.
  • Wilson, J., Fu, Y., Zhang, A., Song, J., Capodieci, A., Jayakumar, P., Barton, K., and Ghaffari, M., “Convolutional Bayesian Kernel Inference for 3D Semantic Mapping,” 2023 IEEE International Conference on Robotics and Automation (ICRA), London, United Kingdom, 2023, pp. 8364-8370, doi: 10.1109/ICRA48891.2023.10161360.
  • Wilson, J., Fu, Y., Friesen, J., Ewen, P., Capodieci, A., Jayakumar, P., Barton, K., and Ghaffari, M., “ConvBKI: Real-Time Probabilistic Semantic Mapping with Quantifiable Uncertainty,” in review at IEEE Transactions on Robotics.
