Automotive Research Center

Human-Autonomy Interaction


Language Communication and Collaboration with Autonomous Vehicles Under Unexpected Situations

Project Team

Principal Investigator

Joyce Chai, University of Michigan


Chris Mikulski, Gregory Hartman, U.S. Army GVSC

Matthew Marge, Felix Gervits, U.S. Army Research Lab.


Andy Dallas, Soar Technology, Inc.


Ziqiao Ma, Cristian-Paul Bara, Ben VanDerPloeg, Owen Huang, Elina Kim, University of Michigan

Project Summary

Project began in 2021.

In a human-autonomy team, particularly one involving autonomous vehicles, the highly dynamic environment often leads to unexpected situations where pre-trained models are inadequate or unreliable. In these situations, the only resource immediately available to the vehicle is often its human operator. This raises an important question: how can humans and vehicles collaborate to jointly handle these unexpected situations? Because human operators are often not programmers who can modify code as unpredicted situations arise in the field, approaches that enable natural communication and collaboration between humans and autonomy become critical.

Although recent years have seen a growing body of work on natural language communication with robots, little of it addresses unexpected or adversarial situations. This work targets this under-explored area, focusing in particular on collaboration with unmanned autonomous vehicles (UAVs).

The objective of this work is to empower autonomous vehicles with the ability to harness human knowledge and expertise and to enable natural language communication and collaboration in tackling unexpected situations. We seek to understand unique characteristics of human language and dialogue behaviors in collaborative exception handling and their implications in computational models for enabling language-based collaboration between humans and UAVs. Through this exploratory research, we aim to address the following fundamental questions:

  • What are some common exceptions that can be handled through a collaborative effort between humans and UAVs?
  • What are the characteristics of language behaviors and dialogue discourse in collaborative exception handling?
  • How can natural language be connected to UAVs’ internal representations to facilitate communication?
  • How can discourse and dialogue be modeled so that humans and UAVs reach a common understanding of exceptions and jointly resolve them?

Publications

  • DOROTHIE: Spoken Dialogue for Handling Unexpected Situations in Interactive Autonomous Driving Agents. Ziqiao Ma, Ben VanDerPloeg, Cristian-Paul Bara, Yidong Huang, Eui-In Kim, Felix Gervits*, Matthew Marge*, Joyce Chai. Findings of EMNLP 2022.