Human-Autonomy Interaction
Annual Plan
Sensory Integration in Simulated and Remote Piloting of Vehicles
Project Team
Government
Harry Zywiol, U.S. RDECOM-TARDEC
Kaleb McDowell, U.S. Army Research Lab.
Industry
Micah Steele, John Deere Corp.
Student
Kevin Rider, University of Michigan
Project Summary
Our objective is to determine the features of a vehicle’s driver interface that are most critical to the development of “overlearned” driving skill, that is, sensory-motor behavior that has become so automatic that secondary tasks can be undertaken without degrading driving performance. We will determine the relative contribution of visual, haptic, and ride-motion cues to driving skill using a dual-task experimental paradigm. We will validate models of sensory integration and of open- and closed-loop motor behavior operating under limited cognitive resources. Applications of the model include remote piloting of vehicles as well as traditional driving under supplemental cognitive and decision-making loads.
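As an illustrative sketch only (not the project's actual model), one common way to formalize sensory integration is maximum-likelihood cue fusion, in which each channel's estimate is weighted by its inverse variance. The noise levels in CUE_SIGMA and the function fuse_cues below are hypothetical values chosen for the example, not project data:

```python
import numpy as np

# Assumed noise levels (standard deviations, m) for each channel's estimate
# of lateral lane position; illustrative values only, not project data.
CUE_SIGMA = {"visual": 0.10, "haptic": 0.25, "ride_motion": 0.20}

def fuse_cues(readings, sigma=CUE_SIGMA):
    """Inverse-variance-weighted (maximum-likelihood) cue fusion.

    Returns the fused estimate and its variance; the fused variance is
    never larger than that of the single most reliable cue.
    """
    w = {c: 1.0 / sigma[c] ** 2 for c in readings}
    total = sum(w.values())
    est = sum(w[c] * readings[c] for c in readings) / total
    return est, 1.0 / total

# Example: a true deviation of 0.5 m observed through all three channels.
rng = np.random.default_rng(1)
true_y = 0.5
readings = {c: true_y + rng.normal(0.0, s) for c, s in CUE_SIGMA.items()}
est, var = fuse_cues(readings)
print(f"fused estimate: {est:.3f} m, fused std: {var ** 0.5:.3f} m")
```

Under this formulation, the relative contribution of each cue to driving performance falls directly out of the fusion weights, which is one way the dual-task data could be used to fit and validate a sensory-integration model.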
A new driver model that incorporates sensory integration will quantify the relative value of multiple sensory channels to driving performance under single and dual motor/cognitive task demands. This will have direct application to remote piloting of unmanned vehicles, which relies on visual feedback from on-board cameras and lacks haptic and ride-motion cues. We will assess the impact of these missing display channels and explore sensory substitution. The results of this work will complement the development and testing of the ARC-supported Virtual Driver, which integrates cognitive and physical modeling.
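To illustrate how the impact of missing channels might be assessed, the self-contained toy simulation below closes the loop with a simple proportional steering law driven by the fused percept. The noise levels (SIGMA), disturbance magnitude, gain, and the function rms_tracking_error are all assumptions made for the sketch; the vision-only condition stands in for remote piloting through an on-board camera:

```python
import numpy as np

# Hypothetical cue noise levels (m); the vision-only case mimics remote
# piloting via an on-board camera with no haptic or ride-motion feedback.
SIGMA = {"visual": 0.10, "haptic": 0.25, "ride_motion": 0.20}

def rms_tracking_error(cues, n_steps=2000, gain=0.3, seed=0):
    """RMS lane deviation under proportional steering driven by an
    inverse-variance-weighted fusion of the available cues."""
    rng = np.random.default_rng(seed)
    w = {c: 1.0 / SIGMA[c] ** 2 for c in cues}
    total_w = sum(w.values())
    y, errs = 0.0, []
    for _ in range(n_steps):
        y += rng.normal(0.0, 0.05)  # road/vehicle disturbance
        # Fuse noisy readings of the true deviation from the available cues.
        y_hat = sum(w[c] * (y + rng.normal(0.0, SIGMA[c])) for c in cues) / total_w
        y -= gain * y_hat           # corrective steering from fused percept
        errs.append(y)
    return float(np.sqrt(np.mean(np.square(errs))))

print(f"all cues   : {rms_tracking_error(['visual', 'haptic', 'ride_motion']):.3f} m")
print(f"vision only: {rms_tracking_error(['visual']):.3f} m")
```

In this toy setting, dropping the haptic and ride-motion channels raises the fused estimate's variance and hence the tracking error, which is the kind of channel-removal comparison the assessment of missing display channels would make with experimentally fitted parameters.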