Automotive Research Center

Human-Autonomy Interaction

Annual Plan

Virtual Experimentation for Soldier Evaluation of Autonomous and Non-Autonomous Technologies Using Multi-User Immersive Gaming Environments

Project Summary

Principal Investigators

  • Wing-Yue Geoffrey Louie, Oakland University
  • Jia Li, Shadi Alawneh, Osamah Rawashdeh, Khalid Mirza (Co-PIs), Oakland University

Students

  • Sean Dallas, Motaz Abuhijleh, Oakland University

Government

  • Mark Brudnak, US Army GVSC

Industry

  • Heiko Wenczel, Epic Games
  • David Bouwkamp, FAAC

Project began Q4 2022.

Autonomous ground vehicles, robots, and other enabling technologies currently being explored by the military are complex systems with a range of interacting heterogeneous sensors, actuators, algorithms, controls, and interfaces in addition to traditional automotive components. This high level of complexity is not limited to the enabling technology itself; it also extends to the context in which these technologies are used and evaluated. Military engagements consist of (1) large-scale interactions among 80-100 soldiers at different levels of organization, each with its own tasks toward an overall objective; (2) soldiers interacting with a heterogeneous set of technologies (e.g., UAVs, manned/unmanned ground vehicles, robotic combat vehicles [1], [2]) by either directly controlling the technology or being indirectly affected by it; and (3) large spaces (~5 km) spanning a variety of environments with varied terrain and weather.

Due to the complexity and scale required to test new enabling technologies, traditional development follows a sequential design-develop-test waterfall model that carries notable limitations. First, end-users are not introduced to developing technologies until late in the project's lifecycle, which poses significant risks of additional cost and development time if unexpected or counterproductive human-robot/autonomy interactions emerge. Second, experimentation with physical prototypes has inherent limitations that hinder the depth and pace of producing actionable insight: restrictions on the capacity of human researchers to visually spectate all soldier-to-robot, soldier-to-autonomy, and soldier-to-soldier interactions; the quantity of data that can feasibly be collected at the company and crew level in physical contexts; and the time needed to prepare and analyze collected data.

A promising direction for addressing these challenges is the adoption of immersive gaming environments to introduce end-users to enabling technologies earlier. This research project proposes a new paradigm for experimentation on ground vehicles, robots, and other enabling technologies that uses immersive gaming to (1) reduce the time needed to develop virtual prototypes and scenarios reflective of typical military engagements, (2) involve soldiers earlier and more frequently in the assessment of developing technologies, and (3) surpass the inherent limitations of physical-world data collection and analysis, unlocking entirely new capabilities for evaluating human-robot/autonomy interactions.

This project focuses on developing a foundational platform for investigating large-scale human-technology interactions via virtual experiments in video game environments. We will develop an all-in-one virtual prototyping and experimentation platform that supports the rapid design, development, implementation, and analysis of human-technology interactions and experiments within video game environments. In parallel, Dr. Kessentini's complementary project focuses on rapidly configuring and testing new crew-interface concepts in augmented- and virtual-reality frameworks and on enabling user sentiment analysis during virtual experiments, aligning sentiments with real-time interaction data.
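
As a purely illustrative sketch of the kind of interaction data such a platform might capture during a virtual experiment (not the project's actual implementation), the Python snippet below logs timestamped soldier-to-technology interaction events for later analysis. The class names, fields, and interaction types (InteractionEvent, ExperimentLogger, "teleoperate") are hypothetical assumptions.

import json
import time
from dataclasses import dataclass, asdict, field
from typing import List, Optional


@dataclass
class InteractionEvent:
    """One soldier-to-technology interaction observed during a virtual experiment (hypothetical schema)."""
    timestamp: float          # seconds since experiment start
    soldier_id: str           # participant identifier
    technology_id: str        # e.g., a UGV, UAV, or robotic combat vehicle instance
    interaction_type: str     # e.g., "teleoperate", "monitor", "handoff"
    payload: dict = field(default_factory=dict)  # free-form detail (pose, command, etc.)


class ExperimentLogger:
    """Collects interaction events from connected game clients and writes them as JSON lines."""

    def __init__(self, start_time: Optional[float] = None):
        self.start_time = start_time if start_time is not None else time.time()
        self.events: List[InteractionEvent] = []

    def record(self, soldier_id: str, technology_id: str, interaction_type: str, **payload) -> None:
        self.events.append(InteractionEvent(
            timestamp=time.time() - self.start_time,
            soldier_id=soldier_id,
            technology_id=technology_id,
            interaction_type=interaction_type,
            payload=payload,
        ))

    def save(self, path: str) -> None:
        with open(path, "w") as f:
            for event in sorted(self.events, key=lambda e: e.timestamp):
                f.write(json.dumps(asdict(event)) + "\n")


# Example: log a crew member tasking a simulated ground robot, then persist the session.
logger = ExperimentLogger()
logger.record("soldier_07", "rcv_light_02", "teleoperate", waypoint=[120.0, 45.5])
logger.save("experiment_events.jsonl")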

The research objectives are:

  1. Rapid prototyping of multi-user and multi-technology virtual simulations for experimentation
  2. Synchronizing company-level and individual/crew-level interactions with novel technologies during virtual experimentation (a timeline-alignment sketch follows this list)
  3. Conducting and analyzing virtual experiments on new enabling technologies
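
The sketch below is a hedged illustration of objective 2: aligning per-client event streams, each with its own clock offset, onto a common experiment timeline so that company-level and crew-level interactions can be analyzed together. The align_streams helper, client names, and offsets are assumptions made for illustration, not part of the project's platform.

from typing import Dict, Iterable, List, Tuple

# Each client reports (local_timestamp_s, description) events; clock_offsets_s maps a
# client's local clock onto the shared experiment clock. All values are illustrative.
Event = Tuple[float, str]


def align_streams(streams: Dict[str, Iterable[Event]],
                  clock_offsets_s: Dict[str, float]) -> List[Tuple[float, str, str]]:
    """Merge per-client event streams into one chronologically ordered experiment timeline.

    Returns (experiment_time_s, client_id, description) tuples sorted by corrected time.
    """
    corrected = []
    for client_id, events in streams.items():
        offset = clock_offsets_s.get(client_id, 0.0)
        for local_time, description in events:
            corrected.append((local_time + offset, client_id, description))
    return sorted(corrected, key=lambda item: item[0])


# Example: one company-level command client and one crew-level operator client.
timeline = align_streams(
    streams={
        "company_hq": [(10.2, "order issued: advance to phase line")],
        "crew_alpha": [(9.8, "operator takes manual control of UGV")],
    },
    clock_offsets_s={"company_hq": 0.0, "crew_alpha": 0.6},
)
for t, client, what in timeline:
    print(f"{t:7.2f}s  {client:11s}  {what}")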

References:

[1] S. Dallas and W.-Y. G. Louie, "A Spectator System for Virtual Experimentation on Military Robot Technology," SAE World Congress, Detroit, Michigan, 2022.
[2] T. Zhang, K. Zhang, J. Lin, W.-Y. G. Louie, and H. Huang, "Sim2Real Learning of Obstacle Avoidance for Robotic Manipulators in Uncertain Environments," IEEE Robotics and Automation Letters, vol. 7, no. 1, pp. 65-72, 2021.

#2.A97