Automotive Research Center

Human-Autonomy Interaction

Annual Plan

Multi-Objective Rapid Generation of Virtual Environments and Data Analysis in Multi-User Immersive Gaming Environments for Testing New Crew Interface Concepts

Project Summary

Principal Investigators

  • Wing-Yue Geoffrey Louie, Oakland University
  • Jia Li, Shadi Alawaneh, Osamah Rawshdeh, Khalid Mirza (Co-PIs), Oakland University

Students

  • Absalat Getachew, Andrea Macklem-Zabel, Oakland University

Government

  • Mark Brudnak, US Army GVSC

Industry

  • Heiko Wenczel, Epic Games
  • David Bouwkamp, FAAC
  • Tim Bates, Lenovo
  • Frank Black, Magic Leap

Project began Q4 2022.

The design, development, and testing of ground vehicle technologies are extremely costly. Ground vehicles are complex cyber-physical systems involving a large number of interfaces, ECUs, sensors, actuators, software modules, and other components, which lead to complex interactions with users and the surrounding environment. Typically, soldiers are introduced late in the ground vehicle development life cycle. If the physical prototype fails to deliver the intended capability, the technology design process may have to start all over, potentially doubling development cost while delaying the intended benefit from reaching the field to support soldier survivability and mobility [11].

Even when early user evaluations are introduced, the number of test cases required to evaluate new ground vehicles, including crew interfaces, is large, and the complex stress-testing scenarios involved are almost impossible to perform in a real environment without incurring substantial cost and damage. During ground vehicle testing, it is therefore critical to be able to replicate those scenarios, to collect data about the vehicles, drivers, and other stakeholders, and to extract relevant patterns that explain why a given test case or scenario passes or fails [12,13].

Virtual experimentation using immersive gaming environments has the potential to dramatically change the physical prototype paradigm by introducing end-users to realistic virtual prototypes of technologies early in a project’s lifecycle. These technologies enable multi-user and multimodal interactions in unstructured and uncertain environments across a range of potential interaction scenarios. Immersive gaming environments allow test cases and scenarios to be replicated and relevant data to be collected from the simulation for explainability purposes. In addition, real-time monitoring of data from both the ground vehicles and the drivers can be integrated into the simulations far more easily than in real-world testing, which is often hampered by connectivity issues and limited hardware/ECU resources. Thus, the potential risks and costs associated with autonomous ground vehicle and robotic system development can be reduced, and projects can be expedited by performing end-user evaluations in parallel with technology development. However, several challenges are associated with adopting immersive gaming environments for military applications such as the early testing and validation of ground vehicles and their crew interfaces [14]. Current gaming environments and technologies are driven by the needs of consumers and the entertainment industry, which do not coincide with the virtual experimentation requirements of military applications.
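
As a rough illustration of the real-time monitoring idea above, the sketch below consumes a stream of per-frame vehicle and driver records and applies a simple attention rule as frames arrive. It is a minimal Python sketch; the field names and the queue-based transport are illustrative assumptions, not the project's actual simulation interface.

```python
import queue
import threading
import time
from dataclasses import dataclass


@dataclass
class TelemetryFrame:
    """One simulation tick of vehicle and driver data (illustrative fields)."""
    sim_time: float        # seconds since scenario start
    speed_mps: float       # vehicle speed in m/s
    gaze_on_display: bool  # whether the driver's gaze is on the crew display


def monitor(frames: "queue.Queue[TelemetryFrame]", stop: threading.Event) -> None:
    """Consume frames as they arrive and flag simple anomalies in real time."""
    while not stop.is_set() or not frames.empty():
        try:
            f = frames.get(timeout=0.1)
        except queue.Empty:
            continue
        # Example rule: driver looking away from the display at high speed.
        if f.speed_mps > 15.0 and not f.gaze_on_display:
            print(f"[{f.sim_time:6.2f}s] attention flag: "
                  f"speed={f.speed_mps:.1f} m/s, gaze off display")


if __name__ == "__main__":
    frames: "queue.Queue[TelemetryFrame]" = queue.Queue()
    stop = threading.Event()
    worker = threading.Thread(target=monitor, args=(frames, stop))
    worker.start()
    # Stand-in for the simulation loop: publish a few synthetic frames.
    for i in range(6):
        frames.put(TelemetryFrame(sim_time=0.5 * i,
                                  speed_mps=10.0 + 2.0 * i,
                                  gaze_on_display=(i % 2 == 0)))
        time.sleep(0.05)
    stop.set()
    worker.join()
```

In a real deployment the queue would be replaced by whatever transport the simulation exposes, and the hand-written rule by the project's analysis pipeline; the point is only that such checks can run while the experiment is in progress rather than after the fact.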

We will be addressing the following novel research questions during this project:

  • How can the design of virtual experiments be optimized to provide sufficient coverage of environment, technology, and user conditions while reducing the number of virtual experiments that need to be conducted? (A coverage sketch follows this list.)
  • How can user sentiment be analyzed in real time during a virtual experiment?
  • How can the identification of optimal crew station configurations be automated?
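
To make the first question concrete, one standard way to cover many condition combinations with few runs is pairwise (2-way) combinatorial design. The following is a minimal greedy pairwise generator in Python; the factors and levels are hypothetical placeholders, not the project's actual experimental variables, and the project may adopt a different coverage criterion.

```python
from itertools import combinations, product

# Hypothetical experiment factors and levels (placeholders for illustration).
factors = {
    "environment": ["urban", "desert", "forest"],
    "interface":   ["touchscreen", "voice", "AR-HUD"],
    "crew_size":   ["1", "2"],
    "visibility":  ["day", "night"],
}
names = list(factors)

def pairs_of(run):
    """All (factor, level) pairs jointly covered by one run (factor -> level)."""
    return {((a, run[a]), (b, run[b])) for a, b in combinations(names, 2)}

# Every factor-level pair that must be covered at least once.
uncovered = {((a, va), (b, vb))
             for a, b in combinations(names, 2)
             for va, vb in product(factors[a], factors[b])}

# Greedily pick the exhaustive candidate that covers the most remaining pairs.
candidates = [dict(zip(names, levels)) for levels in product(*factors.values())]
suite = []
while uncovered:
    best = max(candidates, key=lambda run: len(pairs_of(run) & uncovered))
    suite.append(best)
    uncovered -= pairs_of(best)

print(f"{len(candidates)} exhaustive runs reduced to {len(suite)} pairwise-covering runs")
for run in suite:
    print("  ", run)
```

For these placeholder factors, 36 exhaustive combinations collapse to roughly a dozen runs while every pair of conditions still appears at least once; the same greedy idea extends to higher-strength coverage or to weighting conditions by operational risk.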

The main aim of this project is to rapidly configure and test new crew interface concepts in an AR/VR framework and to enable efficient data mining that extracts relevant patterns from immersive simulations, including interaction data and interview data. The project is complementary to the project led by Dr. Louie to develop an all-in-one virtual prototyping and experimentation platform using immersive gaming environments.
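
As one possible direction for the second research question and for mining interview data, the sketch below scores transcribed utterances as they arrive using NLTK's off-the-shelf VADER sentiment analyzer. The utterances, and the choice of a lexicon-based analyzer rather than a learned model, are illustrative assumptions, not the project's planned pipeline.

```python
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

# One-time download of the VADER lexicon used by the analyzer.
nltk.download("vader_lexicon", quiet=True)
analyzer = SentimentIntensityAnalyzer()

def score_utterance(text: str) -> float:
    """Return a compound sentiment score in [-1, 1] for one transcribed utterance."""
    return analyzer.polarity_scores(text)["compound"]

# Stand-in for a live transcription stream from an immersive session.
utterances = [
    "This display layout is really easy to read.",
    "I keep losing the map when I switch views, it's frustrating.",
    "The voice commands worked fine that time.",
]

for text in utterances:
    score = score_utterance(text)
    label = "positive" if score > 0.05 else "negative" if score < -0.05 else "neutral"
    print(f"{label:8s} ({score:+.2f})  {text}")
```

Running such scoring over a rolling window of utterances would give experimenters a coarse, real-time read on participant sentiment that can later be cross-referenced with interaction logs from the same session.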

The research objectives are:

  1. Multi-objective rapid generation of virtual multi-user and multi-technology environments for testing new crew interface concepts
  2. Monitoring the execution of virtual simulations and analyzing virtual experiment data via multimodal neuro-symbolic techniques
  3. Automated localization of issues in new crew interface concepts (see the sketch after this list)
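
In the spirit of objectives 2 and 3, a neuro-symbolic analysis might combine a learned signal (below, a stand-in workload score that a trained model would normally supply) with explicit symbolic rules over logged simulation events to point at the interface element involved. Everything in this sketch, including the record fields, the threshold, and the rule itself, is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TrialRecord:
    """One logged interaction event from a virtual experiment (illustrative fields)."""
    trial_id: int
    interface_element: str     # e.g., "map_panel", "threat_overlay"
    predicted_workload: float  # stand-in for a neural model's output in [0, 1]
    task_error: bool           # whether the crew member erred on this step

def localize_issues(records, workload_threshold=0.7):
    """Symbolic rule: an element is suspect if errors repeatedly co-occur
    with high predicted workload while that element is in use."""
    suspect_counts = {}
    for r in records:
        if r.task_error and r.predicted_workload >= workload_threshold:
            suspect_counts[r.interface_element] = suspect_counts.get(r.interface_element, 0) + 1
    return sorted(suspect_counts.items(), key=lambda kv: kv[1], reverse=True)

# Synthetic records standing in for real experiment logs.
records = [
    TrialRecord(1, "map_panel", 0.82, True),
    TrialRecord(1, "threat_overlay", 0.35, False),
    TrialRecord(2, "map_panel", 0.76, True),
    TrialRecord(2, "comms_widget", 0.55, False),
    TrialRecord(3, "map_panel", 0.40, False),
]

for element, count in localize_issues(records):
    print(f"{element}: {count} high-workload errors")
```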

References:

[11] B. Fu, T. Kathuria, D. M. Rizzo, M. P. Castanier, X. J. Yang, M. Ghaffari, and K. Barton, “Simultaneous Human-robot Matching and Routing for Multi-robot Tour Guiding under Time Uncertainty,” CoRR, abs/2201.10635, 2022.
[12] D. J. Gorsich, “The use of gaming engines for design requirements,” in Proceedings of the Design Society: DESIGN Conference, May 2020, vol. 1, pp. 141–146, doi: 10.1017/dsd.2020.333.
[13] C. Chen, A. Seff, A. Kornhauser, and J. Xiao, “DeepDriving: Learning Affordance for Direct Perception in Autonomous Driving,” in 2015 IEEE International Conference on Computer Vision (ICCV), Santiago, Chile, Dec. 2015, pp. 2722–2730, doi: 10.1109/ICCV.2015.312.
[14] D. J. Harris, J. M. Bird, P. A. Smart, M. R. Wilson, and S. J. Vine, “A Framework for the Testing and Validation of Simulated Environments in Experimentation and Training,” Front. Psychol., vol. 11, p. 605, Mar. 2020, doi: 10.3389/fpsyg.2020.00605.

#2.A98