Towards the Accelerator Metaverse: Spatial Computing Interfaces for Crossmodal Human-Robot Collaboration in Accelerators
Supervisors: Prof. Wim Leemans (DESY, UHH), Prof. Frank Steinicke (UHH), Dr. Susanne Schmidt (UHH)
Robot technology, spatial computing, and extended reality (XR) have shown enormous potential to significantly enhance the operation, monitoring, repair, and maintenance of accelerators and their infrastructure, facilitating more efficient user experiments. In particular, advances in artificial intelligence (AI), such as natural language processing (NLP) and eye and hand tracking, make it possible to combine multiple interaction modalities, such as speech, gaze, and gestures, for robot control. Instead of carrying out low-level tasks such as steering a robot manually, users could guide robots with simple verbal instructions or gestures, e.g., looking at an object and performing an air tap to make the robot navigate semi-autonomously toward the intended object.
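To make this crossmodal mapping concrete, the following is a minimal sketch in plain Python of how a gaze ray and an air-tap event could be fused into a single semi-autonomous navigation goal. All names here (SceneObject, GazeRay, send_goal, the scene contents) are illustrative assumptions, not tied to any particular XR headset or robot framework.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SceneObject:
    """Hypothetical object registered in a shared spatial map."""
    name: str
    position: tuple  # (x, y, z) in the robot's map frame

@dataclass
class GazeRay:
    """Hypothetical gaze sample from the headset's eye tracker."""
    origin: tuple      # eye/head position
    direction: tuple   # approximately unit-length view direction

def resolve_gaze_target(ray: GazeRay, objects: list[SceneObject],
                        min_cos: float = 0.95) -> Optional[SceneObject]:
    """Return the object best aligned with the gaze ray (simple angular test)."""
    best, best_cos = None, min_cos
    for obj in objects:
        to_obj = tuple(p - o for p, o in zip(obj.position, ray.origin))
        dist = sum(c * c for c in to_obj) ** 0.5
        if dist == 0:
            continue
        cos = sum(d * c for d, c in zip(ray.direction, to_obj)) / dist
        if cos > best_cos:
            best, best_cos = obj, cos
    return best

def on_air_tap(ray: GazeRay, objects: list[SceneObject], send_goal) -> None:
    """Crossmodal mapping: gaze selects the target, the air tap confirms it,
    and a navigation goal is handed to the robot's own planner."""
    target = resolve_gaze_target(ray, objects)
    if target is not None:
        send_goal(target.position)

# Minimal usage example with stubbed data.
if __name__ == "__main__":
    scene = [SceneObject("magnet_girder_3", (4.0, 1.5, 0.0)),
             SceneObject("vacuum_pump_7", (-2.0, 3.0, 0.0))]
    gaze = GazeRay(origin=(0.0, 0.0, 1.6), direction=(0.93, 0.35, -0.1))
    on_air_tap(gaze, scene, send_goal=lambda p: print("navigate to", p))
```

The point of this division of labor is that the user only expresses high-level intent, while path planning and obstacle avoidance remain with the robot's autonomy stack.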
While such crossmodal mappings of input and output modalities between humans and robots have great potential to improve human-robot collaboration, it remains largely unexplored how such spatial interaction should be designed and implemented to achieve a high level of usability, user experience, and acceptability. In the scope of this PhD project, we aim to develop novel crossmodal human-robot interaction (HRI) techniques for spatial computing interfaces that enhance collaboration between users and remote robots at accelerator facilities. The objective is to transition HRI from merely controlling machines to collaborating with intelligent, embodied companions, making operations more intuitive and efficient.