LLM-DRIVEN HUMAN-ROBOT INTERACTION WITH DIGITAL TWINS FOR FACILITY MANAGEMENT
Accepted version
Peer-reviewed
Abstract
This paper presents a framework that integrates user interaction, building digital twins, and robotic automation to enhance facility management. The system uses a Large Language Model (LLM) as the central interface, enabling users to intuitively retrieve data from the digital twin and to command robotic agents for facility inspection and monitoring tasks. User commands are processed by the LLM, which translates them into actionable tasks; these are interpreted by the robotics middleware and executed autonomously by robots equipped with navigation and data acquisition capabilities. The collected data is presented to human operators, who can use it to update the digital twin and inform maintenance decisions. By combining the natural language processing power of LLMs with digital-twin-based data and robotic automation, the proposed framework reduces manual effort while streamlining facility inspections and supporting maintenance decision-making. A theoretical case study demonstrates the system's capabilities, illustrating its ability to process user queries, allocate robotic tasks, collect and deliver inspection data, and support informed decision-making. This approach bridges the gap between human decision-making, digital representations, and physical site operations, offering a user-friendly solution for modern facility management.
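
For readers unfamiliar with the workflow summarised above, the following minimal Python sketch illustrates one way the command-to-task pipeline could be structured: a natural-language command is translated into a structured inspection task, dispatched to (stubbed) robotics middleware, and the resulting data is returned for operator review. Every name in the sketch (InspectionTask, translate_command, dispatch_task) is a hypothetical placeholder rather than the paper's implementation, and the LLM step is replaced by a simple keyword heuristic so the example is self-contained.

# Illustrative sketch only: hypothetical names, not the authors' implementation.
from dataclasses import dataclass


@dataclass
class InspectionTask:
    """A structured task produced from a natural-language command."""
    location: str   # area of the facility to inspect
    action: str     # e.g. "capture_images", "measure_temperature"
    notes: str = "" # free-text context carried along for the operator


def translate_command(command: str) -> InspectionTask:
    """Stand-in for the LLM: map a user command to a structured task.

    A real system would prompt an LLM to emit this structure (e.g. as JSON);
    a keyword heuristic keeps the sketch self-contained and runnable.
    """
    text = command.lower()
    action = "measure_temperature" if "temperature" in text else "capture_images"
    location = "plant room" if "plant room" in text else "unspecified area"
    return InspectionTask(location=location, action=action, notes=command)


def dispatch_task(task: InspectionTask) -> dict:
    """Stand-in for the robotics middleware: pretend to execute the task.

    Returns placeholder inspection data that an operator could review and
    use to update the digital twin.
    """
    return {
        "location": task.location,
        "action": task.action,
        "readings": {"status": "placeholder - no robot attached"},
    }


if __name__ == "__main__":
    task = translate_command("Check the temperature in the plant room")
    print(task)
    print(dispatch_task(task))

In a full system, the structured task would typically be serialised (for example as JSON) and passed to the robotics middleware for autonomous execution, with the returned readings surfaced to the operator and used to update the digital twin, as described in the abstract.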
Sponsorship
Engineering and Physical Sciences Research Council (2728220)

