Summary
The “Multiple Patient Monitoring with Smartglasses and Auditory Displays” project focused on developing and evaluating an integrated, bimodal system to enhance clinicians’ ability to monitor several patients simultaneously while performing other demanding tasks. This system combined Head-Worn Displays (HWDs), often referred to as smartglasses, with an auditory display technology called Spearcons (time-compressed speech). The research demonstrated that combining visual and auditory displays can provide superior monitoring accuracy while supporting clinical multitasking.
The research was conducted through a collaboration between the PsyErgo group at the University of Würzburg, where most of the data collection took place, and the CERG group at the University of Queensland, where the project was developed and piloted.
Motivation
In modern hospital environments, clinicians frequently face highly demanding scenarios, such as nurses monitoring five to seven patients, or supervising anaesthesiologists overseeing multiple operating rooms. However, traditional patient monitoring systems face multiple challenges. For example, visual displays are constrained to fixed locations, requiring clinicians to physically move to access critical data. Furthermore, conventional auditory alarms are often uninformative and frequently generate false alarms (up to 94%), contributing significantly to the critical issue of “alarm fatigue”, which can cause staff to miss genuine patient deterioration. The project sought a location-independent solution to provide continuous awareness and context before alarms sound.
Concept and Technological Approach
To overcome the limitations of fixed displays and alarm fatigue, this project investigated the synergistic potential of bimodal monitoring. HWDs provide location-independent visual access to continuous and exact patient vital sign information, such as Heart Rate (HR) and Oxygen Saturation (SpO2). The auditory component utilized spearcons, which are short time-compressed speech phrases found to be significantly more effective and learnable than abstract auditory motifs at communicating patient status levels.
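At its core, generating a spearcon means speeding up a synthesized speech phrase while preserving intelligible pitch. The study used purpose-built spearcon recordings; as an illustration only, the time-compression step can be sketched with a naive overlap-add (OLA) scheme over a raw waveform (frame and hop sizes here are arbitrary choices, not the project's parameters):

```python
import numpy as np

def time_compress(signal: np.ndarray, rate: float,
                  frame_len: int = 1024, hop_out: int = 256) -> np.ndarray:
    """Naive overlap-add time compression.

    Reads input frames spaced `hop_out * rate` samples apart and
    overlap-adds them at `hop_out` spacing, shortening the signal by
    roughly `rate` without resampling (so pitch is mostly preserved).
    Real spearcon pipelines use more sophisticated time-scale
    modification, e.g. a phase vocoder.
    """
    hop_in = int(hop_out * rate)
    window = np.hanning(frame_len)
    n_frames = max(1, (len(signal) - frame_len) // hop_in + 1)
    out_len = (n_frames - 1) * hop_out + frame_len
    out = np.zeros(out_len)
    norm = np.zeros(out_len)
    for i in range(n_frames):
        frame = signal[i * hop_in : i * hop_in + frame_len]
        if len(frame) < frame_len:
            break
        out[i * hop_out : i * hop_out + frame_len] += frame * window
        norm[i * hop_out : i * hop_out + frame_len] += window
    norm[norm < 1e-8] = 1.0  # avoid dividing by near-zero window tails
    return out / norm

# Example: compress one second of audio (16 kHz) to roughly half length.
speech = np.sin(2 * np.pi * 200 * np.arange(16000) / 16000)
spearcon = time_compress(speech, rate=2.0)
```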
The resulting bimodal system supports both endogenous attending (the clinician proactively checking the HWD) and exogenous attending (salient auditory spearcons capturing attention when important changes occur). In an evaluation where non-clinician participants monitored five simulated patients, the combination of HWD and spearcons achieved the best patient identification accuracy and detected significantly more changes than the auditory-only condition. Importantly, using the bimodal display did not statistically hinder performance on a demanding concurrent tracking task, suggesting it is a suitable solution for multitasking clinical environments.
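The exogenous component can be thought of as a change-detection rule: the HWD continuously shows exact values for endogenous checking, while a spearcon plays only when a vital sign crosses a status boundary. A minimal sketch of that rule, with hypothetical status bands and phrase wording (the study's actual thresholds and spearcon phrases are not specified here):

```python
from dataclasses import dataclass

# Hypothetical status bands for illustration; real systems use
# clinically validated thresholds and more vital signs.
HR_BANDS = [(0, 50, "low"), (50, 100, "normal"), (100, 999, "high")]
SPO2_BANDS = [(0, 90, "low"), (90, 101, "normal")]

def status(value: float, bands) -> str:
    """Map a vital-sign value to its status band label."""
    for lo, hi, label in bands:
        if lo <= value < hi:
            return label
    return "unknown"

@dataclass
class PatientMonitor:
    patient_id: str
    last_hr: str = "normal"
    last_spo2: str = "normal"

    def update(self, hr: float, spo2: float) -> list[str]:
        """Return spearcon phrases only when a status band changes
        (exogenous cue); otherwise the HWD shows the values silently
        and the clinician checks them endogenously."""
        phrases = []
        hr_status = status(hr, HR_BANDS)
        if hr_status != self.last_hr:
            phrases.append(f"patient {self.patient_id}, heart rate {hr_status}")
            self.last_hr = hr_status
        spo2_status = status(spo2, SPO2_BANDS)
        if spo2_status != self.last_spo2:
            phrases.append(f"patient {self.patient_id}, oxygen {spo2_status}")
            self.last_spo2 = spo2_status
        return phrases
```

One monitor instance per patient keeps the per-patient state needed to suppress repeated cues for an unchanged status, which is one way such a design can limit the alarm load that drives alarm fatigue.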
