The focus of our research is to understand how humans perceive, act, and collaborate in immersive environments and to apply this knowledge to extended reality (XR) systems that support interaction, learning, and creative performance. Our overarching goal is to develop scientific and engineering principles that improve user experience, accessibility, and performance in VR, AR, and MR.
Our research draws on approaches from psychophysics, computational modelling, human–computer interaction, machine learning, and real-time motion analysis. The lab fosters collaboration among researchers in psychology, computer science, music technology, robotics, and the creative industries.
Funding for our work comes from:
alongside support from, and collaboration with, industry partners such as:
and cultural and charitable partners including:
This research area examines core problems in XR interaction where current technologies remain limited or where no single solution can address all use cases. Some challenges, such as realistic haptic feedback, are constrained by the physical limits of existing hardware and remain largely unsolved. Others—such as locomotion, text entry, object manipulation, and menu navigation—admit many competing solutions, each suited to specific tasks, environments, or user abilities.
Our research investigates how humans perceive and act under these constraints and develops principles that guide the design of effective interaction techniques. We study topics such as:
By combining perceptual science, motion analysis, and engineering, this research identifies when and why particular solutions succeed, how users adapt to system limitations, and how XR interfaces can be designed to support diverse tasks and user groups. The outcomes include conceptual frameworks, open datasets, and validated guidelines for XR interaction design used by researchers and industry partners.
This research area investigates how immersive technologies can support performance, collaboration, and audience engagement across music, sport, and cultural heritage. The work combines motion capture, computer vision, spatial audio, real-time visualisation, and AI-driven feedback systems to create interactive environments that respond to human movement. The programme includes several strands:
The lab is fundamentally collaborative and interdisciplinary. It is open to anyone who would like to collaborate, who has an idea and is looking for technical support, or who would like to use the lab equipment. In this spirit, we founded BhamXR, a cross-disciplinary community of over 120 researchers at the University of Birmingham working on VR, AR, haptics, wearables, data science, performance, and creative technologies. The network promotes collaboration through seminars, workshops, research showcases, and training events, and connects students with active research groups across Psychology, Computer Science, Engineering, Medicine, and the Arts.
The lab maintains a strong portfolio of industry engagement and commercialisation activities, including several spinoffs:
A selection of projects currently active in the lab.