<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Eyal Ofek | Virtual Reality Lab</title><link>https://virtualrealitylab.netlify.app/author/eyal-ofek/</link><atom:link href="https://virtualrealitylab.netlify.app/author/eyal-ofek/index.xml" rel="self" type="application/rss+xml"/><description>Eyal Ofek</description><generator>Hugo Blox Builder (https://hugoblox.com)</generator><language>en-us</language><image><url>https://virtualrealitylab.netlify.app/author/eyal-ofek/avatar_hu_5adfa6711117fcb.jpg</url><title>Eyal Ofek</title><link>https://virtualrealitylab.netlify.app/author/eyal-ofek/</link></image><item><title>How can we avoid hitting objects we can't see?</title><link>https://virtualrealitylab.netlify.app/blog/belt-and-whistles/</link><pubDate>Sat, 25 Apr 2026 00:00:00 +0000</pubDate><guid>https://virtualrealitylab.netlify.app/blog/belt-and-whistles/</guid><description>&lt;article class="lab-blog-article"&gt;
&lt;header class="lab-blog-article-header"&gt;
&lt;p class="lab-blog-author"&gt;Prof. Eyal Ofek&lt;/p&gt;
&lt;p class="lab-blog-affiliation"&gt;University of Birmingham. UK&lt;/p&gt;
&lt;/header&gt;
&lt;p&gt;Imagine walking through a dark room at night, trying to find your keys without waking anyone. You can't turn on the lights, so you move slowly, arms outstretched, reading the space through subtle cues - the faint vibration of furniture nearby, the shift of air around a chair. You're navigating by feel, hoping not to knock anything over.&lt;/p&gt;
&lt;p&gt;This everyday experience turns out to be surprisingly relevant to the future of Virtual Reality.&lt;/p&gt;
&lt;h2&gt;Blind in Plain Sight&lt;/h2&gt;
&lt;p&gt;A person wearing a VR headset is, for all practical purposes, blind to their physical surroundings. To stay safe, VR users are typically asked to clear a room of obstacles before putting on their headset. When they approach the edge of their designated play area, the headset displays a virtual boundary - a visual fence that breaks immersion but prevents collisions.&lt;/p&gt;
&lt;p&gt;The problem doesn't end there. Even within the virtual world itself, users have almost no haptic awareness below the waist. A player might glance down mid-game to discover they're standing directly inside a virtual coffee table. It sounds harmless - after all, no one gets hurt by a virtual table - but &lt;a href="https://eyalofek.org/wp-content/uploads/2023/04/CHI23_EmbodiedPhysics.pdf" target="_blank" rel="noopener"&gt;research presented at CHI 2023, the leading Human-Computer Interaction conference,&lt;/a&gt; found that violations of physical rules in VR measurably reduce immersion. Conversely, when a user's virtual avatar respects physical laws - even when that means moving differently from the user's actual body - immersion increases.&lt;/p&gt;
&lt;h2&gt;A Belt That Lets You Feel the Virtual World&lt;/h2&gt;
&lt;p&gt;Researchers at the VR Laboratory of the University of Birmingham have developed a novel approach to this problem. Their team - Dr. Diar Abdlkarim, Devika Mukherjee, Dr. Daniele Giunchi, Dr. Massimiliano DiLuca, and Prof. Eyal Ofek - designed a haptic belt that gives users awareness of obstacles around their lower body, both physical and virtual.&lt;/p&gt;
&lt;p&gt;Their paper, &lt;a href="https://pure-oai.bham.ac.uk/ws/portalfiles/portal/295068678/BeltAndWhistles_CHI26.pdf" target="_blank" rel="noopener"&gt;"Belt and Whistles: Adding Lower Body Collision Awareness for MR Experiences,"&lt;/a&gt; will be presented at the CHI conference in Barcelona this month.&lt;/p&gt;
&lt;p&gt;The belt is battery-powered, wireless, and designed to fit a range of body types. It connects directly to a standalone Meta Quest 3 headset and delivers directional haptic signals - meaning the user can feel not just that something is nearby, but which direction it's coming from. Six voice coils embedded around the belt produce two distinct types of feedback: a sharp pulse when the user's body makes contact with a virtual object, and a soft, gradually intensifying signal as the user draws closer to one.&lt;/p&gt;
&lt;p&gt;This dual-mode design was deliberate. The researchers wanted the belt's signals to complement VR applications rather than compete with them - staying out of the way during intense moments while still guiding users toward safer, more physically coherent movement.&lt;/p&gt;
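&lt;p&gt;As a rough illustration of this dual-mode scheme (a sketch only, not the authors' actual implementation; all names and thresholds here are hypothetical), the mapping from obstacle distance to feedback can be thought of as a simple function: a sharp, full-strength pulse on contact, and a soft signal that intensifies as the wearer approaches, distributed over the ring of coils so the direction is felt too.&lt;/p&gt;

```python
# Illustrative sketch of a dual-mode haptic mapping (hypothetical names
# and thresholds; the paper's actual signal design may differ).

NEAR_RADIUS = 1.0    # metres: where the soft "approach" ramp begins
CONTACT_PULSE = 1.0  # full-strength pulse amplitude on contact

def feedback_amplitude(distance_m: float) -> float:
    """Map distance to the nearest obstacle to a vibration amplitude in [0, 1]."""
    if distance_m <= 0.0:
        return CONTACT_PULSE   # sharp pulse: the body touched the object
    if distance_m >= NEAR_RADIUS:
        return 0.0             # far away: the belt stays silent
    # Soft, gradually intensifying signal as the obstacle gets closer.
    return (NEAR_RADIUS - distance_m) / NEAR_RADIUS

def coil_weights(obstacle_angle_deg: float, n_coils: int = 6) -> list[float]:
    """Distribute the signal over the ring of coils so the wearer can
    feel the obstacle's direction (the nearest coil vibrates strongest)."""
    spacing = 360.0 / n_coils
    weights = []
    for i in range(n_coils):
        coil_angle = i * spacing
        # Angular distance between the obstacle and this coil, in [0, 180].
        delta = abs((obstacle_angle_deg - coil_angle + 180.0) % 360.0 - 180.0)
        # Only the coil(s) adjacent to the obstacle direction vibrate.
        weights.append(max(0.0, 1.0 - delta / spacing))
    return weights
```

&lt;p&gt;Under this sketch, an obstacle 0.5 m directly behind the wearer (180&amp;deg;) would drive the rear coil at half amplitude, with the signal fading to nothing on the neighbouring coils.&lt;/p&gt;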
&lt;figure class="lab-blog-figure"&gt;
&lt;img src="figure_2.png" alt="Battery-operated haptic belt connected wirelessly to a Meta Quest 3 headset"&gt;
&lt;figcaption&gt;&lt;strong&gt;Figure 1:&lt;/strong&gt; A battery-operated belt, wirelessly connected to a Meta Quest 3 headset, enables rendering of a range of signals of lower-body haptics and scene awareness signals.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;h2&gt;Does It Work?&lt;/h2&gt;
&lt;p&gt;Testing the belt across different scenarios, the team found that in high-stress gaming environments, the haptic feedback did not hurt player performance - but it did lead players to navigate more physically plausible paths through virtual space, improving their sense of presence. In calmer, more exploratory applications, users were able to navigate a darkened virtual room and locate targets while avoiding furniture entirely by feel.&lt;/p&gt;
&lt;figure class="lab-blog-figure"&gt;
&lt;img src="figure_3.png" alt="Haptic signal rendered in sync with the sway of a virtual bridge"&gt;
&lt;figcaption&gt;&lt;strong&gt;Figure 2:&lt;/strong&gt; Rendering the haptic signal in sync with the sway of the bridge can help make the experience more realistic.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;The potential applications extend further still. The belt can render the sway of an unstable virtual bridge (see Figure 2), making the experience feel physically grounded. It can signal the vibrations of approaching objects - crystals crashing toward a player, for instance - before they come into view or earshot, giving users a premonition of what's about to happen (see Figure 3). It can also alert users when they're about to walk backward into a real wall, a disorienting experience that currently leaves many VR users feeling suddenly out of control.&lt;/p&gt;
&lt;figure class="lab-blog-figure"&gt;
&lt;img src="figure_4.png" alt="Cave Run haptic signals warning a player about approaching crystals"&gt;
&lt;figcaption&gt;&lt;strong&gt;Figure 3:&lt;/strong&gt; Cave Run: The haptic signals of the belt render the vibrations of crystals crashing in front of the player. While the objects are yet to be seen or heard, their motion serves as a premonition signal to help the player.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;h2&gt;A Step Toward Full-Body VR&lt;/h2&gt;
&lt;p&gt;Haptic research in VR has long focused almost exclusively on the hands. The haptic belt represents a different philosophy - one that treats the whole body as a surface worth addressing. The device is inexpensive, adaptable, and easy to use, and the researchers hope its simplicity will encourage broader exploration of full-body haptics in mixed reality.&lt;/p&gt;
&lt;figure class="lab-blog-figure"&gt;
&lt;img src="figure_5.png" alt="Haptic belt warning a VR user before they walk backward into a wall"&gt;
&lt;figcaption&gt;&lt;strong&gt;Figure 4:&lt;/strong&gt; A person wearing a VR headset walks backward into a wall and, as a result, loses their view of the room (a). While the user's avatar is prevented from entering the wall, nothing stops the user from continuing backward, leaving them feeling they have lost control of their VR experience (b). With the haptic belt, the user is aware of reaching the wall and stops backing up (c).&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p&gt;Belt and Whistles video presentation:&lt;/p&gt;
&lt;figure class="lab-blog-video"&gt;
&lt;video controls preload="metadata" poster="featured.png"&gt;
&lt;source src="https://eyalofek.org/wp-content/uploads/2026/04/BeltAndWhistles.-HI26.mp4" type="video/mp4"&gt;
&lt;a href="https://eyalofek.org/beltandwhistles/" target="_blank" rel="noopener"&gt;Watch the video presentation on the original post&lt;/a&gt;
&lt;/video&gt;
&lt;figcaption&gt;&lt;a href="https://eyalofek.org/beltandwhistles/" target="_blank" rel="noopener"&gt;Watch the video presentation on the original post&lt;/a&gt;&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p class="lab-blog-source"&gt;Original post: &lt;a href="https://eyalofek.org/beltandwhistles/" target="_blank" rel="noopener"&gt;https://eyalofek.org/beltandwhistles/&lt;/a&gt;&lt;/p&gt;
&lt;/article&gt;</description></item><item><title>Eyal Ofek</title><link>https://virtualrealitylab.netlify.app/author/eyal-ofek/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://virtualrealitylab.netlify.app/author/eyal-ofek/</guid><description>&lt;p&gt;2024 Chair of Computer Science, University of Birmingham, UK.&lt;/p&gt;
&lt;p&gt;My research focuses on Human-Computer Interaction (HCI), sensing, and Mixed-Reality (MR) displays that enable users to reach their full potential in productivity, creativity, and collaboration. I envision MR applications woven into the fabric of our lives, rather than as PC and mobile apps limited to running on a specific device’s screen. Such applications must be smart enough to understand users’ changing physical and social contexts and flexible enough to adapt accordingly. We developed systems such as FLARE (Fast Layout for AR Experiences), which was used by the HoloLens team and inspired the Unity MARS product, and the Triton 3D audio simulation, which was used in Microsoft games such as Gears of War 4 and is the basis of Microsoft Acoustics. ILLUMIROOM, a collaboration with the Redmond lab, was presented in the CES 2013 keynote.
I have published over 100 academic papers (with more than 14,000 citations), have been granted more than 110 patents, and am a Senior Member of the ACM.&lt;/p&gt;
&lt;p&gt;I spent 19 years at Microsoft Research, both as a principal researcher and as a research manager.
In addition to publishing academic papers and transferring technology to products, I have released multiple tools and open-source libraries,
such as the RoomAlive Toolkit, used around the world for multi-projection systems;
SeeingVR, which enhances the use of VR for people with low vision;
the Microsoft Rocketbox avatars and the MoveBox and HeadBox toolkits, which democratize avatar animation;
and RemoteLab for distributed user studies.&lt;/p&gt;
&lt;p&gt;I formed Microsoft&amp;rsquo;s Bing Maps Research Lab, where we published state-of-the-art research and impacted the products.
Among our results is the influential Stroke-Width Transform text detector, which was used by Bing and incorporated into many OCR systems and the OpenCV library.
I developed technologies such as the world’s first street-side imagery service, street-level reconstruction, novel texture compression, matching of users&amp;rsquo; photos to the world (presented at TED 2010), and more.&lt;/p&gt;
&lt;p&gt;I have served on multiple conference committees and was the paper chair of ACM SIGSPATIAL 2011. I was the Specialty Chief Editor of Frontiers in Virtual Reality for the area of Haptics and served on the editorial board of the IEEE Computer Graphics and Applications journal (CG&amp;amp;A).&lt;/p&gt;
&lt;p&gt;I was a founder of several startup companies.
Among them, I developed a very successful 2D+3D graphics editor for the Amiga computer, developed a novel game engine rendering global illumination effects in real time, and oversaw software R&amp;amp;D of the world’s first time-of-flight video camera at a startup company. I used the cameras for applications such as TV depth keying and reconstruction, and they were the basis for the depth cameras used by the Microsoft HoloLens and the Magic Leap HMD.
I have also worked on AI-based forest fire detection, guiding a fleet of autonomous drones, at DataBlanket.com&lt;/p&gt;</description></item></channel></rss>