Eyal Ofek

Professor

University of Birmingham

2024: Chair of Computer Science, University of Birmingham, UK.

My research focuses on Human-Computer Interaction (HCI), sensing, and Mixed Reality (MR) displays that enable users to reach their full potential in productivity, creativity, and collaboration. I envision MR applications as woven into the fabric of our lives rather than, like PC and mobile apps, limited to running on a specific device's screen. Such applications must be smart enough to understand users' changing physical and social contexts and flexible enough to adapt accordingly. We developed systems such as FLARE (Fast Layout for AR Experiences), which was used by the HoloLens team and inspired the Unity MARS product, and the Triton 3D audio simulation, which was used by Microsoft games such as Gears of War 4 and is the basis of Microsoft Acoustics. IllumiRoom, a collaboration with the Redmond lab, was presented at the CES 2013 keynote. I have published over 100 academic papers (with more than 14,000 citations), have been granted more than 110 patents, and was named a Senior Member of the ACM.

I spent 19 years at Microsoft Research, both as a principal researcher and as a research manager. In addition to publishing academic papers and transferring technology to products, I have released multiple tools and open-source libraries, such as the RoomAlive Toolkit, used around the world for multi-projection systems; SeeingVR, which enhances the use of VR for people with low vision; the Microsoft Rocketbox avatars; the MoveBox and HeadBox toolkits, which democratize avatar animation; and RemoteLab for distributed user studies.

I formed Microsoft's Bing Maps Research Lab, where we published state-of-the-art research and impacted the products. Among our results is the influential Stroke Width Transform text detector, which was used by Bing and incorporated into many OCR engines and the OpenCV library. I developed technologies such as the world's first street-side imagery service, street-level reconstruction, novel texture compression, matching of users' photos to the world (presented at TED 2010), and more.

I have served on multiple conference committees and was the papers chair of ACM SIGSPATIAL 2011. I was the Specialty Chief Editor of Frontiers in Virtual Reality for the area of Haptics and served on the editorial board of IEEE Computer Graphics and Applications (CG&A).

I was a founder of several startup companies. My work there included developing a very successful 2D+3D graphics editor for the Amiga computer, developing a novel game engine that rendered global illumination effects in real time, and overseeing the software R&D of the world's first time-of-flight video camera. We used the cameras for applications such as TV depth keying and reconstruction, and the technology was the basis for the depth cameras used by Microsoft HoloLens and the Magic Leap HMD. I have also worked on AI-based forest fire detection, guiding a fleet of autonomous drones, at DataBlanket.com.

Interests
  • Mixed Reality
  • Human-Computer Interaction
  • Computer Vision
  • Haptics
Education
  • Ph.D. in Computer Vision, 2000