New Technology Promises More Efficient and Practical Virtual Reality Systems
Glitchy games and bulky headsets may soon be things of the past thanks to a new eye-tracking system for virtual reality/augmented reality (VR/AR).
Eye tracking is an essential component of AR/VR systems, but current systems have some limitations. These include a large form factor, driven by bulky lens-based cameras, and the high cost of communication between the camera and the backend processing system.
Georgia Tech School of Computer Science Associate Professor Yingyan (Celine) Lin, Ph.D. student Haoran You, and postdoctoral researcher Yang (Katie) Zhao have developed a new eye-tracking system that works around these limitations by combining a recently developed lensless camera with co-designed algorithm and accelerator processors.
“The current VR headsets are too heavy, gaming can lag, and using the controller is cumbersome. Combined, this prevents users from having a truly immersive experience. We mitigate all these problems,” said You.
This new system, EyeCoD: An Accelerated Eye Tracking System via FlatCam-based Algorithm & Accelerator Co-Design, replaces the traditional camera lens with FlatCam, a lensless camera that is five to ten times thinner and lighter. With FlatCam, eye tracking can run at a reduced size and with improved efficiency, without sacrificing the accuracy of the tracking algorithm. Because it does not include a lens-based camera, the system could also enhance user privacy.
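Lensless cameras such as FlatCam replace the lens with a coded mask and recover the scene computationally from the sensor measurements. The sketch below is a toy illustration of that idea, assuming the separable measurement model commonly associated with FlatCam (sensor reading Y = PL @ X @ PR.T for known calibration matrices PL and PR); all names, sizes, and the regularized least-squares solver are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_scene, n_sensor = 32, 48          # toy scene and sensor side lengths
PL = rng.standard_normal((n_sensor, n_scene))  # left calibration matrix
PR = rng.standard_normal((n_sensor, n_scene))  # right calibration matrix
X = rng.random((n_scene, n_scene))  # ground-truth scene (toy)

# Forward model of a separable coded-mask lensless camera (noiseless here).
Y = PL @ X @ PR.T

def reconstruct(Y, PL, PR, lam=1e-3):
    """Tikhonov-regularized least-squares inversion of the separable model."""
    A = PL.T @ PL + lam * np.eye(PL.shape[1])
    B = PR.T @ PR + lam * np.eye(PR.shape[1])
    Z = np.linalg.solve(A, PL.T @ Y)         # undo the left mixing: Z ~ X @ PR.T
    return np.linalg.solve(B, PR.T @ Z.T).T  # undo the right mixing

X_hat = reconstruct(Y, PL, PR)
err = float(np.max(np.abs(X_hat - X)))  # small residual from regularization
```

The privacy point above follows from the same model: the raw measurement Y is a scrambled linear mixture of the scene, not a directly viewable image.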
Another feature of the EyeCoD system is that it renders at high resolution only the portion of the screen that a user's eyes focus on. It does this by predicting where the user's eyes will land, then instantaneously rendering those areas at high resolution. These computational savings, plus a dedicated accelerator, underpin EyeCoD's ability to boost processing speed and efficiency.
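This gaze-driven scheme is a form of foveated rendering. The toy sketch below shows the core idea: keep a window around the predicted gaze point at full resolution and coarsen everything else. The function name, window size, and downsampling factor are hypothetical, not taken from EyeCoD.

```python
import numpy as np

def foveate(frame, gaze_xy, radius=64, down=4):
    """Return a frame that is full-resolution near gaze_xy and coarse elsewhere."""
    h, w = frame.shape[:2]
    # Coarse periphery: sample every `down`-th pixel, then blow it back up.
    coarse = frame[::down, ::down].repeat(down, axis=0).repeat(down, axis=1)[:h, :w]
    out = coarse.copy()
    x, y = gaze_xy
    top, bot = max(0, y - radius), min(h, y + radius)
    left, right = max(0, x - radius), min(w, x + radius)
    # Restore the foveal window around the predicted gaze point.
    out[top:bot, left:right] = frame[top:bot, left:right]
    return out

frame = np.random.default_rng(1).random((480, 640))
out = foveate(frame, gaze_xy=(320, 240))
```

Only the foveal window carries full-resolution pixels, which is where the computational savings described above come from: the periphery can be rendered, stored, and transmitted at a fraction of the cost.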
The team received the Office of Technology Licensing’s Tech Ready Grant for its efforts earlier this year. Tech Ready Grants offer $25,000 to help faculty transition projects from the lab to the marketplace.
The team hopes to use the funds to integrate the current demos into a compact eye-tracking system for use in commercial VR/AR headsets.
Along with winning the Tech Ready Grant, the team presented EyeCoD at the International Symposium on Computer Architecture (ISCA) 2022. IEEE Micro included the work in its Top Picks from the Computer Architecture Conferences for 2023. The annual publication highlights “significant research papers in computer architecture based on novelty and potential for long-term impact.”
EyeCoD is a collaborative effort. Rice University Professor Ashok Veeraraghavan's team provided the design of and technical support for the FlatCam camera used in EyeCoD, and Ziyun Li of Meta provided technical input to ensure that the EyeCoD system aligns with industry AR/VR specifications.