One of the ways that virtual reality could be made even more immersive is if users were able to fully explore enormous virtual worlds by physically walking through them. That’s certainly a neat idea, but it’s also something that’s not exactly easy to accomplish when you’re using a VR headset in a small space. After all, nothing ruins the illusion of endless virtual space more than setting out to walk through a sprawling forest only to immediately stub your toe on the dresser five steps in front of you.

Previously, we’ve written about a solution created by researchers in Japan that tricks the brain into thinking the user is walking in a straight line when they’re actually walking in giant circles. That’s pretty neat, but it still requires a “play space” of 16 by 22 feet. A new system, developed by researchers from Stony Brook University, Nvidia, and Adobe, offers an alternative.

“This [project aims to] redirect users’ walking in VR so that they can explore a large virtual scene within a small physical space,” Qi Sun, lead author of the study, told Digital Trends. “We did this with an eye-tracked HMD (head-mounted display) to detect users’ saccade. With the help of human nature [in the form of] ‘saccadic suppression’ effect, users do not notice our redirection.”

Saccades are the rapid eye movements we make when shifting our gaze between points in our field of vision, such as when scanning a room. They happen involuntarily, several times every second, and because the brain suppresses visual input while the eye is in motion (the “saccadic suppression” Sun describes), we never consciously register them. The researchers exploit this blind window: each time the eye tracker detects a saccade, the system slightly rotates the virtual camera. Accumulated over many saccades, these imperceptible rotations steer the user’s real-world walking path, allowing a small physical room to simulate a much larger virtual space without causing dizziness or discomfort.
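The team hasn’t published reference code alongside the announcement, so the snippet below is only a minimal Python sketch of the general mechanism: watch the eye tracker for the high angular velocities that mark a saccade, and only in that window apply a small yaw rotation to the virtual camera. The threshold, rotation amount, and function names here are illustrative assumptions, not values or APIs from the study.

```python
SACCADE_VELOCITY_THRESHOLD = 180.0  # deg/s; saccades typically peak well above this (assumed value)
MAX_ROTATION_PER_SACCADE = 0.5      # deg of yaw injected per saccade (assumed value)


def angular_speed(prev_gaze_deg: float, curr_gaze_deg: float, dt: float) -> float:
    """Approximate gaze speed in degrees per second from two tracker samples."""
    return abs(curr_gaze_deg - prev_gaze_deg) / dt


def redirect_camera(camera_yaw_deg: float, gaze_speed_deg_s: float,
                    steering_sign: int) -> float:
    """Nudge the virtual camera's yaw, but only while a saccade is in flight.

    steering_sign is +1 or -1 depending on which way the system wants to
    bend the user's real-world path (e.g., away from a nearby wall).
    """
    if gaze_speed_deg_s > SACCADE_VELOCITY_THRESHOLD:
        # Saccadic suppression: the visual system discards input while the
        # eye is moving this fast, so the rotation goes unnoticed.
        camera_yaw_deg += steering_sign * MAX_ROTATION_PER_SACCADE
    return camera_yaw_deg


# Example: two consecutive eye-tracker samples 16 ms apart.
yaw = 0.0
speed = angular_speed(2.0, 8.5, 0.016)   # ~406 deg/s, fast enough to be a saccade
yaw = redirect_camera(yaw, speed, steering_sign=+1)
print(yaw)  # 0.5 -- the camera turned slightly while the user couldn't see it
```

In a real system this check would run every frame inside the rendering engine, with the steering direction chosen by a planner that knows the room’s boundaries and the rotation amount scaled to the size of each saccade.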

Sun offered some potential applications for redirected walking, including gaming and design work. An architect, for instance, could walk around a 3D model of a large building without leaving the confines of their office. Will the technology be commercialized in the near future? For now, the team is keeping quiet. “This is currently a research project and neither I nor anyone else could comment on productization plans at this time,” Sun said.

The research is due to be shown off in August at SIGGRAPH 2018, the annual conference showcasing the latest in computer graphics and interactive technologies.
