When we glance around a room without focusing on any one thing, our eyes dart about in small, rapid jumps known as saccades. Road to VR has an interesting article covering how researchers have devised saccade tracking that subtly shifts and reorients players in the real world as they glance around, walk, and play in VR.
During a saccadic movement the eyes go through a brief moment of effective blindness, so to speak, as they jump toward what's coming up ahead or what's going on around them in VR. That gap between looking around and settling on a new point to look at is the downtime that saccade tracking researchers use to shift the virtual world around us.
Ben Lang from Road to VR explains that “With precise eye-tracking technology from SMI and an HTC Vive headset, the researchers are able to detect and exploit that temporary blindness to hide a slight rotation of the scene from the user. As the user walks forward and looks around the scene, it is slowly rotated, just a few degrees per saccade, such that the user reflexively alters their walking direction in response to the new visual cues.”
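To make the idea concrete, here's a minimal sketch of how saccade-triggered rotation could work: classify a gaze sample as a saccade when the eye's angular velocity crosses a threshold, and inject a tiny yaw rotation only during that window. This is our own illustration, not the researchers' code, and the threshold and per-saccade rotation budget are assumed values for demonstration.

```python
import math

# Assumed values for illustration, not figures from the research.
SACCADE_VELOCITY_THRESHOLD = 180.0  # deg/s; saccades are far faster than smooth pursuit
ROTATION_PER_SACCADE = 0.14         # degrees of yaw hidden inside each saccade

def gaze_velocity(prev_dir, curr_dir, dt):
    """Angular speed of gaze between two eye-tracker samples, in deg/s.

    prev_dir and curr_dir are unit gaze direction vectors; dt is the
    time between samples in seconds.
    """
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(prev_dir, curr_dir))))
    return math.degrees(math.acos(dot)) / dt

def redirect(scene_yaw, prev_gaze, curr_gaze, dt, turn_sign):
    """If a saccade is in flight, hide a tiny scene rotation inside it.

    turn_sign (+1 or -1) picks which way the scene rotates; during
    fixations the scene is left alone so the user never sees it move.
    """
    if gaze_velocity(prev_gaze, curr_gaze, dt) > SACCADE_VELOCITY_THRESHOLD:
        scene_yaw += turn_sign * ROTATION_PER_SACCADE
    return scene_yaw
```

The key property is that rotation only ever happens while the eye is mid-jump, which is exactly the window in which the visual system suppresses what it sees.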
Video Credit to: Road to VR via YouTube
Tricking The Eyes, Mind, Body
The saccade tracking tech does a great job of tricking the mind and body into believing the room you're in, and the game you're playing, is more expansive than it actually is. This could be a great solution for gyms, arcades, and location-based venues that have limited space available but still want their clients and customers to get a fully satisfying, immersive room-scale VR experience through natural movement.
Guidance systems aren't only for small spaces; warehouse-scale VR experiences like Zero Latency already use Change Blindness Redirection. Multiple players put on wireless gaming backpacks and headsets and use tracked controllers to fight zombies as a team while being guided around each other, each believing the others are somewhere far away on the game's map.
Shifting the virtual environment in small increments while users are distracted by viewing, interacting, and walking through virtual worlds convinces the eyes and brain that the user is covering more ground, walking in a straight line, or twisting and turning through corridors and maps, when they're really walking in circles. These subtle scene manipulations aren't noticeable from the viewer's perspective, and they lead the user to change course whenever they get too close to other players, objects, or walls.
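As a rough back-of-the-envelope sketch of the "walking in circles" effect, the snippet below accumulates tiny per-saccade heading nudges while a user walks what they believe is a straight line. The saccade rate, rotation size, and walking speed are all assumptions for illustration, not measurements from the research.

```python
import math

SACCADES_PER_SECOND = 3      # assumed rate of glancing around
DEGREES_PER_SACCADE = 0.14   # assumed unnoticeable rotation per saccade
WALK_SPEED = 1.0             # meters per second

def physical_path(seconds):
    """Physical (x, y) positions while the user walks 'straight' in VR.

    Each saccade injects a small scene rotation, which the user
    reflexively compensates for, bending their real-world heading.
    """
    x = y = 0.0
    heading = 0.0  # radians; drifts a little with every saccade
    points = [(x, y)]
    dt = 1.0 / SACCADES_PER_SECOND
    for _ in range(int(seconds * SACCADES_PER_SECOND)):
        heading += math.radians(DEGREES_PER_SACCADE)
        x += WALK_SPEED * dt * math.cos(heading)
        y += WALK_SPEED * dt * math.sin(heading)
        points.append((x, y))
    return points
```

Under these assumed numbers the real heading drifts about 0.42° per second, so the physical path curves gently sideways while the virtual path stays perfectly straight.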
Lang goes on to explain that the tracking system designed by researchers Anjul Patney and Qi Sun uses a "GPU accelerated real-time path planning system" that builds these changing pathways around users as they go. He adds that the system "can account for objects newly introduced into the real world environment (like a chair), and can even be used to steer users clear of moving obstacles, like pets or potentially even other VR users inhabiting the same space." How external object detection or inter-player tracking is achieved isn't known to us, but it's still a really cool concept.
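We can only guess at the planner's internals, but a toy version of its core steering decision might look like this: find the nearest tracked obstacle and pick the rotation direction that curves the user's redirected path away from it. Everything here is a hypothetical simplification of the GPU path planner the article describes.

```python
import math

def rotation_sign(pos, heading, obstacles):
    """Return +1 to curve the user left (counterclockwise) or -1 to curve
    right, steering away from the nearest obstacle.

    pos and obstacles are (x, y) positions in meters; heading is the
    user's walking direction in radians. A toy stand-in for a real planner.
    """
    nearest = min(obstacles, key=lambda o: math.dist(pos, o))
    dx, dy = nearest[0] - pos[0], nearest[1] - pos[1]
    # 2D cross product of the heading vector with the obstacle direction:
    # positive means the obstacle sits to the user's left, so curve right.
    cross = math.cos(heading) * dy - math.sin(heading) * dx
    return -1 if cross > 0 else 1
```

A real system would also weigh walls, other players, and predicted motion, but the basic shape of the decision, rotate away from whatever is closest, stays the same.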
Reflecting On The Tech
After reading the article I was left thinking about how the tracking system moves users around at room scale as they walk, and how using controllers would play into this. Not everyone has a large room-scale setup, so does the system also account for controller input and teleportation? Surely the time between pressing the teleport button and arriving at the spot you've selected should factor into the pathway the system is generating as well.
HTC Vive and Vive Pro headsets can be tracked wirelessly with the TPCast add-on, but real-time object tracking is a whole different ballgame. We doubt the researchers are sticking trackers on everything in sight, so how do they do it? And if my dog is wandering all over the place, how can the system keep redirecting me around it without me ever noticing?
The article mentions that the tracking system is "applicable to VR content" but doesn't list which content or games it could be used with. Learning more from the researchers about which games and experiences they are testing the guidance and tracking system with would be helpful.
The future is bright when it comes to eye tracking and manipulating the virtual spaces around us. Imagine the relief of never having to keep tabs on your boundary lines again and not worrying that you’re going to run into someone or hit something with your controller or hand.
This kind of blossoming technology works within whatever space it's been given and can turn any room, large or small, into a world. We can't wait to see which games and experiences will be paired with saccadic eye tracking and course redirection.