It feels like ages since 3D, VR and AR first appeared in our everyday lives - on our smartphones, on our computers and consoles, in our cinemas - but in reality these technologies have matured in just a handful of years, and not without a few hitches along the way.
This limitation was caused by the sensors the first headsets were equipped with and by their tracking algorithms. The risk was running into "drift": as we moved through space, the headset could not keep exact track of our trajectory, producing the drift effect shown in figure 2, which over time degraded the accuracy of the spatial tracking.
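To get a feel for why drift happens, here is a toy one-dimensional sketch (not the actual sensor-fusion code of any headset - the names, rates and bias values are all illustrative): when orientation is estimated by naively integrating angular-velocity readings, even a tiny constant sensor bias accumulates into an ever-growing error.

```python
import random

def simulate_drift(steps, dt=0.01, bias=0.02, noise=0.1):
    """Toy illustration: heading error from integrating biased gyro readings."""
    rng = random.Random(42)
    true_heading = est_heading = 0.0
    for _ in range(steps):
        true_rate = 1.0                          # the user turns at 1 rad/s
        measured = true_rate + bias + rng.gauss(0, noise)  # biased, noisy sensor
        true_heading += true_rate * dt           # what actually happened
        est_heading += measured * dt             # naive integration: drifts
    return abs(est_heading - true_heading)

print(simulate_drift(100))    # heading error after 1 second
print(simulate_drift(10000))  # after 100 seconds the error is far larger
```

The longer the session, the larger the gap between the estimated and the real orientation - exactly the effect the first headsets struggled with.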
This limitation is acceptable when viewing 360° images, but it becomes frustrating if we want to move around and explore entire virtual worlds. To walk through virtual spaces with the first VR headsets it was therefore necessary to rely on special joysticks or more futuristic solutions such as omnidirectional treadmills (like the Virtuix Omni).
A few years and two tracking techniques later, we were finally freed from those cumbersome devices and could experience 6DoF (six degrees of freedom): in addition to rotating our head, we could now translate through space - forward and back, side to side, and up and down.
For the math geeks, here are some nerdy details on how this tech innovation came to be. It all starts with quaternions: a mathematical entity introduced by William Rowan Hamilton in 1843 as an extension of complex numbers. Quaternions provide a convenient notation for representing orientations and rotations of objects in three dimensions (Figure 3). Compared to Euler angles, they are simpler to compose and avoid the problem of gimbal lock. Compared to rotation matrices, they are more numerically stable and often more efficient.
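For the curious, here is a minimal sketch of how quaternions rotate a point, using only the standard textbook formulas (the helper names are our own, not from any VR SDK): a rotation is encoded as a unit quaternion, rotations compose via the Hamilton product, and a vector v is rotated as q·(0, v)·q⁻¹.

```python
import math

def quat_from_axis_angle(axis, angle):
    # Unit quaternion (w, x, y, z) for a rotation of `angle` radians
    # about `axis`, which must be a unit vector.
    s = math.sin(angle / 2)
    return (math.cos(angle / 2), axis[0] * s, axis[1] * s, axis[2] * s)

def quat_mul(a, b):
    # Hamilton product: composes two rotations (apply b, then a).
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate(q, v):
    # Rotate vector v by unit quaternion q: v' = q * (0, v) * conj(q).
    w, x, y, z = q
    conj = (w, -x, -y, -z)
    r = quat_mul(quat_mul(q, (0.0, *v)), conj)
    return r[1:]

# A 90° rotation about the z-axis sends (1, 0, 0) to (0, 1, 0).
q = quat_from_axis_angle((0, 0, 1), math.pi / 2)
print(rotate(q, (1, 0, 0)))  # ≈ (0.0, 1.0, 0.0)
```

Because composing two rotations is just one quaternion multiplication (and renormalizing a quaternion is cheap), this representation is well suited to updating a headset's orientation hundreds of times per second.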
We are now ready to delve into the techniques for tracking a body in virtual reality. Let's start with "outside-in tracking": a technique that relies on external sensors positioned around the room to keep track of our position - as shown in figure 4.
The second tracking technique used in VR is called "inside-out tracking": by mounting cameras (even low-resolution ones) on the headset itself and using image-recognition algorithms, it is possible to obtain the same effect as outside-in tracking (figure 5). A headset that already uses this technique is the Oculus Quest, whose cameras will soon also be able to recognize our fingers, freeing us from controllers and gloves.
The same algorithms that power inside-out tracking also allow us to create mixed-reality experiences on our smartphones.