Adapted from Pixabay by Dori Grijseels
Your brain isn't the same in virtual reality as it is in the real world
VR is widely used to study the brain, but it isn't the same as real life — and this has real-world consequences
Virtual Reality (VR) is not just for video games. Researchers use it to study the brains of all kinds of animals: bees, fish, rats and, of course, humans. Sadly, this does not mean that the bees get tiny VR headsets. Instead, the setup often consists of either ordinary computer screens surrounding the subject, or a special cylindrical screen. VR has become a powerful tool in neuroscience because it offers advantages that let researchers answer new questions about the brain.
For one, the subject does not have to physically move for the world around them to change. This makes it much easier to record brain activity: techniques such as functional magnetic resonance imaging (fMRI) can only be used on stationary subjects. With VR, researchers can ask people to navigate a virtual world by pressing keys while their head stays still, which allows the researchers to image their brain the whole time.
Researchers can also control a virtual environment much more precisely than they can control the real world. They can place objects exactly where they want, and they can even manipulate the environment during an experiment. For example, neuroscientists from Harvard University changed how much effort zebrafish had to put into swimming to travel the same distance in VR, which caused the fish to adjust how strongly they beat their tails. Using this experiment, the researchers determined which parts of the zebrafish brain are responsible for controlling swimming behavior, a manipulation they could never have performed in the real world.
If you've ever experienced VR, you know that it is still quite far from the real world. And this has consequences for how your brain responds to it.
One of the issues with VR is the limited number of senses it works on. Often the environment is only projected on a screen, giving visual input, without the subject getting any other inputs, such as touch or smell. For example, mice rely heavily on their whiskers when exploring an environment. In VR, their whiskers won't give them any input, because they won't be able to feel when they approach a wall or an object.
Another issue is the lack of proprioception, the feedback you get from your body about the position of your limbs. Pressing a button to walk forward is not the same as actually moving your legs and walking around. Similarly, subjects won't have any input from their vestibular system, which is responsible for balance and spatial orientation. This is also the reason some people get motion sickness when they are wearing VR headsets.
When VR is used for animal studies, the animals are often "headfixed," meaning they cannot turn their head. This is necessary for using a microscope to look at the cells in their brain. However, it poses a problem, especially for navigation: animals rely on a special type of neuron, called a "head direction cell," which tracks the orientation of the animal's head. When a mouse can't move its head, its head direction cells can't do their job.
These missing inputs matter most for cells in the hippocampus. That is the part of your brain responsible for navigation, and so it relies heavily on inputs that give you information about your location and your direction.
Neurons talk to each other through electrical signals called action potentials, or spikes. The number of spikes per second, called the "firing frequency," is an important measure of how much information is being sent between neurons. A 2015 study found that, in VR, the firing frequency of neurons in a mouse is reduced by over two thirds, meaning that the cells don't send as much information.
The same study also showed that the cells are less reliable. The researchers specifically looked at place cells, cells that respond to a particular location in the environment and are incredibly important for navigation. In the real world, these cells send spikes about 80% of the time that the animal is in a particular location. In VR, this drops to about 30%, so when an animal visits a location ten times, the cells spike during only about three of those visits. This means the animals are less certain about their exact location.
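The drop in reliability is easy to picture with a toy simulation. This is a sketch for intuition only: the 80% and 30% figures come from the study above, but the coin-flip model and the function name are illustrative assumptions.

```python
import random

def simulate_place_cell(reliability, n_visits, seed=0):
    """Count how many visits to the cell's preferred location
    actually trigger spiking, given a per-visit reliability."""
    rng = random.Random(seed)
    return sum(rng.random() < reliability for _ in range(n_visits))

# Over many visits, the fraction with spikes settles near the
# per-visit reliability: ~80% in the real world vs ~30% in VR.
n_visits = 10_000
real_world = simulate_place_cell(0.80, n_visits)
vr = simulate_place_cell(0.30, n_visits)
print(f"Real world: {real_world / n_visits:.0%} of visits with spikes")
print(f"VR:         {vr / n_visits:.0%} of visits with spikes")
```

A downstream brain area reading out these cells gets far fewer consistent location signals in VR, which is one way to understand why the animal's sense of position becomes less certain.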
Brainwaves, or neural oscillations, are another important feature of brain activity. They reflect the combined activity of many neurons, rising and falling at a regular rhythm. Theta oscillations, brainwaves at a frequency of 4-7 Hz, play an important part in navigation. Interestingly, scientists found that rats' theta oscillations have a lower frequency in VR than in the real world. This effect on oscillations is not limited to navigation tasks; it was also found in humans who played golf in the real world and in VR. It is most likely caused by the lack of vestibular input, but scientists are still unsure what the consequences of such frequency changes are.
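To make "a lower oscillation frequency" concrete, here is a minimal sketch that recovers the dominant frequency of a synthetic theta-band signal. The 7 Hz and 6 Hz values are illustrative choices within the 4-7 Hz theta band, and the sampling rate is assumed; none of these are the studies' measured numbers.

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the frequency (in Hz) with the largest spectral power."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[0] = 0.0  # ignore the constant (DC) offset
    return freqs[np.argmax(spectrum)]

sample_rate = 250  # Hz; a plausible recording rate, assumed for this example
t = np.arange(0, 10, 1.0 / sample_rate)   # 10 seconds of samples
real_theta = np.sin(2 * np.pi * 7.0 * t)  # real-world-like theta rhythm
vr_theta = np.sin(2 * np.pi * 6.0 * t)    # slightly slower, VR-like rhythm

print(dominant_frequency(real_theta, sample_rate))  # ≈ 7.0
print(dominant_frequency(vr_theta, sample_rate))    # ≈ 6.0
```

Real recordings are far noisier than a clean sine wave, but the same spectral analysis underlies how researchers measure the theta slowdown between VR and the real world.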
We should be critical when interpreting results from neuroscience studies that use VR. Although VR is a great tool, it is far from perfect, and it affects the way our brain acts. We should not readily accept conclusions from VR studies without first considering how the use of VR in that study may have affected those conclusions. Hopefully, as our methods get more sophisticated, the differences in brain activity between VR and the real world will also become smaller.
This is a nice piece about some interesting implications of using VR in research. I like that you mentioned the limited sensory information available in VR environments. Besides the lack of sensory information from non-visual cues alone, I think it’s also important to consider how multiple sensory systems work together in the real world (and how these interactions are lost in VR). For example, in the real world we often see and touch objects at the same time, and the integration of these two senses can help us get a more complete picture of the objects in our environment. But in VR, you can’t physically pick up and feel the objects that you see. It might be interesting to study how that discrepancy affects the way the brain responds in these different contexts. Perhaps the lack of multisensory interaction partially explains why VR can be so captivating to the brain, as the brain might be surprised by the missing information it expects.