It’s been a long time coming, but virtual reality technology is finally here. After years of science fiction and pop-culture predictions about fully immersive gameplay, it seems like every major tech company is coming out with their own hardware and software.
It doesn’t just have to be used for video games. Many people are looking into the other kinds of entertainment, as well as the educational and social applications of VR technology. Even with just the technology that’s currently available, VR allows for full immersion in a new perspective that would otherwise have never been accessible.
The spectrum we ended up seeing makes sense from an evolutionary standpoint, since it covers a lot of what our sun is pumping out. But that’s not the whole picture, especially when you compare it to what some animals can see. Maybe one of these days VR technology will let us experience how animals view our world.
Let’s imagine a new product at CES 2020.
I recently got to experience how some animals see, thanks to a demo of the Animoculus Rift at CES 2020. It’s a collection of games developed by a company with the same name, designed for a variety of VR hardware. It takes advantage of current game engines and graphical capabilities to give a user an experience like they’re seeing the world through the eyes of an animal. Eventually they want the ability to generate interactive experiences from the perspective of almost any animal, but for now they just have a few examples of their software.
310 Degrees of Legally Blind
The first game I played was called Deer Chase. Imagine a hunting game turned on its head. Instead of a human hunter trying to shoot at animals, I was a white-tailed deer trying to navigate the forest, avoid hunters and predators, and look for food, all at the same time.
The first thing I had to get used to was the focus. A white-tailed deer has roughly 20/200 vision, which is about the threshold at which a person is considered legally blind.
It wasn’t just evenly blurry either; the top and bottom of my field of view were extremely blurry, with a bit more focus through the middle. The game developers did this to replicate the horizontal pupil that deer actually have.
Another challenging aspect of the game was the colors. While human eyes have photoreceptors for red, green, and blue light, deer and many other animals are sensitive to only blue and green. So, there were the blue sky and the greens of the forest, but any other colors just became murkier greens or slightly yellow.
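A renderer could fake this kind of dichromatic view with a simple per-pixel channel mix. This is only an illustrative sketch with made-up weights, not anything the developers described:

```python
def deer_filter(r, g, b):
    """Rough dichromatic (blue-green) approximation of deer vision.

    With no red-sensitive photoreceptor, reds get folded into the
    green signal as murky yellow-greens; blue passes through.
    Channels are floats in [0, 1]; weights are illustrative only.
    """
    green = min(1.0, 0.6 * g + 0.4 * r)  # reds read as dim greens
    return (green, green, b)             # no independent red signal
```

Running a pure red through this filter yields a dim gray-green, which matches how the game turned warm colors into murk.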
White-tailed deer, along with many species that are generally prey, have eyes mounted on the sides of their heads, pointing outward. This gave me a 310° view of the world, but it was extremely difficult to get used to, since my brain is wired for processing binocular vision.
An advantage I had, however, was at night. If you’ve ever seen an image of certain animals at night, you’ll notice that their eyes seem to glow. That’s because of something called the tapetum lucidum, a membrane at the back of the retina that reflects light.
Basically, the tapetum lucidum causes twice as many photons to hit the photoreceptor cells in the retina. So, things looked quite desaturated, but still relatively vivid compared to what a human would see at night, thanks to the recycling effect of the reflective membrane.
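As a toy model, the extra exposure is just the forward pass plus whatever the membrane reflects back, assuming for simplicity that every reflected photon passes the receptor layer again:

```python
def receptor_exposure(photons, reflectance=1.0):
    """Photons passing the photoreceptor layer.

    One forward pass, plus a second pass for whatever the tapetum
    lucidum bounces back. A perfect reflector (reflectance=1.0)
    doubles the exposure, matching the "twice as many photons" claim.
    """
    return photons + photons * reflectance
```

With a less-than-perfect reflector the gain drops proportionally, which is why real night-shine varies between species.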
The Eagle Has Landed
After the panic and frenzy of that experience, I was certainly ready for the next game: Eagle Flight Simulator. Right away, it was hugely different from the vision of a deer.
The first major difference was color. After all the blues and greens of deer vision, it was refreshing to see vibrant, full color. In fact, it went even beyond full color, since research suggests that eagles, like many other birds, are also sensitive to ultraviolet light.
The game itself, as you can imagine, was mainly about flying around. But since this is Eagle Flight Simulator, you are also hunting rabbits. It took me a little while to get used to the controls. Once you’re in the air, flying is relatively intuitive, but landing, taking off, and actually catching prey were a bit trickier.
Some of the controls also directly related to my vision. The headset I was using tracked my head movements and amplified them so I could look around in flight. There was also a heads-up display that allowed me to target certain objects and zoom in on them within my field of view.
This broke my immersion somewhat, but the developers explained that it was necessary. An eagle has roughly five times the photoreceptor density of the human eye. The display in my headset wasn’t the limiting factor; my eyes were. Selectively zooming in on certain objects was the only way they could think of to demonstrate that enhanced clarity.
I could also select up to two targets to zoom in on. This was meant to reflect the fact that an eagle’s eye has two patches of concentrated photoreceptors, called foveae, whereas a human eye has only one. Once I got used to the heads-up display and controls, it was an extremely fun and exhilarating experience.
Catch of the Day
The next game was called Aqua Explorer. This one had the least actual gameplay and was meant more to show off the graphical capabilities of the system. That’s probably a good thing, because just navigating the environment was complex enough with the two types of vision available.
Aqua Explorer had a cuttlefish mode and a mantis shrimp mode, chosen for those animals’ interesting types of eyesight. The developers highly recommended that I start with cuttlefish mode, and once we get to mantis shrimp mode you’ll see why.
I didn’t know too much about cuttlefish, except that they had weird-looking eyes and could camouflage themselves like an octopus. What I didn’t know is that they actually can’t see color; no cephalopod can. What they can see, however, is polarization. When light comes from the sun or many other sources, it is unpolarized; the electromagnetic waves aren’t oscillating in any particular orientation.
When light bounces off or passes through certain materials, like water, it can become polarized in a particular orientation. The closest humans get to detecting this is when a reflection is particularly glaring. That’s why sunglasses that filter out polarized light reduce glare.
In cuttlefish mode, the developers had a simple solution for letting the player visualize polarized light. The primary image was black and white, since, again, cuttlefish can’t see color, and colors were overlaid on top to represent the angle of the polarized light. This makes sense, because it’s thought that detecting polarization gives cuttlefish definition and contrast similar to what color vision would provide.
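One way to sketch that overlay scheme: polarization angle is periodic over 180°, so it maps naturally onto the hue wheel, blended with the grayscale base image. The 50/50 blend ratio is my own assumption, not the developers’ actual rendering:

```python
import colorsys

def polarization_overlay(luminance, angle_deg):
    """Tint a grayscale pixel by polarization angle.

    Polarization repeats every 180 degrees, so the angle maps onto
    the hue wheel; the result is blended 50/50 with the grayscale
    base image. Returns an (r, g, b) tuple of floats in [0, 1].
    """
    hue = (angle_deg % 180) / 180.0          # angle picks the color
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return tuple(0.5 * luminance + 0.5 * c for c in (r, g, b))
```

Note that 0° and 180° produce the same tint, just as those two polarization angles are physically the same orientation.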
I was certainly getting a lot of information with this kind of vision; I just didn’t necessarily know what to do with it. One interesting thing the game developers added to this mode was other cuttlefish. These computer-controlled cephalopods displayed shifting patterns of polarization around their faces and tentacles. It’s thought that these patterns, which only a few organisms can see, might be a form of hidden communication or signaling between cuttlefish in certain situations.
Being able to see polarization instead of color was certainly interesting, but it did not prepare me for the acid trip that was mantis shrimp mode.
If you’ve ever read anything else on interesting animal eyes, you may be familiar with the mantis shrimp already. They have, arguably, the most sophisticated eyes in the animal world and let me tell you, experiencing them is a lot different from reading about them.
First, there are the colors. While humans have three different types of color photoreceptors, a mantis shrimp has 12. However, that doesn’t actually give them access to an unprecedented range of different hues and shades. For one, most of those extra primary colors are in the ultraviolet range of the spectrum.
The second major difference is that a mantis shrimp’s eyes don’t appear to blend colors the way most other eyes do. Instead, the large variety of photoreceptors preprocesses a lot of the visual information so the brain doesn’t have to. To represent this, the developers made the image extremely high contrast and let only 12 colors show through. It was like the entire world had become a psychedelic Obama ‘Hope’ poster.
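That limited-palette look is essentially posterization. Here is a minimal per-channel sketch; the exact banding scheme is my guess, not the developers’ actual shader:

```python
def posterize(channel, levels=12):
    """Snap a channel value in [0, 1] to one of `levels` flat bands.

    Collapsing smooth gradients into a handful of hard-edged bands
    produces the high-contrast, limited-palette look described.
    """
    band = min(int(channel * levels), levels - 1)
    return band / (levels - 1)
```

Applied to all three channels of every pixel, the whole scene snaps to a small fixed palette, which is exactly the ‘Hope’-poster effect.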
That would’ve been disorienting enough, but each eye can also move independently, and the game let me adjust that through the controller. Each eye is also a compound eye, like an insect’s, so if I focused carefully, I could see that the image was constructed from a series of very small dots. This was to make it stand out from the other modes, which simulate eye structures more similar to a human’s.
Oh, and did I mention that each eye is actually three eyes? Yeah, each of my eyes was receiving three different feeds at the top, across the middle, and towards the bottom of my field of view. Each one had a slightly different depth of field and slightly different color perception.
And the icing on this disorientation cake was my old friend polarization. Unlike in cuttlefish mode, color was already being used to represent, well, color. Since mantis shrimp can detect both linearly polarized light, like a cuttlefish, and circularly polarized light, the game developers again had to rely on a heads-up display.
Different portions of my field of view were vaguely highlighted and had a symbol to represent the angle or direction of polarization. Overall this experience made it extremely difficult to navigate, but it certainly taught me how much information these sophisticated eyes are absorbing.
From what I’ve seen, most of these experiences will find uses mainly as educational tools rather than consumer games, although I don’t doubt that some of these vision mechanics could be successfully incorporated into more conventional games. Eagle Flight Simulator could be developed into something the average person would play. Right now, though, the developers’ focus is on refining how they relay animal vision.
An experimental honey bee mode they are working on has a bullet-time mechanic to represent the higher effective frame rate at which small animals generally perceive the world compared to larger animals.
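That bullet-time factor could be as simple as the ratio of the player’s display rate to the creature’s perceived rate. Both figures in the example below are hypothetical stand-ins, not numbers from the developers:

```python
def bullet_time_scale(display_hz, creature_hz):
    """Game-time multiplier for a faster-seeing creature.

    Slow world time so each displayed frame covers the same slice of
    world time as one 'frame' of the creature's faster vision. A
    value below 1.0 means the game runs in slow motion.
    """
    return display_hz / creature_hz
```

For instance, a 60 Hz headset simulating a creature that effectively sees at 240 Hz would run the world at quarter speed.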
The developers even mentioned the possibility of a plant game where you can perceive the general location of the sun through photosynthesis, although I wasn’t sure if they were joking.
Ultimately, this is certainly a very interesting experiment into what VR can do. Animals and other organisms seem to experience a completely different universe than we do, and it’s possible that we can learn from that. So, what animals would you like to see the world as?