Media enthuse about the latest gear or games employing virtual reality or augmented reality; scholarly treatises pontificate about how virtual reality and grounded reality interact. Then there’s something called mixed reality…

It’s easy to get lost among the realities. Here’s a primer to sort out the terms and understand how each might apply to eLearning.

The real, physical world is called “grounded reality.” This article explains the “other” realities in relation to grounded reality.

Immersive experiences and VR

The lure of virtual reality, or VR, is immersing learners in an environment. That environment can be created using 360-degree video, be entirely computer-generated, or combine video with computer-generated elements.

Most learners have probably encountered 360-degree video online; it shows up on real estate listings, virtual college-campus tours, and much more. While it’s possible to view 360-degree video on a computer screen, it is best viewed with a VR headset. Either way, learners can examine a complete scene—up, down, in front of, and behind the learner’s position—and get a far more complete picture than still photos or ordinary video provide. Journalists who’ve used 360-degree video in their storytelling say that news stories presented in this format have greater emotional impact than text, ordinary video, or even interactive packages because the audience has a greater sense of being on the scene. But the “scene” is still something that exists in grounded reality, and the audience can only look at it, not interact with it.

That’s where VR comes in. VR immerses participants in a different world—often a world that doesn’t actually exist. The environment is a digital creation, hence “virtual” reality, though it can be a digital re-creation of an actual place. Whether based on a real place or entirely fictional, the environment feels real to learners immersed in it, and while there, they can interact with other avatars. According to Tobin Asher, the lab manager at Stanford University’s Virtual Human Interaction Lab (VHIL), people treat avatars—whether other “people” or fictional creatures encountered in a VR environment—as real, even if the graphics quality is not top-notch and the avatars don’t look realistic. Learners also behave as they would in a real space: They are unwilling to walk through a virtual wall, for example, or to approach another avatar too closely.

That’s because of two key elements of VR that make it feel like an actual environment: tracking and rendering. Tracking is the software’s ability to “know” where the learner is in the real environment by tracing the person’s movements. Rendering moves the avatar within the virtual world to reflect those movements in grounded reality. Accurate tracking and rendering create a strong feeling of “presence”—the person feels as if he is actually present in the virtual world. When the person moves, his avatar and the virtual world move along with him—the virtual world responds as a real environment would. Because the learner, or, rather, his avatar, can move through this virtual world and interact with the environment as well as with the objects and avatars within it, the brain is fooled into thinking it is real.
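For readers who think in code, here is a minimal sketch of that tracking-and-rendering loop in TypeScript. Every name in it (Headset, Camera, getPose, and so on) is a hypothetical stand-in rather than any real VR vendor’s API; the point is only the shape of the loop: read the learner’s tracked pose, move the virtual camera to match, and redraw.

```typescript
// Hypothetical types standing in for whatever a real headset SDK provides.
interface Pose {
  position: { x: number; y: number; z: number };               // meters, in room space
  orientation: { x: number; y: number; z: number; w: number }; // quaternion
}

interface Headset {
  getPose(): Pose; // tracking: where is the learner right now?
}

interface Camera {
  setTransform(pose: Pose): void; // rendering: mirror that pose in the virtual scene
}

// The core loop: tracking feeds rendering on every frame. Keeping the delay
// between the two small is what preserves the feeling of "presence."
function runFrameLoop(headset: Headset, camera: Camera, drawScene: () => void): void {
  const onFrame = (): void => {
    const pose = headset.getPose(); // 1. track the learner's real movement
    camera.setTransform(pose);      // 2. move the virtual viewpoint to match
    drawScene();                    // 3. redraw the virtual world from there
    requestAnimationFrame(onFrame); // repeat, typically 60-90 times per second
  };
  onFrame();
}
```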

A common VR demo scenario is walking across a (virtual) cavern on a narrow plank of (virtual) wood. Even though they “know” they are standing on a solid floor, people doing the “plank walk” feel terror; their hands sweat, and their bodies mount all the responses they would in a genuinely frightening situation. This does not happen when people watch a scary movie or participate in a role-playing game.

VR is making its way into professional and educational eLearning: NFL and college football coaches use VR so their players can practice plays over and over, building muscle memory and honing skills with little risk of injury and at considerably less cost than live practices. US Customs and Border Protection uses VR simulators to train officers, preparing them for active-shooter events and routine confrontations with members of the public. Google Expeditions, which uses Google Cardboard viewers and smartphones, enables schoolchildren to take virtual field trips around the world. And researchers at the VHIL are studying whether VR experiences can change people’s attitudes and behavior, with an eye to developing diversity training programs.

Augmented reality

All of that contrasts sharply with augmented reality, or AR, in which participants remain who—and where—they are, in grounded reality. An augmented reality tool lets them see additional information, objects, and even animated characters that seem to inhabit grounded reality. The objects aren’t there, at least not in any physical way; the object or information sits on a digital layer that appears on top of the learner’s view of the world.
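As a rough illustration of that digital layer, the sketch below (TypeScript again, using a deliberately simplified pinhole-camera model; none of it is a real AR toolkit’s API) projects one virtual anchor point into the live camera image and draws a text label at the resulting pixel, on top of the video frame.

```typescript
interface Vec3 { x: number; y: number; z: number; }

// Hypothetical camera intrinsics: focal lengths in pixels and the image center.
interface CameraModel { fx: number; fy: number; cx: number; cy: number; }

// Pinhole projection: map a 3-D point (in camera coordinates, z pointing
// forward, y pointing down) to the 2-D pixel where the overlay belongs.
function projectToScreen(p: Vec3, cam: CameraModel): { u: number; v: number } | null {
  if (p.z <= 0) return null; // behind the camera: nothing to draw
  return {
    u: cam.cx + (cam.fx * p.x) / p.z, // perspective divide by depth
    v: cam.cy + (cam.fy * p.y) / p.z,
  };
}

// Each frame: draw the real camera image first, then the virtual layer on top.
function drawAugmentedFrame(
  ctx: CanvasRenderingContext2D,
  videoFrame: CanvasImageSource, // the live view of grounded reality
  anchor: Vec3,                  // where the virtual object "is," hypothetically
  cam: CameraModel
): void {
  ctx.drawImage(videoFrame, 0, 0);         // grounded reality: the live image
  const pt = projectToScreen(anchor, cam); // where the overlay should appear
  if (pt) ctx.fillText("Connect the blue wire here", pt.u, pt.v); // the digital layer
}
```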

Augmented reality is a useful way to provide information and job aids: Google Glass, an unsuccessful attempt to bring this experience to consumers, has proven useful in industry. For example, Boeing successfully tested Google Glass as an aid to reduce errors and cut the time technicians need to create “wire harnesses” for aircraft—an endeavor with zero error tolerance. The techs wear the augmented reality “smart glasses” to access a “floating” display. They can look up diagrams and access instructions without having to stop what they are doing and turn to a computer screen. Using voice commands allows employees to keep their hands on their work, which often requires sorting and placing dozens of wires; preventing interruptions saves precious time and reduces the likelihood of errors.

Cydalion, an augmented reality app, provides navigation assistance to people with low or no vision. The augmentation is a layer of audio and haptic feedback that alerts the user to overhead obstacles, tripping hazards, and items nearby that they could walk into.

Augmented reality is also part of games like Zombies, Run! and Pokémon Go, which add a layer to—augment—players’ grounded reality experience. Each player sees the zombie or the Pokémon, feels the challenge of catching or eluding the fictional character—and changes her behavior accordingly. But people around her do not see the character (unless they are also playing), and, of course, there is no physical character.

Mixing it up

An experience that includes elements of augmented and virtual reality is called mixed reality. Its defining feature is that the virtual elements behave as if they were real. A variety of implementations are possible. For example:

  • Rather than overlay virtual objects on participants’ view of the world, a mixed reality experience might integrate the virtual objects with the grounded reality world, allowing learners to pick up or interact with them. For example, NASA uses the ProtoSpace tool to create a virtual Mars rover; a team of engineers can inspect it, walk around it, kneel and peek under it, and even poke their heads inside the vehicle—even though it doesn’t physically exist.
  • Learners might use avatars to enter a virtual space rather than immersing themselves in that world. An instructor could hold an online course in a virtual space created to resemble a lecture hall on a college campus or an office conference room, while the learners are physically dispersed around the world! The learners interact with the instructor and with one another, collaborate on assignments, and even sit in their favorite seat during every virtual session—even though they are at the office or even at home, wearing their pajamas.

A final term that might get tossed into the mix is alternate reality, an experience that takes place entirely within grounded reality. An alternate reality game or simulation offers realistic—and real-life—experience solving problems, generally using the Internet and information, items, and locations in grounded reality. The problem posed might be fictional; the problem might be real but the solution fictional; or the entire experience could be a simulation. Unlike typical eLearning simulations, though, the outcome of an alternate reality experience is not preordained. The participants can influence or even determine the results.

Whichever reality an eLearning tool inhabits, learners can benefit from the enhancements and increased opportunities for interaction and engagement.

References

Boeing. “You Can See the Difference.” Video, 2015.
http://video.boeing.com/services/player/bcpid4029272025001?bckey=AQ~~,AAAAukPAlqE~,oAVq1qtdRjzvprzv7b_de_OOilq0kg4d&bctid=4051927834001

CNN. “This Is So Like ‘The Matrix.’” Video excerpt from Morgan Spurlock Inside Man, season 2, episode 2, aired 20 April 2014. Stanford University Virtual Human Interaction Lab.
http://vhil.stanford.edu/news/2014/this-is-so-like-the-matrix/

Dennis, Danfung, Nonny de la Peña, Gabo Arora, and Chris Milk. “Where Virtual Reality Takes Us.” New York Times. 21 January 2016.
http://www.nytimes.com/2016/01/21/opinion/sundance-new-frontiers-virtual-reality.html

Hogle, Pamela. “A New Take on Augmented Reality: Cydalion Navigation App Aids People with Low Vision.” Learning Solutions Magazine. 8 November 2016.
/articles/2129/

Hogle, Pamela. “Blending Fantasy with Reality Drives Successful Alternate Reality Games.” Learning Solutions Magazine. 28 September 2016.
/articles/2070/

Senese, Mike. “NASA Shapes the Future of Space Design and Exploration with its Mixed Reality Program.” Make. 19 July 2016.
http://makezine.com/2016/07/19/rockets-rovers-mixed-reality/

US Customs and Border Protection. “Northern New York and Vermont Media Take a Shot at New CBP Virtual Training System.” 1 July 2016.
https://www.cbp.gov/newsroom/local-media-release/northern-new-york-and-vermont-media-take-shot-new-cbp-virtual-training