It’s easy to distinguish bad ideas from good ones—in retrospect. Too bad it isn’t so easy before we start. Often our first steps with a new technology land on the wrong foot. That’s why every large corporation has a team assigned to something called New Media or Emerging Technologies or Advanced Platforms. The team is very clever, very small, and very brave. Their job is to make the rookie moves. The rest of the company can learn from their mistakes. So can we.

Brandt Dargue works in Boeing Research and Technology’s Advanced Learning Center, which is charged with applying new technologies to certain perennial learning problems. I am always eager to see what Brandt is doing, so I was happy to join him recently in an Atlanta restaurant.

Before dinner, Brandt reaches into his case and pulls out a HoloLens. This is Microsoft’s mighty (and mighty expensive) augmented reality device.

“Check this out,” he says, tightening the headstrap to fit me. Although some complain about the hefty HoloLens, I find it comfortable and well balanced. A bit over one pound, it weighs no more than the average bicycling helmet.

The HoloLens is a direct-view augmented reality device. I look through its clear visor at the restaurant environment. On our table, a miniature ballerina magically twirls between our glasses of iced tea. This dancer is standard HoloLens demoware.

Brandt tells me to make the “bloom” hand gesture (like releasing a bird) to bring up the menu and then launch his demo with the “select” gesture (like firing a spray can). I’ve done these before, but it still takes a few tries to launch his first app.

We review a cornucopia of demos and experiments: synthesized planes, virtual factory tours, and more. Some features work great. But not all of them.

The Boeing group commissioned a functional cockpit simulator using the HoloLens. A trainee might use such a simulation to monitor the aircraft’s many instruments and operate its controls, all of which surround me as we launch the demo.

Brandt isn’t happy with the final user experience. The simulator substitutes mid-air gestures for the solid feel of real sticks, switches, and sliders. Instruments are selected by turning one’s head. The selection is tedious, and the air gestures feel unsatisfying and tiring. But these annoyances might be chalked up to the cost of doing business in virtuality.

A more fundamental problem is field of view. Look-through augmented reality gear like the HoloLens superimposes digital images onto normal scenes, not unlike a heads-up display. But for reasons that are stubbornly rooted in the laws of physics, the digital image is limited in size (about 30 degrees wide and 17 degrees high). Hold your smartphone, in landscape mode, a foot in front of your nose. (Or, more precisely, at a distance of about twice its screen width.) That’s the HoloLens digital display area. Notice how much real reality lies outside that patch of pixels? Some folks have faith that upcoming HoloLens models will have a larger field of view. Maybe—if Microsoft first achieves a fundamental breakthrough in optics. Other fans say, don’t worry—you just get used to the constraints.
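The smartphone analogy, by the way, checks out with ordinary trigonometry. The short Python sketch below is my own back-of-the-envelope arithmetic, not a figure from Microsoft or Boeing, and the 13-centimeter phone width is an assumed round number.

```python
import math

# Back-of-the-envelope check of the smartphone analogy (my own sketch, not a
# Microsoft spec). A flat rectangle of width w, viewed face-on from distance d,
# spans an angle of 2 * atan(w / (2 * d)).
def angular_width_deg(width_m, distance_m):
    return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

phone_width = 0.13  # meters; an assumed ~6-inch phone held in landscape

print(angular_width_deg(phone_width, 2 * phone_width))  # "twice its screen width": about 28 degrees
print(angular_width_deg(phone_width, 0.30))             # about a foot away: roughly 24 degrees
```

Either way, the phone spans something in the neighborhood of that roughly 30-degree digital window, which is why the analogy holds.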

And getting used to the constraints almost works for, say, the ballerina. I could keep her entirely within the active window while I studied her from a careful distance.

But it does not work for the cockpit. You can’t see the full simulation unless you turn your head methodically, painting the room with your tunnel-vision view. You never see more than a patch at a time. Only a very limited immersion is achieved.

It is limited, too, as a learning experience. Pilots must learn to monitor dozens of gauges all around them. They tune up their peripheral vision to catch a warning out of the corner of the eye and then shoot a quick glance at the side panel. But wearing a HoloLens, a trainee must actually turn her head, not just shift her eyes, to see what is happening anywhere off axis.

Does this mean the HoloLens is useless for training? Absolutely not. Let’s consider other experiments.

Karthik Muthiah is a creative technologist working with the Greater Atlanta Christian School. The limited immersion that cripples the cockpit simulator is actually useful in Karthik’s application. Working with GACS biology teachers, Karthik built an interactive display of DNA’s double helix structure focused on its component base pairs. The model is small enough to fit into the active window of the HoloLens.

Karthik’s daughter and her classmates stand in an arc, each wearing a HoloLens. They all see, in the middle of the arc, the same DNA model—while the teacher explains its features (Figure 1). They can play CRISPR-like games, performing together a little manual gene splicing on the shared model. (Editor’s note: CRISPR is a genome-editing tool.) Or the teacher can let them spawn their own separate augmented realities, all sharing the same real classroom environment but each with their own instance of the model, oriented optimally, and ready to mod as they see fit.

 

Figure 1: Five students in a network interact with a shared model of DNA structure in this (tightly cropped) view from a sixth HoloLens

In other applications, real reality is more than a stabilizing background—it is the star of the show. Brandt described a demonstration at last year’s I/ITSEC conference, where Boeing showed a prototype HoloLens system for maintenance trainees. The trainees see augmented instruction diagrams embedded directly into the aircraft components they are working on. This is a classic application of augmented reality. But Boeing took it one step further.

At their demo in Orlando, Florida, Boeing technologists piped the video feed from the HoloLens’s forward-facing camera to remote subject-matter experts in various offices across the country. These experts served as tele-coaches. Each trade show attendee put on the HoloLens visor, picked up a tool, and stood in front of an actual mechanical assembly. He saw an augmented maintenance diagram that pointed to the next test point in the part he was supposed to inspect. He applied the real tool to the indicated point on the real part. In his headphones, he could hear a live maintenance expert cry out: “That is the right tool, but you have the red probe on the wrong pin.”

Sometimes the problem you are solving is a good fit for the tool you are exploring. But you will probably learn more from the cases where it is not. As Brandt reminded me: “If all your experiments are total successes, you don’t know how to properly design or conduct an experiment.”

In any case, all three of these examples offer valuable landmarks as we start to map out the contours of successful augmented learning.