“Everybody can learn to think like a futurist,” Amy Webb told DevLearn 2017 Conference & Expo attendees at the opening general session on October 25, “Sci-Fi Meets Reality: The Future, Today.” It’s a good thing; the theme of the conference was “The Future Is Here,” and the general sessions—as well as many breakout sessions—looked ahead to anticipate and plan for the future of eLearning.

For anyone hoping to learn how to anticipate future human behavior, Webb is an excellent guide. She’s the founder of the Future Today Institute, publisher of the FTI Trend Report, an author, and a leading futurist.

A futurist, Webb said, learns to “listen for weak signals at the fringe”—the fringe of human experience, behavior, and learning. The budding futurist then seeks to recognize patterns in that data early, so as to spot emerging trends.

Anticipating the question forming in the minds of audience members, Webb segued to an explanation of why algorithms do a poor job of “strategic forecasting” based on data about human behavior. Humans are simply “too capricious,” Webb explained, for logical, rule-bound algorithms to predict with any accuracy.

Her overview of the development of artificial intelligence and machine learning made clear the inherent problems with relying on machines to think for us. While machines might have what she termed “artificial narrow intelligence”—an ability to do specific, narrowly defined tasks better than humans can—that is not the same as thinking and predicting behavior.

“We don’t really want machines that can think,” Webb said. The true goal is machines that learn and make decisions on their own. But that learning and those decisions are based on their coding, which is performed by humans. Humans with human biases and fallibilities. At present, Webb said, the technologists who develop artificial intelligence and the educators and ethicists who develop learning and attempt to influence human behavior and values work in different spheres with little overlap, resulting in a dearth of ethical values guiding the automated decision-making.

Webb described projects that “taught” computers to recognize categories of people, citing the example of a computer that “learned” to identify a “typical CEO” from a selection of photos. “How might this go wrong?” Webb asked rhetorically, to a background of knowing laughter. Her next slide showed the computer’s idea of a CEO: middle-aged, almost always white, and always male. There was one notable exception in the slew of photos of men: a photo of the CEO Barbie toy.

How does all of this relate to eLearning? Again, Webb anticipated the question. Her response gave hope to some—and terrified others. She presented three possible future scenarios:

  • Everyone in the learning arena learns to recognize weak signals; they work with technologists to refine artificial intelligence to instill values. Future machines learn not only to identify correct and incorrect answers; they also learn right and wrong. Webb said that she gives this optimistic scenario a 25 percent chance of occurring.
  • Everyone present is inspired by her talk but they, and the rest of the learning world, do nothing. Artificial intelligence continues to develop as it has in the past, learning to identify correct answers but lacking values. Webb’s prediction is that this pragmatic scenario has a 50 percent chance of occurring.
  • Learning and artificial intelligence continue to develop on separate tracks. Future artificial intelligence and machine learning projects incorporate real biases that affect what and how people learn and how knowledge is transferred. Webb said that she gives this catastrophic scenario a 25 percent chance of occurring.

In an attempt to end on a strong positive note, Webb said that “the future hasn’t happened yet—we think” and encouraged attendees to take action. “To build the future of learning that you want, listen to weak signals now.”

Looking to the future

Conference-goers streamed from Webb’s talk to any of dozens of options in the first block of breakout sessions. Across two and a half days, they could choose from hundreds of sessions that looked to the future of eLearning, attempted to define “community” in the age of digital learning—and offered tips, tricks, and examples of what they could do right now to create engaging, effective eLearning. Alternatively, the almost 3,200 participants could wander through the biggest-ever Expo, with 145 vendors showing off their wares: localization tools, authoring tools, audio narration, branding and customization, turnkey eLearning solutions, and much, much more. The intrepid ventured into a virtual world at the VR Learning Lab, where they played games, watched virtual reality films, and tried out VR apps.

Most conference-goers interrupted their wandering and learning to attend the second general session with legendary animator Glen Keane, who spent 38 years at Walt Disney Feature Animation, drawing characters that many attendees grew up with. (Keane delivered a stunning and popular keynote at Learning Solutions 2017 Conference & Expo in March.)

Keane’s address, “Embracing Technology-Based Creativity,” opened with him reminiscing about his initial design of the Beast and the question of whether Belle (the Beauty) could fall in love with this creature. As he described his design process, Keane sketched the Beast. The session ended with Keane’s demonstration of a virtual reality drawing tool, Tilt Brush, which he used to create a 3-D drawing on the screens as a breathless audience watched.

Sandwiched in between, Keane shared his thoughts on how moving from hand drawing to computer-drawn animation forced him to think about drawing. “I’d draw to make the paper go away,” Keane said. “I’ve always wanted to make the surface go away so I could live in my drawing.”

New technologies, like 360-degree animation, make that possible, but they have also transformed storytelling. “Story was no longer linear,” Keane said, showing an early 360-degree movie short, Duet, which has an interactive version that can be viewed in 3-D on iOS and on some Android smartphones. (To view the short in full interactive 360, download the Google Spotlight Stories app; on Android, the device must be compatible with the app.)

The characters circle each other—and the viewer, who can turn and look in any direction. Learning professionals are grappling with the challenges and opportunities the new world of immersive, nonlinear stories offers.

Storytelling powers learning

A much-anticipated general session with actor, producer, and cultural icon LeVar Burton focused on technology and storytelling. Admonishing the audience not to dismiss the “storytelling power of video games,” Burton said, “Storytelling in this modern technological age is very probably the most advanced system for learning in history.”

Burton’s talk emphasized the importance of education and literacy, and he shared many memories of his mother, who had a powerful influence on his life and career.

“Reading is the single most important activity in which we can engage for learning,” Burton said, before showing the crowd the ways he has updated his Reading Rainbow childhood literacy TV program for the digital age, re-creating the program as “Skybrary School.”

Commenting on video games and immersive storytelling formats, Burton said, “We are capable of accessing multiple elements of storytelling simultaneously, often without being aware.”

The conference closed with a final general session from futurist and game designer Jane McGonigal, who delivered an alternative take on “How to Think Like a Futurist,” describing the interaction between our present selves and our ideas of our future selves. For an in-depth review of McGonigal’s talk, see “How to Think Like a Futurist: DevLearn 2017 Closing Keynote.”