Can artificial intelligence teach a computer to see? To intuit a learner’s mood? Not quite—but emerging AI technologies are enabling computers and automated processes to use images and language in eerily human-like ways. Four evolving technologies—computer vision, image recognition, sentiment analysis, and emotion recognition—could transform eLearning. Some of what L&D professionals could do with these tools is already evident in the ways various industries are using AI to automate processes, detect patterns, crunch enormous amounts of data, and improve efficiency. This article offers an overview of the technologies and how they could impact eLearning.

Computer vision

Computer vision refers to enabling computers to identify and process images scanned from the environment. It allows robots to “look around,” identify obstacles and items in the vicinity, and move safely through a space. It’s a vital part of the technology powering autonomous vehicles. Computer vision also enables an algorithm to process a flat, 2-D image, interpret what it depicts, and reconstruct a 3-D representation of the scene.

Computer vision is in play any time a computer makes sense of visual input. That means that bar code scanners and facial recognition apps use it to interpret and categorize scanned images, whether from a JPEG file or a camera recording images from the environment.
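
To ground the idea, the short sketch below uses OpenCV (one common computer vision library; the library choice and the presence of a webcam are assumptions) to capture a single camera frame and detect faces with the pretrained Haar-cascade detector that ships with the library.

```python
# A minimal computer vision sketch: read one camera frame and detect faces
# with OpenCV's bundled Haar-cascade model. Camera index 0 is an assumption.
import cv2

# Load the pretrained frontal-face detector that ships with OpenCV.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

capture = cv2.VideoCapture(0)   # open the default camera
ok, frame = capture.read()      # grab a single frame
capture.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print(f"Detected {len(faces)} face(s) in the frame")
else:
    print("No camera frame available")
```

The same pattern (capture an image, hand it to a detector or model, act on the result) underlies bar code scanning, facial recognition, and the applications described below.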

In the L&D context, computer vision is already having an impact:

  • Tools that track learners’ eye gaze offer real-time feedback on what learners do (and do not) pay attention to. L&D teams can use this data to improve eLearning.
  • Computer vision is a foundation of technologies that recognize learners or employees and provide individualized access to information or training.
  • Safety and tracking tools use computer vision to guide vehicles and machinery, to scan codes that track the movement of equipment or goods through a production facility, and to count items and monitor inventory.

Image recognition

Apps based on image or facial recognition are cropping up everywhere, including in eLearning and performance support. These technologies use computer vision to scan an object, person, or image, then apply trained algorithms and machine learning models to identify what the image contains. The uses for this technology are myriad, ranging from categorizing objects to using facial recognition to grant or deny access to locations or information. Image recognition is also a key element of optical character recognition (OCR).
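
To make the identification step concrete, here is a minimal sketch of image recognition using a pretrained classifier from torchvision; the model choice and the file name part_photo.jpg are illustrative assumptions, not any particular product’s implementation.

```python
# A minimal image-recognition sketch: classify a photo with a pretrained
# ImageNet model. Model choice and file name are illustrative assumptions.
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights)
model.eval()

preprocess = weights.transforms()          # the preprocessing the model expects
image = Image.open("part_photo.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)     # add a batch dimension

with torch.no_grad():
    probs = torch.softmax(model(batch), dim=1)

top_prob, top_idx = probs.squeeze(0).max(dim=0)
label = weights.meta["categories"][top_idx.item()]
print(f"Predicted '{label}' with confidence {top_prob.item():.2f}")
```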

Performance support tools that pair computer vision with image recognition could identify parts and assist employees in ordering or repairing them; an app could match an image scanned from the environment with a catalog entry, repair instructions, or explanatory text and diagrams. Social platforms, including Pinterest and Houzz, use this technology to help people find products they’ve seen and liked. It’s easy to imagine a virtual assistant that automates these processes in a factory setting.
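
One way such matching might work, sketched below under the assumption that catalog photos are available locally, is to compare image embeddings from a pretrained network; the model, file names, and part names are hypothetical stand-ins.

```python
# A hedged sketch of matching a scanned photo against a small parts catalog
# by comparing image embeddings; model, file names, and part names are
# hypothetical stand-ins.
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet18_Weights.DEFAULT
backbone = models.resnet18(weights=weights)
backbone.fc = torch.nn.Identity()    # drop the classifier head, keep embeddings
backbone.eval()
preprocess = weights.transforms()

def embed(path: str) -> torch.Tensor:
    image = Image.open(path).convert("RGB")
    with torch.no_grad():
        return backbone(preprocess(image).unsqueeze(0)).squeeze(0)

# Hypothetical catalog images and a photo scanned on the shop floor.
catalog = {"valve-a12": embed("valve_a12.jpg"), "pump-b7": embed("pump_b7.jpg")}
query = embed("scanned_part.jpg")

# Cosine similarity picks the closest catalog entry, which an app could then
# link to ordering information or repair instructions.
scores = {name: torch.cosine_similarity(query, emb, dim=0).item()
          for name, emb in catalog.items()}
best_match = max(scores, key=scores.get)
print(best_match, scores[best_match])
```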

Search and sorting functions that rely on image and text recognition are essential to effective curation of online content and to organizing a library of training and performance support tools. Together, these technologies can automate the analysis, categorization, and delivery of personalized or targeted information to learners as they need it.

Sentiment analysis and emotion recognition

Sentiment analysis of text is an application of natural language processing and text classification. The algorithm examines a piece of text and draws on its training data, which include terms and phrases labeled positive, negative, or neutral. A machine learning element can make a sentiment analysis app more accurate over time, provided it receives new data and feedback on its decisions. Emotion recognition algorithms perform similar tasks using images, rather than text, as inputs. The images might be static photos or scans of people’s faces captured in real time.
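
As a hedged illustration of the classification step, the sketch below trains a tiny text classifier with scikit-learn on a handful of hand-labeled comments; the comments are invented, and a real system would need far larger and more varied training data.

```python
# A minimal sentiment-classification sketch using scikit-learn.
# The training comments and labels are invented; production systems need
# far more (and more varied) labeled data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "The module was clear and engaging",
    "I loved the interactive examples",
    "Too long and confusing",
    "The quiz questions were frustrating",
    "It was fine, nothing special",
    "Average course, covered the basics",
]
train_labels = ["positive", "positive", "negative", "negative",
                "neutral", "neutral"]

# TF-IDF features feeding a simple linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(train_texts, train_labels)

new_feedback = ["The videos were helpful but the audio kept cutting out"]
print(model.predict(new_feedback))         # predicted sentiment label
print(model.predict_proba(new_feedback))   # confidence for each class
```

Periodically retraining such a model on newly labeled feedback is what provides the “more accurate over time” behavior described above.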

Sentiment and emotion detection and analysis technologies are already being used to monitor customer satisfaction in marketing and sales campaigns. But they have L&D applications as well: A sentiment and emotion analysis tool could scan learner feedback on eLearning and face-to-face training, quickly helping L&D teams identify problems and successes. It could also be used during a course to provide insight into learners’ mood and motivation while they work through eLearning.

Correlating sentiment and emotion recognition data with training events and performance data could give instructional designers and developers valuable information about how effective their training is and how learners are responding to it. Matching positive and negative reactions to demographics such as employee age or job role could also help L&D improve engagement by delivering training to the right employees in their preferred formats and on more relevant topics.
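
A small sketch of that kind of correlation, assuming feedback has already been scored by a sentiment model and stored with demographic columns, might look like this; the column names and values are hypothetical.

```python
# A hedged sketch of correlating sentiment scores with learner demographics.
# The data frame stands in for feedback already scored by a sentiment model;
# column names and values are hypothetical.
import pandas as pd

feedback = pd.DataFrame({
    "job_role": ["technician", "technician", "manager", "manager", "sales"],
    "module": ["safety-101", "safety-201", "safety-101", "safety-201", "safety-101"],
    "sentiment_score": [0.82, 0.35, 0.64, 0.71, 0.22],  # 0 = negative, 1 = positive
})

# Average sentiment by job role and module highlights which audiences are
# responding well (or poorly) to which training.
summary = feedback.groupby(["job_role", "module"])["sentiment_score"].mean()
print(summary.sort_values())
```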

Caveats

AI-based technologies are still evolving, and that means they have their share of potential pitfalls. A big one is the way algorithms are trained and deployed. If the data used to train an algorithm isn’t sufficiently varied and comprehensive, the accuracy of the algorithm will suffer. Even after initial training, the algorithms depend on a constant stream of new data to work with—and learn from—to keep improving; algorithms used in some apps do not have that machine learning element to provide continuous improvement.

Technologies that use facial recognition or that attempt to gauge learners’ moods and motivations are also likely to run up against serious privacy concerns. Many learners might object to having their eye movements, motions, facial expressions, and words tracked and analyzed, and could find that level of monitoring intrusive. Developers in some countries must also adhere to strict laws governing the collection and use of personal data.

As L&D teams learn about and adopt new technologies, including these emerging AI technologies that could transform eLearning, they need to remain mindful of these issues and mitigate potential harms wherever possible.