At an educational conference earlier this month, I led a Socratic seminar about the ethics of using virtual reality (VR) in the classroom. We debated health risks, bias, data privacy, economics, oversight, content, and many other relevant ethical issues. Ethical debates like these often have no right answers, and the attendees made many valid yet contradictory points, which made for a lively conversation.
Throughout the discussion, I realized that the differences in opinion originated from different belief systems and backgrounds, as I’d expected—but also from vast differences in mindset, which was a surprise, especially given the self-selected audience of tech-savvy education professionals.
On one end of the spectrum, some attendees, often with lifelong educational backgrounds, were highly risk-averse, especially with regard to their schools, personal careers, and the use of any new technology. For them, no benefit was worth any potential harm or financial investment in an unproven technology, exemplifying the principle of primum non nocere, or “first, do no harm.” They said it’s better to wait for more peer-reviewed research that proves efficacy and vets content. They said the technology needs to mature before we spend our limited funds and expose students to unknown risks. We’ll call this group the “do-no-harmers.”
On the other end of the spectrum, some attendees were more focused on the student experience and were thus more willing to consider using VR in their classrooms if the potential benefits to the students were large or if the risks were unlikely, inconsequential, or reversible. Many of these people had business or tech backgrounds, which perhaps instilled a higher tolerance for ambiguity, risk, and faster decision-making. They realized that they’d never have perfect information and that every decision carries risk. They felt it would be remiss not to expose students to emerging technologies that could be critical to career success in the coming years. We’ll call this group the “decide-and-goers.”
One way to understand and reconcile these differences in mindset is by looking at what factors we optimize for and why. We’re always optimizing for something, striving to achieve a minimum or maximum in some way. Every living thing is, in fact, continually optimizing for one or more factors. Plants bend toward sunlight and catch rain in their leaves. Lions and gazelles optimize for speed to outrun each other. Chameleons optimize for color camouflage. Collectively, human beings are perhaps the best optimizers on earth.
We humans, however, get to choose what we optimize for. But lacking perfect information, we don’t always make perfect choices. We can over-optimize, under-optimize, optimize for too many or too few factors, optimize for the wrong factors, or optimize in ineffective ways. Further complicating matters, multiple variables often make it difficult to know how well we optimized our actions. Did we improve or worsen the outcomes, or was it all simply luck?
The differences in optimization strategies within my group discussion were striking. The do-no-harmers wanted to minimize risk and financial expense. The decide-and-goers preferred to maximize student benefit and administrative efficiency.
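One way to make this contrast concrete is to imagine both camps scoring the very same proposal, just with different weights on the same factors. The sketch below is purely illustrative; the factor names, ratings, and weights are hypothetical inventions for this example, not data from the seminar:

```python
# Illustrative sketch: two mindsets scoring the same hypothetical VR pilot.
# All factor values and weights below are made up for illustration.

def score(factors, weights):
    """Weighted sum of decision factors (higher = more favorable)."""
    return sum(weights[name] * value for name, value in factors.items())

# A hypothetical VR pilot, with each factor rated on a -1..1 scale
# (negative = cost or risk, positive = benefit).
vr_pilot = {
    "student_benefit": 0.8,
    "risk_of_harm": -0.3,
    "financial_cost": -0.5,
}

# Do-no-harmers weight risk and cost heavily;
# decide-and-goers weight student benefit heavily.
do_no_harmer = {"student_benefit": 1, "risk_of_harm": 5, "financial_cost": 3}
decide_and_goer = {"student_benefit": 5, "risk_of_harm": 1, "financial_cost": 1}

print(score(vr_pilot, do_no_harmer))     # negative: pass on the pilot
print(score(vr_pilot, decide_and_goer))  # positive: run the pilot
```

Same facts, same arithmetic, opposite conclusions. The disagreement isn’t about the evidence; it’s about the weights.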
While both sides provide necessary perspectives, I personally reside squarely in the decide-and-go camp. I don’t believe that VR is the be-all and end-all of educational reform, but I do come from a business and tech background, and I believe we do a disservice to students of all ages by not exposing them to important emerging technologies, such as VR, artificial intelligence (AI), the Internet of Things (IoT), and blockchain. If students are going to be empowered and successful in the coming years, they must learn how to create with and develop for these new technologies.
If we as educators avoid every new technology that carries inherent risks, how can we effectively teach the next generation of students? In fact, eliminating all risk can actually hurt us. Many educators and psychologists argue that coddling students too much hinders development. I was raised in the woods, working in a family business, helping repair boat engines and solder video games, sometimes hooking myself with fishhooks, burning myself with tools, or cutting myself with knives, usually with parental oversight. Whenever I hurt myself a little, I learned to be more careful and make better choices the next time. I couldn’t have developed this grit without taking a few relatively minor risks now and then.
Similarly, as an entrepreneur, I know that I’ll often have to take risks and make decisions without all the information I’d like, and that small, inexpensive, and frequent failures can teach me where the true opportunities lie. To optimize for maximum learning and success, I need to factor in the risks, failures, and other downsides to my being an early adopter. In contrast, doing nothing new ensures mediocrity—which, in today’s hypercompetitive world, leads nowhere. Therefore, it’s our responsibility as educators to expose students (be they young children, higher-ed students, or employees that we’re training) to important new technologies, despite a few manageable risks.
While a healthy dose of skepticism and caution is necessary, we educators need to remember that in order to maximize students’ learning and success, we may need to take some calculated risks, invest in unproven technologies, and do our best to adjust course as we go. It’s the only way for our students—and for that matter, for us, our schools, and our businesses—to move quickly through the steep learning curves and stay relevant.
In short, we must ask ourselves every day: What am I optimizing for today, and why?
Let us know in the comments what factors you optimize for. How is your strategy working? How is it falling short? What would happen—good or bad—if you reevaluated your optimization mindset?
For a deeper exploration of the optimization philosophy, check out Ted Chiang’s fantastic sci-fi novella, Story of Your Life, published in the Starlight 2 sci-fi anthology in 1998 and again in Chiang’s own 2002 short story collection, Stories of Your Life and Others. The 2016 movie Arrival is based on this novella, though the movie doesn’t dive into the optimization concept or philosophy nearly as much as the novella does. The story also delves into how optimization in math and physics relates to linguistics, the nature of time, and how we perceive the world. It’s worth reading, as you’ll never think about cause and effect the same way again.
A recent New York Times article explains how British schools are intentionally adding risk (high climbs, sharp tools, heavy bricks, less protective padding, etc.) to their playgrounds in an effort to build grit in students.
For a deep dive into how helicopter parenting and a lack of risk-taking discourage personal development and resiliency in children and young adults, read How to Raise an Adult, by Julie Lythcott-Haims. Or just watch her TED talk for an overview.