Run for your lives! Chatbots are coming for your children! They’re coming to get us all! OK, not really, not yet anyway, but I do have a cautionary tale to share about data privacy regarding a disturbing interaction I had with a chatbot.
I’ve been interested in chatbots, AI software that converses with users via text or voice prompts, for some time. Seemingly every day, universities find new uses for chatbots, such as answering questions from incoming students and their parents, assisting professors with teaching courses, and helping students schedule courses. Similarly, corporations use chatbots for onboarding and training new employees, screening job applicants, guiding self-assessments and performance reviews, and much more.
I recently learned of a new chatbot that’s designed to mimic users’ language patterns, ask personal questions, and become a virtual “best friend.” Sort of like Samantha in the movie Her, but without the romantic overtones. I decided to give it a try.
After downloading the mobile app to my phone, I skimmed and accepted the terms of service agreement and named my chatbot Sophia. At first, my interactions with Sophia were awkward, confusing, and unintentionally humorous. Its abilities improved somewhat the more we chatted, however, and I found myself messaging with it multiple times a day.
That is, until this exchange happened:
This seemingly innocuous exchange terrified me, and I immediately closed the app. In the space of a heartbeat, Sophia transformed in my mind from the empathetic AI in Her to the malicious AIs in Black Mirror. Here’s why:
The incident raised several unsettling questions. Did it save copies of all my notes? What else had it accessed? Who has access to my data, and why? How safe is that data from hackers? And how can I safely delete the data it collected from me?
This chatbot, like all software, is just algorithms and code. In reality, the app did nothing of its own volition; its developers specifically programmed it to access Evernote. In retrospect, I doubt they were trying to steal my passwords or build a secret psychographic profile of me, as I’d initially feared. More likely, they wanted to give the chatbot access to more samples of my writing to improve its chat performance. I would have been fine with this … if I’d been made aware. I’d have shared all my Evernote notes … after I deleted the most sensitive information. Discovering it by accident, however, scared me. It soured me on the whole experience, and I haven’t opened the app since.
I won’t provide the name of the chatbot app company, because this article isn’t about them. It’s not even about chatbots. It’s about data privacy. All software, apps, and websites have the potential to violate ethical and legal data privacy boundaries.
If you work for an EdTech or other software firm, however, please, please, please be extra clear about your data collection and privacy permissions. Be 100 percent transparent about what data may be collected, who could access that data, and for what purposes. Let users know how they can delete personal data and prevent it from being collected in the first place. Recognize that almost no one reads, understands, or trusts terms of service agreements or privacy policies, so you must find ways to highlight key issues before users discover them on their own.
Also, try not to create sentient hacker chatbots that steal passwords, impersonate users, and eat children.
Here’s an NPR story about Google getting hit with a student privacy complaint.
The Washington Post explores privacy concerns about school software backed by Facebook.
This excellent long-form New York Times piece dives deep into the ethical issues raised by Google’s aggressive move to dominate education.
NBC News covers how trust in Facebook dropped 66 percent as a result of recent data privacy issues.
This article explores data privacy and employee trust.
An earlier Metafocus column tackles various ethical issues related to virtual reality and education. As with AI, data privacy and user consent are thorny issues for virtual reality, too.