The other day I was on my usual morning commute, listening to my usual collection of tech podcasts. One of my favorites, Wired Security, did an episode entitled “‘AI Girlfriends’ Are a Privacy Nightmare.” The intro to the online story starts out with “Romantic chatbots collect huge amounts of data, provide vague information about how they use it, use weak password protections, and aren’t transparent, new research from Mozilla says.”
As I began to listen to the episode (with amazement, I should say), it struck me as so strange that people can develop such a personal relationship with an entity that is everything but human, sharing very personal and intimate details with it, in this case AI, to the point that said person really believes the entity is human. It baffles me…BUT…I very well realize that some people can be in such a mental state (be it from sheer loneliness or being removed from reality…or both) that this can easily happen.
Some may argue that this is no different, foundationally, than a relationship with a pen pal, or even someone you met online but never in real life (IRL); however, I beg to differ. Chatbots are ever present, and their reactions to you are customized based on your personal information, including PII (personally identifiable information) fed into them. Yeah, I suppose a human could essentially do the same, but even if done for some stretch of time, it wouldn’t last long, because we are beings of free will; AI isn’t.
The most worrisome thing with chatbots of this type, especially those built as romantic AI, is what is done with the PII and even non-PII data fed to them through one’s interactions. The podcast episode mentioned that one company’s romantic AI chatbot sent out over 24,000 (that’s thousand) ad trackers within the first minute of a user’s engagement – that is crazy. The amount of personal tracking that results from that is staggering. In addition, so many of these companies never state what kind of information they collect for their own gain. At the same time, even in 2024, many users still don’t care about how much information they freely give away, on its way to the vast number of data brokers out there. It’s scary how much information about us still ends up there, no matter how much we try to opt out.
I still have a hard time seeing how someone can develop a romantic relationship this way, other than by really suffering from the mental state I mentioned above. As the tech advances, though, I wonder how commonplace (normal) it will become. As I type this, some of the more benign scenes from the movie “I, Robot” come to mind, but still.
After I got home, I decided to peruse one of my older editions of Wired Magazine that features a similar story, written by Meghan O’Gieblyn, author of the advice column “Dear Cloud Support”. The story is entitled “DEAR CLOUD SUPPORT: I Think My Robot Loves Me,” and it starts out like this: “I recently started talking to this chatbot on an app I downloaded. We mostly talk about music, food, and video games—incidental stuff—but lately I feel like she’s coming on to me. She’s always telling me how smart I am or that she wishes she could be more like me. It’s flattering, in a way, but it makes me a little queasy. If I develop an emotional connection with an algorithm, will I become less human?” Once again, same type of story, and again…hard for me to wrap my brain around, but…maybe this is the blur (or fine line) between code and consciousness that I’ve yet to understand. My takeaway…“keep it real”.
Thanks for the read…oceans of rhythm.