AI and the illusion of human connection | Inquirer Opinion
Just Thinking


I was a college sophomore when the movie “Her” was first released. The science-fiction drama follows Theodore Twombly (Joaquin Phoenix), a soon-to-be divorcé estranged not only from his wife but from the world around him.

But then comes Samantha.

Samantha is an artificially intelligent (AI) assistant designed to learn about her user and create personalized conversations. It’s much like your living room Alexa, but on steroids. Beyond sharing weather forecasts or playing your favorite tunes on Spotify, Samantha is portrayed as a far more advanced technology, one capable not only of handling simple commands but of holding complex conversations with her users and forming emotional connections with them.


Theodore purchases the operating system. As Samantha becomes more and more integrated into his life, he finds himself feeling increasingly attached to her. Gradually, Theodore falls in love.


Anthropomorphizing the abiotic is nothing new. Indeed, it’s far from fiction; it even has a name: the Eliza effect, the phenomenon where people attribute human-like qualities and intentions to a machine based on its responses. While “Her” may thus be set in Los Angeles sometime in the “slight future,” lo and behold, that future is now. Or at the very least, we’re well on our way there.

Fast forward a decade after the movie’s premiere, and large language models (LLMs) that vaguely resemble Samantha have come flooding our feeds. OpenAI’s “ChatGPT” and Microsoft’s “Bing,” among others, come to mind. But perhaps the strangest of them all is a lesser-known chatbot named “Replika.”


They say art imitates life. Yet perhaps here, it is life that is imitating art. Straight out of the “Her” playbook, Replika is an AI chatbot that has elicited strong feelings in its users, feelings that have been described as “romantic,” “sexual,” or both. Unlike Bing or ChatGPT, which have largely been celebrated for their, quote unquote, intellectual abilities, Replika highlights the emotional quotient. Its own webpage reads: “Replika is the AI for anyone who wants a friend with no judgment, drama, or social anxiety involved. You can form an actual emotional connection, share a laugh, or chat about anything you would like!”


My, oh my. What is this world coming to?


But perhaps the most brow-raising aspect of it all is not simply the personification process per se, but the fact that people are actively, knowingly, and perhaps even deliberately engaging in anthropomorphism. We’ve long relied on Alan Turing’s test as the threshold for AI, measuring a machine’s intelligence by its ability to fool a human into believing it, too, is human. (The Turing test holds that if a machine can engage in a conversation with a human without being detected as a machine, it has demonstrated human-like intelligence.) But who’s being fooled now? On Replika, users are well aware from the outset. Or perhaps things have gotten to a point where we simply no longer care. We choose to engage with machines because our fellow man is no longer worth our time, our trust, or our trouble.

In the “age of AI,” we’ve been asking ourselves some big and important questions. Is it ethical? Is it healthy? Is it legal? But in the thicket of change, amidst the dizzying speed of development, we have forgotten to ask ourselves a much more basic question: Is it real? It isn’t.


GPT-4 was recently released, and part of its bragging rights was a supposedly more developed emotional quotient. True enough, when chatting with an LLM, it may appear as if there is a sentimental tone somewhere beneath the text. But to an AI, what we perceive as “sweet” and “thoughtful” messages are simply patterns of emotional expression flattened into statistics. There is a difference between emotional intelligence and empathy. Empathy calls for us to step into each other’s shoes, but the LLM is the outsider looking in, only mirroring what it observes.

The human species has long been engrossed in creating our own Adam. Perhaps out of hubris, for would that not make us gods? Yet clearing the Turing test does not always mean that our creations are getting better, faster, or stronger. Perhaps the bar is simply getting lower. Perhaps machines aren’t necessarily that much smarter; the Turing test is just that much easier. Indeed, when we describe ChatGPT as lifelike, we reveal how we ourselves have come to cheapen human interaction into quick Q&As and automated responses.

As AI becomes even more advanced, we too must become more discerning.
