Reflections on ‘the AI mirror’

2025-10-02 14:37:58 | English original

By PC(USA)

LOUISVILLE — Delivering the Anita and Antonio Gott Lecture at Fifth Avenue Presbyterian Church in New York City on Sunday, Dr. Shannon Vallor explored the topic “The AI Mirror: How to Reclaim Our Humanity in an Age of Machine Thinking.”

Dr. Shannon Vallor

Vallor is the Baillie Gifford Professor in the Ethics of Data and Artificial Intelligence at the Edinburgh Futures Institute at the University of Edinburgh in Scotland. Her most recent book, which shares a title with her lecture, explores the ethics, advantages and challenges of a future with artificial intelligence.

On Sunday she was introduced by the Rev. Dr. Scott Black Johnston, senior pastor at Fifth Avenue Presbyterian Church. Vallor said her book “uses a mirror as a metaphor to help us understand how AI tools work, what they can do and cannot do, and what they can do but probably shouldn’t.”

During her talk, which can be seen here, Vallor addressed mirrors, space, time and stories.

On the topic of mirrors, “we need to grasp AI’s impacts on democracy, science, media, the arts and climate,” she said, including how we carry out our daily work and how we find companionship.

“It is changing how we make laws, music, love and war,” Vallor said. AI is a mirror “in a more powerful sense. It’s not one technology, but many machine-learning tools.”

When you look in a mirror, “you know there is no second face in that room,” she said. But AI companies “want you to fail the mirror test. They want you to think [that with AI technology] you’re seeing something that can help you, teach you, comfort you, help you speak and write — and one day replace you.”

AI “produces fun-house mirrors of worlds where women have never been CEOs or artists, where white people never use social services, and where productive people are young men with too much hair product,” Vallor said. “We need perspective when AI falsehoods seem real.”

AI tricks “can be dazzling, even blinding and disorienting,” she said. “They are used to confound us. They are fraudulent deep-fake images, designed to gin up our rage over something that never happened.”

Thinking about space, Vallor said that in order to reason, “we need to use thought as a way to stretch into the open space of the future.” AI tools can be designed “to hold open the space of reason, but for the most part they aren’t.” A large language model won’t tell us “I don’t really know” or “I’m not sure” or “why don’t you tell me?” she noted.

“They are mental space-filling machines,” Vallor said. “Want a better ending to your novel? The AI mirror has unflagging confidence.”

If we had had large language models in the 1600s and asked them to deliver justice, “they would have never demanded the liberation of women and slaves. They could not have even questioned the divine right of kings,” she said. “We would have remained stuck in time, just as we had always been.”

Thinking “requires the repeated practice of scaling daunting peaks. We’re letting those skills erode,” she said. “We could design AI to strengthen our cognitive muscles and those of the next generation, but that’s not what the market wants. It wants to extract maximum returns now.”

On the topic of time, Vallor reminded her hearers that “we know space and time aren’t separate” and that “things and events arrive on an unpredictable schedule.”

Humans enjoy the ability to choose. “We can abandon a commitment, or make one we swore we would never take on,” Vallor said. “We can stick with the dire politics we know, or we can say, ‘to hell with this. We’re changing the game.’”

“Here’s the thing: AI mirrors can’t make these radical new choices.” They are “time machines that refuse to take us into the future.” Now, more than ever, we need to exercise our collective power to tell our own stories, “to give ourselves more road to travel before we reach a cliff,” Vallor said.

We use our technology — smartphones and video games — to fill all our open time and space, she said. “We don’t know what it is to have unfilled time and unfilled mental space. We fill it as quickly as possible with algorithmic content.”

Turning to stories, Vallor noted that AI is being used to write news stories and op-eds, by fiction writers for plot twists, and by educators, who are increasingly pushed to use AI to summarize “key points to the readings we assign.” Many educators “vigorously resist this campaign. The stories of our history, literature, politics and scientific discovery will be increasingly told by mindless machine mirrors.”

She defined empty mental space as “where time opens and values are chosen — where the story can be anything.”

“We learn we have power over the future,” Vallor said. “We do that when we innovate in art, media and design, when we come together to make new politics and press our moral claims onto the world. Empty space, quiet and boredom is the field of the future.”

Modern digital life can be compared to the classic video game Tetris, according to Vallor, where “every moment is a challenge to fill the open spaces of existing as efficiently and seamlessly as possible. The blocks are falling faster and faster, and they must be slotted into their predictable and optimal places.”

Instead, “we need to hold open our space and our time so we can write the next chapter of the human story and let it go somewhere new,” she said. “That’s not what AI currently offers us.”

AI mirrors can’t tell our stories because they don’t know why those stories matter, Vallor said. “There’s nothing at stake for an AI storyteller, because an AI bot has no future to make. … Unlike the work of being human, its work has been done.”

But unlike an AI model, we “sit at the edge of a future that is open, one that need not reflect on the past — not unless we surrender to machines the power of telling our own stories,” Vallor said. “The narrative power is the self’s crucible. But even more importantly, it’s the crucible of our collective humanity, which will have a story that only we have the power and the duty to write.”

During a question-and-answer session following her talk, Vallor was asked how we can get to a hopeful future.

“I think we can get there through remembering the history of human courage and human responsibility,” she said. “When things feel like they’re spinning out of control, it’s easy to feel like we don’t have the resources to get through this.”

“All you have to do is read history enough to know how many times we’ve been in some version of here,” she said, and “how many times humanity has been boxed up … and chewed its way out through solidarity, ingenuity, faith and a commitment to the future.”

“There are ways to channel feelings into a power of creating a future for ourselves and one another,” Vallor said. “We’ve done it so many times, and we can do it again.”

Summary

Dr. Shannon Vallor delivered the Anita and Antonio Gott Lecture at Fifth Avenue Presbyterian Church in New York City, discussing her book "The AI Mirror: How to Reclaim Our Humanity in an Age of Machine Thinking." She used mirrors as a metaphor to explain AI’s impact on various aspects of life, including democracy, media, and climate. Vallor emphasized that while AI can be misleading, humans retain the ability to make radical choices about their future, unlike AI which lacks this capacity. She argued for preserving open mental space and time to shape human progress independently of AI's constraints.
