SALT LAKE CITY — Is the endgame for human beings in sight? Some people, including environmental activist Bill McKibben, believe so.
In his new book "Falter," McKibben argues that human beings, once considered the crown of creation, are not long for this world. "Put simply, between ecological destruction and technological hubris, the human experiment is now in question," he writes.
McKibben is among those who have begun to envision a post-human Earth. Others, such as the leaders of the Center for Humane Technology, aren't going that far and say we're not yet at the brink of the "technological singularity," the predicted point at which machines become more intelligent than humans.
But at a recent forum in San Francisco, co-founder Tristan Harris, a former Google design ethicist, said we're already at another dangerous place — the point at which technology has overwhelmed human weaknesses, leading to what Harris calls "human downgrading."
Humans are downgraded when technology is designed to purposefully exploit our weaknesses, leading to societal problems such as tech addiction, information overload, polarization, and excessive vanity and outrage, Harris said. Taken to its extreme, this manipulation has the potential to destroy individual agency, or free will.
Thankfully, the group also has some solutions to what Harris calls "the race to the bottom of the brain stem." Here's what they think should be done before we cede the planet to artificial intelligence.
At the April 23 forum, attended by tech leaders who included Apple co-founder Steve Wozniak and Roger McNamee, author of the new book "Zucked," Harris said people have been struggling to define the cultural problems accumulating as technology's influence spreads.
“It has to do with something (Harvard biologist) E.O. Wilson said, which is that we’ve got paleolithic emotions, medieval institutions and godlike technology. This is kind of the problem statement of humanity," Harris said.
In trying to capture our attention, purveyors of technology have turned cellphones and social media into slot machines to which we constantly return for neural rewards, he said. At the same time, even as we spend about a quarter of our lives in front of screens, our sense of well-being has plunged. Harris noted research by San Diego State University psychology professor Jean Twenge showing that depressive symptoms in women and girls have climbed in tandem with the rise of social media.
"We know what this is doing; it's downgrading and overwhelming who we are and our identities," Harris said.
He cited the average daily usage of YouTube — more than an hour a day on mobile devices — as an example of how technology is eroding free will. Yes, we choose to click "play" but the recommendations before us are designed to make the content irresistible.
Further, technology can even shape or downgrade our beliefs, Harris argued, saying that recommendation algorithms tilt us toward "crazytown."
In an interview with Eric Johnson of Vox, Harris said that for individuals to fight the trend is akin to bringing a knife to a laser fight. "This is actually a deep point that people really underestimate, because it’s sort of a civilizational moment when an intelligent species, us, we produce a technology where that technology can simulate the weaknesses of the creator."
Children are especially vulnerable to features like Snapchat's "beautification filter." More than half of plastic surgeons say they've had clients who want surgery so they can look more like they do with the filter applied, he said.
"It’s never been easier to (believe) that people only like you if you look different than you actually look," he said, suggesting that some businesses, in effect, are profiting at the expense of our children's self-esteem.
To stop human downgrading, the Center for Humane Technology calls for changes in how technology is designed, used and regulated, and Harris said in the Los Angeles Times that a "full-court press" by policymakers, media and the public is necessary for the current system to change.
Harris told Michael Hiltzik of the Times that it's time to consider changes to Section 230 of the 1996 Communications Decency Act, which currently shields social media platforms from legal responsibility stemming from the words or actions of users.
"The platforms’ immunity stems from their image as neutral purveyors of others’ content. Harris argues that this concept no longer applies to platforms that are actively serving recommendations to billions of users," Hiltzik wrote.
Companies should have to disclose the degree to which they have influenced users by recommending content, Harris believes.
But that is just one part of a society-wide effort to combat the problem of human downgrading, which the center proposes to lead with a series of public events, the launch of a podcast led by Harris and center co-founder Aza Raskin, and the development of guidelines to help companies redesign their products in ways that don't exploit human vulnerabilities.
Harris and his colleagues believe that technology can be designed in ways that bring out our best, not our worst, wrote Adele Peters of Fast Company. But Peters noted some people are skeptical that companies will make changes if the changes would affect their bottom line, and others thought the forum was short on substance.
"But the argument that tech companies need to make radical and systemic change — and shift to business models that don’t rely on advertising — is something that more people should be talking about," Peters wrote.
And, brace yourselves, these changes may also mean that we have to pay for things we're getting for free now, such as Facebook and Twitter. Free isn't always so great, Harris said.
“We’re getting free social isolation, free downgrading of attention spans, free obstruction of our shared truth, free incivility," he said. "Free is the most expensive business model we’ve ever created.”