Book review: How has social media rewired our minds?
2022-12-01 | blog.avast.com

A reminder that it’s crucial to listen more to others, put down the screen, and be present IRL. 

With the reinstatement of previously banned Twitter luminaries including Donald Trump and Kathy Griffin, now is a good time to take a closer look at the role social media plays in our public discourse.

The recent book by Max Fisher, The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World, should be on everyone’s reading list. It documents the rise of social networking over the past decade and shows its highly influential role in society. Fisher is a New York Times reporter who has covered social media’s effects for many years, reporting on its impact on conflicts in Germany, Brazil, Sri Lanka, and Myanmar. On many of these trips, he worked with Amanda Taub and others to interview sources during critical moments of social unrest. 

What do all these countries have in common?

Each of these countries has experienced rapid radicalization driven by social media, with disastrous consequences: feeding riots and increasing the number and intensity of threats against people portrayed in viral videos. 

Fisher’s perspective is both chilling and illuminating. He starts with Gamergate (the online harassment campaign that began in August 2014 and targeted women in the video game industry, most notably the feminist media critic Anita Sarkeesian and the video game developers Zoë Quinn and Brianna Wu) and ends on the Capitol steps on January 6, 2021. Over the years, Fisher made frequent trips to Silicon Valley and the headquarters of Facebook, YouTube, and Twitter. “But that is what makes visiting the Valley so disturbing. Some combination of ideology, greed, and the technological opacity of complex machine-learning blinds executives from seeing their creations in their entirety,” he writes. 

“Social media radicalizes us and primes us to be intolerant of others whose attitudes, opinions and views differ from our own,” says Missouri State University professor of communication Brian Ott in a recent New York Times op-ed.

In The Chaos Machine, Fisher quotes many of the same researchers we have featured in our previous blogs, including Renee DiResta, Frances Haugen, and Megan Squire. We wrote about a report co-authored by DiResta about how to spot fake news. We also reviewed the Netflix documentary The Social Dilemma, in which DiResta is featured. She also sat for an interview with ex-White House speechwriter Jon Favreau, where she talks about these issues at length. Haugen is the Facebook whistleblower who made a trove of internal documents public. Finally, Squire was the subject of a post on Telegram hate speech.

Fisher’s book chronicles the multiple failures of the three social media platforms to prevent “users moving toward ever more titillating variations on their interests. If that’s cats or bikes, the impact is slight. If it’s politics, health, or other topics with some gravity for society, the consequences can be profound,” he writes. Indeed, the more users watch, the faster they are steered toward salacious and dangerous material, because the platforms’ almighty algorithms are designed expressly for that purpose. “Within a few months, with a small team, we had an algorithm that increased watch time to generate millions of dollars of additional ad revenue,” one former Google employee said of those early years, “so it was really, really exciting.” He points out that the relationship between a cable TV network and the viewer is one-way, “while the relationship between a Facebook algorithm and the user is bidirectional. Each trains the other.”
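To make that bidirectional loop concrete, here is a minimal, hypothetical sketch in Python of an engagement-driven recommender: the model updates its engagement estimates from what the user clicks on, while repeated exposure nudges the user’s own appetite upward. Every item, number, and rule here is invented for illustration; this is not any platform’s actual algorithm.

```python
import random

# A toy engagement-driven recommender illustrating the bidirectional loop
# Fisher describes: the model learns which items a user engages with, and
# the user's appetite drifts toward what the model keeps showing.
# All items, numbers, and rules are hypothetical.

CATALOG = {
    "cute cat video": 0.2,           # how provocative each item is (toy values)
    "bike repair tutorial": 0.1,
    "heated political rant": 0.9,
    "health conspiracy clip": 0.8,
}

def simulate(steps=500, explore=0.2, seed=42):
    rng = random.Random(seed)
    est = {item: 0.5 for item in CATALOG}   # model's estimated engagement per item
    taste = 0.3                             # user's current appetite for provocation
    for _ in range(steps):
        # Epsilon-greedy: mostly show whatever the model rates highest.
        if rng.random() < explore:
            shown = rng.choice(list(CATALOG))
        else:
            shown = max(CATALOG, key=lambda i: est[i])
        # Provocative items are more likely to hold attention (engagement bias)...
        engaged = rng.random() < 0.4 + 0.6 * CATALOG[shown] * taste
        est[shown] += 0.1 * (engaged - est[shown])   # ...so the model trains on the user,
        if engaged:
            taste = min(1.0, taste + 0.01)           # and the user trains on the model.
    return max(CATALOG, key=lambda i: est[i])

if __name__ == "__main__":
    print("After 500 steps the recommender favors:", simulate())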

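```

Run the simulation and the toy system drifts away from cats and bikes toward the most provocative item in its tiny catalog, which is the dynamic Fisher documents at scale.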
The Chaos Machine cites a breathtaking statistic from researchers looking into the refugee crisis experienced across Europe during 2018: “Wherever per-person Facebook use rose by one standard deviation above the national average, attacks on refugees increased by about 35 percent. Nationwide, they estimated, this effect drove as much as 10 percent of all anti-refugee violence.” Another analyst quoted in his book said, “Social media plays the role that the ringing of the church bells used to play in the past. That’s the way that people knew that a lynching is going to happen.”

When Fisher reports on US-based activities, he provides some insight into what has happened to our social discourse. “I could not have foreseen that all of our politics were going to become Gamergate,” he wrote, referring to the heightened level of threats we have seen across social networks. “The short-term, dopamine-driven feedback loops we’ve created are destroying how society works,” he wrote, creating a world with “no civil discourse, no cooperation; misinformation, mistruth.”

This results in what he calls being “irony poisoned,” a darkly joking term for the dulling of the senses that comes from a lifetime engrossed in social media subcultures, fueled by ironic detachment and algorithmic overstimulation. Is it any wonder we get tricked into reading misinformation? He says, “We’ve reached a point where things that are popular and emotionally resonant are much more likely to be seen by you than things that are true.”

He also sets the context for one of the early AI failures: Microsoft’s 2016 launch of its Twitter chatbot, Tay. After mischievous users fed Tay an overwhelming number of racist slurs, Microsoft took it down (but not before it had carried out 96,000 interactions). 

Digital tyranny

Fisher documents the numerous policy failures of Facebook, YouTube, and Twitter to regulate and moderate their content, and quotes one researcher who came to think of Facebook’s policy team as akin to Philip Morris scientists tasked with developing a safer, better filter for their cigarettes. Or, as another analyst put it, “It was like putting more and more air fresheners on the outside of a toxic-waste factory while production simultaneously ramped up inside.” Time after time, the executives faced with limiting toxic speech chose to let it continue, because it ultimately drove their profits. He also found that social media allows protesters to skip many of the organizing steps that traditional movements require, putting more bodies on the streets more quickly. “That can give people a sense of false confidence,” said one researcher. This creates the cancel culture phenomenon, “which was the arrival of a technology so ubiquitous, so ingrained in our very cognition, that it had altered the way morality and justice work,” he wrote. 

He writes: “YouTube had cultivated an enormous audience of viewers who had never sought the content out, but rather were pulled into it by the platform’s recommendations.” This is borne out by his reporting across the globe, as he touched down in one trouble spot after another to witness the effects of these recommendations first-hand. 

In the week after the 2020 election, the 20 most-engaged Facebook posts containing the word “election” were all written by Trump. By mid-November 2020, Facebook researchers made a startling discovery: 10 percent of all U.S.-based views of political content, or two percent of overall views, were of posts claiming the election had been stolen. Sadly, the book’s timeline ends shortly after the January 6 riots.

Fisher points out one challenge of trying to develop useful content moderation policies: “being able to shoehorn the nuances of human speech, politics, and social relations into if-then decision trees,” or having to create simple rules that human moderators can follow.
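As a hypothetical illustration of that shoehorning, consider what a flat if-then rule list actually looks like. The terms, thresholds, and actions below are invented for this sketch, not drawn from any real platform’s policy.

```python
# A hypothetical sketch of the flat "if-then" rules Fisher describes:
# nuanced human speech shoehorned into a decision tree that a classifier
# or a human moderator can apply quickly. Terms, thresholds, and actions
# are all invented for illustration.

BANNED_TERMS = {"slur1", "slur2"}                  # placeholder tokens, not real slurs
THREAT_PHRASES = {"i will hurt", "you deserve to die"}

def moderate(post: str) -> str:
    text = post.lower()
    if any(term in text for term in BANNED_TERMS):
        return "remove"                            # rule 1: banned vocabulary
    if any(phrase in text for phrase in THREAT_PHRASES):
        return "escalate to human review"          # rule 2: possible threat
    if text.count("!") > 5 and "share" in text:
        return "downrank"                          # rule 3: crude virality heuristic
    return "allow"                                 # everything else passes through

if __name__ == "__main__":
    for post in ["You deserve to die!!",
                 "Reporting on the use of slur1 in yesterday's speech"]:
        print(f"{post!r} -> {moderate(post)}")
```

The second test case shows the failure mode: a post quoting a slur in order to report on it gets removed, because the decision tree has no notion of context.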

I asked a friend of mine who, many years ago, was an early content moderator for a combined ecommerce and social networking site that is no longer around. She marked offending content that someone else on her team would then delete. “It was very tedious and boring. I would have to scroll page by page through the sites. I saw some very disgusting stuff, such as porn photos of young teens, and did the job (for $10 an hour, which was a lot back then) part-time for a year or so. It was content that I didn’t want my high school granddaughter to see. Thankfully there were no lasting effects.” She told me that “people don’t know how to be bored anymore, and they tend to fill their slack times with consuming social media instead of doing something constructive offline. All the time I see families out to dinner who are all looking at their phones, including the adults.”

Potential next steps 

So is there any remedy to this toxicity? 

We’ve covered proposed changes to Section 230 that former President Obama mentioned in a speech at Stanford earlier this year, along with legal challenges to the law that are presently before the Supreme Court. Google’s Jigsaw unit has done extensive research into people from various countries who have embraced conspiracy theories.

In any case, it’s crucial to listen more to others. And as my friend suggests, put down the screen and be present IRL. 


Source: https://blog.avast.com/chaos-machine-book-review