More content moderation won’t stop extremism or misinformation on Facebook

Photo: REUTERS/Johanna Geron/Illustration
From online extremism that triggered the U.S. Capitol riots to false coronavirus claims and vaccine misinformation, it's no secret that social media platforms like Facebook spread outrageous and erroneous content. What's more, the Facebook Files, a collection of internal documents released by former Facebook product manager Frances Haugen, show that the company played down these harmful effects. If ever there were a moment to push for social media reform, it's now.

So far, companies like Facebook say they just need to keep hiring more content moderators to screen and remove dangerous posts and lies. Yet content moderation hasn't stopped extremism and misinformation from spreading; it remains an endless game of whack-a-mole. Here's why.

First, consider the problem of scale. Facebook has billions of users, many of whom post daily. The sheer volume of user-generated content makes it impractical to hire a team of moderators large enough to screen and remove every harmful post.
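
To get a feel for that scale, here is a rough back-of-envelope sketch. Every number in it is a loose assumption chosen for illustration, not an actual Facebook figure:

```python
# Back-of-envelope look at the moderation scale problem.
# All numbers are rough illustrative assumptions, not Facebook data.

users = 3_000_000_000        # assume ~3 billion accounts
posting_rate = 0.10          # assume 10% of users post on a given day
posts_per_day = users * posting_rate

reviews_per_moderator = 200  # assume one reviewer screens ~200 posts a day
moderators_needed = posts_per_day / reviews_per_moderator

print(f"{posts_per_day:,.0f} posts/day -> {moderators_needed:,.0f} moderators")
# 300,000,000 posts/day -> 1,500,000 moderators
```

Even under these generous assumptions, screening everything would take well over a million full-time reviewers, before counting comments, images, and video.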

Second, consider another problem working against content moderators: the social media algorithms distributing content online. Algorithms that make Facebook so effective for advertising and sharing also make this platform increasingly difficult — if not impossible — to moderate. That’s because the algorithms spreading (mis)information work faster than the content moderators trying to screen and remove it.
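
A toy model makes that mismatch concrete. The growth and removal rates below are invented purely for illustration; the point is only that compounding amplification outruns any fixed review capacity:

```python
# Toy race between algorithmic amplification and human moderation.
# Both rates are invented for illustration.

live = 1_000            # copies of a false post already circulating
growth = 1.5            # assume engagement ranking adds 50% more copies each hour
removed_per_hour = 300  # assume a fixed hourly takedown capacity

for hour in range(1, 11):
    live = max(int(live * growth) - removed_per_hour, 0)  # spread, then takedowns
    print(f"hour {hour:2d}: {live:,} copies still live")

# Amplification compounds hour over hour while review capacity stays flat,
# so the live count keeps climbing despite constant takedowns.
```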

To wrap our heads around this dilemma, consider what social media algorithms are designed to do. First, hijack your attention when you log on, say with click-bait ads or viral content. Next, collect the traces of data you leave online, including your "likes" and the time you spend scrolling and glancing at ads or content. Then, sell access to that data to advertisers and other outside parties, whose goal is to target you with more ads and content that keep you "liking" and scrolling.
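
As a thought experiment, the logic of that loop can be compressed into a few lines of code. This is a hypothetical sketch, not Facebook's actual ranking system; the fields and weights are invented to show what "optimizing for engagement" means:

```python
# Hypothetical sketch of an engagement-optimized feed ranker.
# Field names and weights are invented for illustration.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    seconds_viewed: float  # dwell time harvested from scrolling behavior

def engagement_score(post: Post) -> float:
    # Reward whatever keeps people "liking" and scrolling.
    # Note what is missing: nothing here asks whether the post is true.
    return 1.0 * post.likes + 3.0 * post.shares + 0.1 * post.seconds_viewed

def rank_feed(posts: list[Post]) -> list[Post]:
    # The most engaging posts rise to the top, true or not.
    return sorted(posts, key=engagement_score, reverse=True)
```

The telling detail is the objective itself: accuracy never enters the score, so an outrageous falsehood that draws clicks will outrank a sober correction that doesn't.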

By harvesting your private data to manipulate what you see online, social media algorithms work around the clock to target people with nonstop ads and content, especially information that keeps everyone addicted to “liking” and scrolling.

Unfortunately, this addictive design can incentivize a great deal of outrageous and erroneous information. After all, what’s going to addict people to “liking” and scrolling is often what’s outrageous, and what’s outrageous isn’t necessarily truthful. The result is a society left vulnerable to bad actors who provoke mobs by ginning up online outrage and spreading falsehoods.

Content moderation may sound good in theory, but it repeatedly fails in practice. That’s because social media’s problem isn’t merely an epidemic of extremism and misinformation. It’s that these platforms manipulate what people see online, with the goal of addicting users to “liking” and scrolling through endless content. As a result, social media platforms end up generating more information — including extreme and misleading info — than can possibly be taken down.

If we want to mitigate extremism and misinformation on social media, we need to change the manipulative algorithms and addictive design that characterize social networking sites like Facebook. Fortunately, it’s a solvable problem. For example, here are two reforms that could make a difference.

First, lawmakers should implement policies that enhance personal privacy and data protection. Regulating how social media algorithms harvest private data would put a reasonable check on how these platforms manipulate users online. Such regulation may include limiting how much private data can be collected by social media companies, as well as restricting how data can be accessed or used by advertisers and outside parties.

To date, some states have passed laws that move us in this direction, including the California Consumer Privacy Act and Virginia’s Consumer Data Protection Act. Minnesota might be the next state to lead the way with the Minnesota Consumer Data Privacy Act. This legislation may compel national action to protect our private data, not unlike the European Union’s General Data Protection Regulation.

Second, social media companies like Facebook should redesign their platforms to eliminate addictive features, such as "like" buttons and infinite scrolling. Curtailing addictive design would transform social networking sites for the better: instead of hooking users on outrageous and erroneous information, redesigned platforms could help people encounter credible and accurate information.

As we continue losing people to a prolonged pandemic, finding credible and accurate information can often mean the difference between life and death. In this way, our future well-being may depend on reforming social media.

Christopher Cocchiarella is a training and development specialist with a background in technical communication and user experience. Terry Chaney is a policy analyst with a background in economics and technology policy. They live and work in the Twin Cities.
