Television screens airing the first presidential debate at the Walters Sports Bar in Washington on September 29. | Sarah Silbiger/Getty Images
The temporary ban doesn’t solve Facebook’s organic content problem or the problematic political ads appearing on its platform before voting.
Facebook is going to temporarily ban all political ads … but only after the 2020 election, a move that addresses neither its organic content problem nor the problematic political ads already appearing on its platform before voting.
On Wednesday, the social media giant announced that it will temporarily stop running social issue, electoral, and political ads in the United States after the polls close on Election Day, November 3. The measure is an effort “to reduce opportunities for confusion and abuse,” wrote Guy Rosen, Facebook’s vice president of integrity, in a blog post announcing the decision. The company will notify advertisers once it lifts the policy post-election, but it didn’t indicate when that would be. In early September, Facebook said it would ban new political ads in the week before the election, though ads already running before then will continue to appear in News Feeds.
Also on Wednesday, Facebook said it would ban and remove posts that seek to intimidate voters, including ones that encourage poll watching “when those calls use militarized language or suggest that the goal is to intimidate, exert control, or display power over election officials or voters.”
These announcements are the latest in a series of small, slow, and iterative measures Facebook has introduced in recent months related to US politics and elections. And while they’re better than doing nothing, they are also too little, too late.
There’s a lot a political ad ban doesn’t do: it doesn’t stop politicians from lying in ads in the days leading up to the election, and it doesn’t take away political campaigns’ ability to hyper-target ads to tiny groups of voters with very specific messaging. (Microtargeting makes it easy to aim negative and misleading ads precisely at certain voters, and it makes it harder for opponents and other groups to know those ads are out there and counter them.) And banning political ads after the election doesn’t solve the, you know, before-the-election problem.
Some political strategists also argue that clamping down on political ads online hurts small campaigns more than it does the big ones. Facebook ads are a lot less expensive to run than television commercials — which means campaigns with big budgets can go to TV, while campaigns with small budgets can’t.
Also: Ads are just a small part of the equation. Facebook’s role in presenting voters with political information, disinformation, and conspiracy theories stretches far beyond advertising, and focusing too much on ads allows Facebook and other tech platforms to avoid the bigger problem: organic content. That’s the stuff posted to the platform for free — such as a false story in 2016 claiming Pope Francis had endorsed Donald Trump, or an unsubstantiated claim the president made about mail-in voting over the summer.
Misleading and polarizing organic content spreads fast and far on the platform all the time because social media thrives on engagement, and what engages people is content that evokes strong emotions. A political campaign doesn’t need to pay for an ad to spread lies claiming that Elizabeth Warren wasn’t born in the US or that Marco Rubio has six secret love children; it can just post them.
The dangerous, preposterous QAnon conspiracy movement, which has shifted from the fringes to the mainstream, is a perfect example of social media’s failures. It has flourished on Facebook, Instagram, and Twitter, not because of ads but because of organic content. Facebook finally banned QAnon this week, but the movement had already reached far and wide on its platform, as Recode’s Shirin Ghaffary recently explained:
The theory continues to grow online, in both the number of followers and the strength of its political influence in the Republican Party. The growing political clout of the movement is especially worrisome for misinformation researchers who say QAnon is potentially becoming one of the largest networks of extremism in the United States. QAnon is gaining broad appeal not just with the extremely online, male-dominated, 4chan message board crowd, where QAnon was first born; it’s also increasingly popular with suburban moms and yoga-loving wellness gurus on Instagram and Twitter.
Misinformation about voter fraud and the election is spreading, and it’s not relying on paid ads to do it. While Facebook tries to catch misinformation and put warnings on it, falsehoods travel a lot faster than its content moderators can catch them. In a hypothetical post-election scenario where Joe Biden wins but President Trump refuses to concede or insists the election was rigged, Trump wouldn’t need an ad to spread that sort of lie; again, he could just do it in a post. Facebook says it will attach an “informational label” to content seeking to delegitimize the outcome of the election. But it seems unlikely the company would take down such posts or outright fact-check them, given how reluctant it has been to take such actions in the past.
Will it work? Judging from my own anecdotal experience, I doubt it. I spent hours browsing the Facebook pages of anti-maskers for a story over the summer and encountered multiple people who had such labels on content they shared. They simply dismissed the labels, claiming Facebook was censoring them or hiding the truth. Many had developed those beliefs in the first place because of content they saw on Facebook or other online platforms.
In a September 3 post, Facebook CEO Mark Zuckerberg wrote that the 2020 election is “not going to be business as usual.” But the iterative measures Facebook has introduced so far seem to be exactly that — business as usual.