“The U.S. elections are just two months away, and with COVID-19 affecting communities across the country, I’m concerned about the challenges people could face when voting,” CEO Mark Zuckerberg wrote in a Facebook post. “I’m also worried that with our nation so divided and election results potentially taking days or even weeks to be finalized, there could be an increased risk of civil unrest across the country.”
Starting on October 27, political candidates and committees won’t be allowed to place new ads on Facebook or Instagram. Political ads approved before that date won’t be affected and will be allowed to keep running. Advertisers will also be able to modify these existing ads and spend more money on them to expand their reach.
This weeklong ban appears to be a compromise as Facebook faces mounting pressure to block political ads altogether, as peers such as Twitter and Reddit have done, and it remains to be seen how much of an impact it will have. Defending his decision not to fact-check political ads, Zuckerberg has repeatedly said he doesn’t want to be an arbiter of truth and would rather leave it up to voters to decide which content to trust.
In his latest post, Zuckerberg says Facebook made this decision because there won’t be “enough time to contest new claims” in the final days of an election. “I generally believe the best antidote to bad speech is more speech,” he added.
In addition, Facebook today introduced a new forwarding limit on its messaging platform, Messenger, to stem the growing tide of viral misinformation. The social network is rolling out an update that will prevent users from forwarding a message to more than five people at a time.
What’s more, to tackle voter misinformation, Facebook will now take down posts that tell people they will catch the coronavirus if they vote. It will also label content that misleads users by “claiming that lawful methods of voting will lead to fraud.” Candidate or campaign pages that prematurely declare victory will also be flagged, Facebook added in a blog post.
Further, Facebook is expanding its voter-suppression policies to remove both explicit and implicit misrepresentations about how or when to vote, such as posts claiming “you can send in your mail ballot up to three days after election day.”
Over the last few months, as we inch closer to Election Day, Facebook has faced an avalanche of misleading election content from both official candidates and malicious partisan groups, and the situation will likely grow worse over the next two months. The changes announced today could stifle these threats to an extent, but only if Facebook acts before it’s too late.