The move comes as chief executive Mark Zuckerberg warned of “an increased risk of civil unrest across the country”.
Explaining the move, Zuckerberg said: “This election is not going to be business as usual. We all have a responsibility to protect our democracy.”
He said he “generally” believed that “the best antidote to bad speech is more speech, but in the final days of an election there may not be enough time to contest new claims”.
In preparation for this, Facebook will “block new political and issue ads during the final week of the campaign”.
“So in the week before the election, we won’t accept new political or issue ads,” Zuckerberg continued, although political and issue ads already running will be allowed to continue.
He said that Facebook was taking steps to encourage voter turnout, tackle misinformation and prepare for candidates prematurely declaring the results.
In particular he warned that the coronavirus pandemic, which was likely to result in an increased number of people voting by mail, could mean delays to vote counting.
“Many experts are predicting that we may not have a final result on election night,” he wrote.
“It’s important that we prepare for this possibility in advance and understand that there could be a period of intense claims and counter-claims as the final results are counted.
“This could be a very heated period, so we’re preparing the following policies to help in the days and weeks after voting ends,” he said.
Facebook will place a Voting Information Centre panel at the top of both Facebook and Instagram “almost every day until the election”, which will include video tutorials on postal voting and information on registration deadlines.
The panel will also “prepare people for the possibility that it may take a while to get official results”, which Zuckerberg said “will help people understand that there is nothing illegitimate about not having a result on election night”.
Do microtargeted ads reduce transparency?
The social network has faced criticism for allowing political ads to be “micro-targeted” on its platform so that they are only seen by small communities rather than debated more widely in the days after they appear.
The Mozilla Foundation has claimed that this makes it easier for politicians and their supporters to parade fiction as fact and avoid being called out on it until it is too late, particularly as Facebook has previously said ads placed by candidates would not be fact-checked.
The new steps could serve as a precedent for how the firm handles elections elsewhere in the future.
In July, Donald Trump refused to say whether he would accept the result of the election, dismissing the validity of polls that showed him behind the Democratic candidate Joe Biden. This prompted concerns that he and his supporters would not comply with the results.
The US president has previously been sanctioned by Facebook and Twitter for posting false information about postal votes, and yesterday encouraged supporters in North Carolina to vote twice in the November election to ensure their ballots are counted.
Good news for brand ad safety?
Yuval Ben-Itzhak, CEO at Socialbakers, said: “Brand marketers will be pleased to hear this latest news. While nothing beats the scale and reach of Facebook, some marketers are reluctant to have their ads appear alongside polarising and possibly misleading political ads.
“Banning politically-motivated ads is much easier than attempting to validate each one, even with Facebook’s ample technology resources and staff. The scale is just too great. As a reminder, Twitter took a stance on political ads last year and completely banned them on the platform.
“Facebook’s new stance on political ads is no surprise. The platform has faced tough criticism for allowing politically-motivated misinformation to proliferate on Facebook. Over time, we expect them to give users more control and the choice of opting out of seeing political ads altogether.”