Mr. Zuckerberg, Facebook’s chairman and chief executive, broadly outlined some of the options he said the company’s news feed team was looking into, including third-party verification services, better automated detection tools and simpler ways for users to flag suspicious content.
“The problems here are complex, both technically and philosophically,” Mr. Zuckerberg wrote. “We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible.”
The post was perhaps the most detailed glimpse into Mr. Zuckerberg’s thinking on the issue since Donald J. Trump’s defeat of Hillary Clinton in the Nov. 8 election. Within hours of his victory being declared, Facebook was accused of affecting the election’s outcome by failing to stop bogus news stories, many of them favorable to Mr. Trump, from proliferating on its social network. Executives and employees at all levels of the company have since been debating its role and responsibilities.
Facebook initially tried to play down concerns about the issue, with Mr. Zuckerberg calling the notion that the company swayed the election “a pretty crazy idea” at a technology conference on Nov. 10. In a follow-up Facebook post, he said that less than 1 percent of the news posted to Facebook was false.
But criticism continued from outside the company, with some arguing that Facebook was being too dismissive of its capacity to shape public opinion. At a news conference in Berlin on Thursday, President Obama denounced the spread of misinformation on Facebook and other platforms.
Mr. Zuckerberg came to no conclusions in his post on Friday, instead providing a list of possible solutions the company was exploring. One option, he said, could be attaching warnings to news articles shared on Facebook that have been flagged as false by reputable third parties or by Facebook users. Another could be making it harder for websites to make money from spreading misinformation on Facebook, he said.
Mr. Zuckerberg made it clear that Facebook would take care to avoid looking or acting like a media company, a label it has frequently resisted.
“We need to be careful not to discourage sharing of opinions or mistakenly restricting accurate content,” Mr. Zuckerberg wrote. “We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.”