Facebook Inc (FB.O) said on Thursday it had begun “fact-checking” photos and videos to reduce the hoaxes and false news stories that have plagued the world’s largest social media network.
Facebook has for months faced an uproar among users whose complaints range from the spread of fake news to the use of the network to manipulate elections and the harvesting of 50 million people’s Facebook data by the political consultancy Cambridge Analytica.
Manipulated photos and videos are another growing problem on social media.
The fact-checking began on Wednesday in France with assistance from the news organization AFP and will soon expand to more countries and partners, Tessa Lyons, a product manager at Facebook, said in a briefing with reporters.
Lyons did not say what criteria Facebook or AFP would use to evaluate photos and videos, or how much a photo could be edited or doctored before it would be ruled fake.
The project is part of “efforts to fight false news around elections,” she said.
A representative for AFP could not immediately be reached for comment.
Shares of Facebook closed up 4.4 percent at $159.79 on Thursday after a tumultuous two weeks. The stock remained down more than 13 percent from March 16, when Facebook disclosed the Cambridge Analytica data leak and sparked fears of stricter regulation.
Facebook has tried other ways to stem the spread of fake news. It has used third-party fact-checkers to identify false stories, then given such stories less prominence in the Facebook News Feed when people share links to them.
In January, Chief Executive Mark Zuckerberg said Facebook would prioritize “trustworthy” news by using member surveys to identify high-quality outlets.
Samidh Chakrabarti, another Facebook product manager, said in the briefing that the company had begun to “proactively” look for election-related disinformation rather than waiting for reports from users, helping it to move more quickly.
Alex Stamos, Facebook’s chief security officer, said in the briefing that the company was concerned not only about false facts but also other kinds of fakery.
He said Facebook wanted to reduce “fake audiences,” which he described as using “tricks” to artificially expand the perception of support for a particular message, as well as “false narratives,” such as headlines and language that “exploit disagreements.”
Zuckerberg disavows 'user growth' memo
A Facebook Inc executive said in an internal memo in 2016 that the social media company needed to pursue adding users above all else, BuzzFeed News reported on Thursday, prompting disavowals from the executive and Facebook Chief Executive Officer Mark Zuckerberg.
The memo from Andrew Bosworth, a Facebook vice president, had not been previously reported. It surfaced as Facebook faces inquiries over how it handles personal information and the tactics it has used to grow to 2.1 billion users.
Zuckerberg stood by Bosworth, who goes by the nickname “Boz,” while distancing himself from the memo’s contents. Bosworth confirmed the memo’s authenticity but in a statement he disavowed its message, saying its goal had been to encourage debate.
Facebook users, advertisers and investors have been in an uproar for months over a series of scandals, most recently privacy practices that allowed political consultancy Cambridge Analytica to obtain personal information on 50 million Facebook members. Zuckerberg is expected to testify at a hearing with U.S. lawmakers as soon as April.
“Boz is a talented leader who says many provocative things. This was one that most people at Facebook including myself disagreed with strongly. We’ve never believed the ends justify the means,” Zuckerberg said in a statement.
Bosworth wrote in the June 2016 memo that some “questionable” practices were all right if the result was connecting people.
“That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends,” he wrote in the memo, which BuzzFeed published on its website.
He also urged fellow employees not to let potential negatives slow them down.
“Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people,” he wrote.
Bosworth said on Thursday that he did not agree with the post today “and I didn’t agree with it even when I wrote it.”
“Having a debate around hard topics like these is a critical part of our process and to do that effectively we have to be able to consider even bad ideas, if only to eliminate them,” Bosworth’s statement said.