A former Facebook employee has released a large trove of internal documents revealing problems with Facebook’s decision-making. The whistleblower disclosed her identity in a television interview, accusing Facebook of using algorithms that tear society apart. She also said that Facebook had vowed to fight fake news, but rolled back its countermeasures after the US election last year.
Frances Haugen, 37, a former Facebook product manager, appeared on CBS on Sunday (the 3rd).
Haugen said that Facebook’s algorithms are tuned to promote content that draws interaction and responses. She cited internal Facebook research showing that divisive and confrontational comments are more likely to provoke anger, and that inaccurate or anger-inducing content tends to keep users engaged on the platform.
Haugen claims that Facebook realised that if the company made its algorithms safer, users would spend less time on the site, which in turn would affect ad click-through rates and revenue.
Haugen said she saw a conflict at Facebook between the public interest and the company’s interest, and criticised the company for consistently choosing to optimise its algorithms for its own benefit in order to make more money.
Haugen also said Facebook took steps to control fake news and other inaccurate information circulating on the platform in the run-up to last November’s US election. But she said top management decided to relax those measures and disbanded the company’s Civic Integrity unit once they judged that the election was over and no unrest had followed.
To her, she said, Facebook’s actions amounted to a betrayal of democracy; the Capitol riot took place just months after the company relaxed those measures.
Nick Clegg, Facebook’s vice president of policy and global affairs, responded that social media has had a major impact on society in recent years, and that Facebook is often a major battleground.
He dismissed Haugen’s allegations as misleading, saying the available evidence does not show that social media platforms such as Facebook are the main cause of polarisation.