Frances Haugen, the former Meta Platforms (then Facebook) product manager turned whistleblower, spoke at the South by Southwest (SXSW) conference about social media reform, how the company's platforms fuel hate, and how to hold technology giants accountable.
Haugen joined Facebook in 2019 after working at Google and Pinterest. A decade ago, Haugen was diagnosed with celiac disease, a long-term autoimmune disorder, and in 2014 she was forced into a critical care unit after a blood clot formed in her thigh. She hired a family friend to help with her day-to-day tasks. Their friendship soon deteriorated after the friend fell victim to conspiracy theories in online forums and claimed that dark forces were working to manipulate politics; he was drawn into the world of witchcraft and white nationalism. Although the friend has since abandoned these beliefs, the experience changed the course of Haugen's career. She realized that technology platforms have a dark side and that conspiracy theories can draw in ordinary people.
In 2018, when a Facebook recruiter contacted her, she asked to work in the unit responsible for fighting misinformation, and by 2019 she was a product manager on the Civic Integrity team. According to a Los Angeles whistleblower attorney, Haugen's disclosures have since inspired a new generation of whistleblowers to speak out about corporate wrongdoing.

Frances Haugen at SXSW
At SXSW, she criticized Meta's reliance on automated systems and artificial intelligence (AI) to moderate content. In April 2018, the company's chief executive officer (CEO), Mark Zuckerberg, said he believed AI was the solution to combating abuses such as fake news, hate speech and propaganda. Haugen believes the company is overly dependent on these tools.
Haugen says Meta's own research shows that AI reduces hate speech by only 3% to 5%, violent content by 0.08% and graphic violent content by 8%. The company disputes this charge, saying that hate speech fell by 50% in the first nine months of 2021. People remain the key to content moderation: it takes humans to judge the context of what is being said; otherwise the system risks censoring content that is not actually "abusive" while offering no effective way to judge questionable posts.

Haugen pointed to Twitter's success with a feature that prompts users to click on a link before sharing it. According to Haugen, this reduces the spread of misinformation by 10% to 15%. That way, Twitter makes sure you have at least seen the article before sharing it, and no censorship comes into play.
Frances Haugen believes that Meta could do a better job of moderating content by adding such features to its platforms, but that fear of declining profitability has led it to drag its feet on the issue. For her, adding these features would not mean censoring anyone or choosing which ideas win. However, she acknowledged that their numbers are not enough to outvote Zuckerberg.
Haugen says the number one priority for technology reform must be greater transparency. AI, she argues, is often a screen that allows technology companies like Meta to claim they are fighting misinformation without actually doing anything meaningful to stop it.