Easier said than done.
However, Frances Haugen, the former Facebook data scientist who came forward with thousands of internal documents to back up her claims that the engagement-based algorithms created by the tech giant are used to deepen “divides, destabilize democracies, and make young girls and women feel bad about their bodies,” has been in talks with the House Judiciary Committee to help secure a way to rein in the tech giant’s influence.
“Haugen told the Senate Commerce Committee in early October that she does not believe Facebook should be broken up because it would limit the platform’s ability to moderate dangerous and illegal content. Instead, she called for regulation, including oversight of social media companies’ algorithms.”
Clearly, any panel looking into ways to limit the negative effects of social media on users should include someone grounded in ethics. Given her tech background, her experience inside Facebook, and her strong moral sense, Haugen would be an invaluable asset to lawmakers in deciding what should be done.
“A company with such frightening influence over so many people, over their deepest thoughts, feelings and behavior needs real oversight,” Haugen said during her testimony. “These systems are going to continue to exist and be dangerous even if broken up.”
And there’s another individual with a similar moral sense whom I would recommend.
Investor Roger McNamee was involved with Facebook in its early days and speaks with some of the same direct knowledge. When he began to see the potential damage Facebook could cause, he was alarmed and pressed founder Mark Zuckerberg and COO Sheryl Sandberg “to adopt human-driven technology over addictive, dangerous algorithms.”
But nothing came of it.
The reason: Facebook had become an extraordinary money-making machine for investors, and there was no meaningful incentive to change its business model.
As McNamee sees it, “We need legislation to address three related problems across the entire technology world: safety, privacy and competition.
“We need something like an FDA for technology products, designed to prevent harmful technologies from coming to market. For qualifying products, it would set safety standards, require annual safety audits and certification as a condition for every product, and impose huge financial penalties for any harms that result. There should also be amendments to Section 230 of the Communications Decency Act to create better incentives for Internet platforms.”
The second area of concern is the sale of personal data.
“Congress also needs to protect people’s privacy from relentless surveillance. My preference would be for Congress to ban surveillance capitalism just as it banned child labor in 1938. (The many industries that employed child labor complained then that they could not survive without it.) At a minimum, Congress must ban third-party use of sensitive data, such as that related to health, location, financial transactions, web browsing and app data.
“The third area for legislation is competition,” McNamee writes, “where Congress needs to update antitrust laws for the 21st century. The six-hour outage of Facebook, Instagram, and WhatsApp illustrated for many one downside of a monopoly: absolute dependence on a service.”
Yet, there is another area that could force change within Facebook: its employees.
“Until very recently, Zuckerberg could count on the loyalty of employees. . . . It was not until 2020—when Facebook was used by white supremacists to spread hate after the murder of George Floyd, by allies of the Trump Administration to downplay the significance of COVID-19, by Trump loyalists to undermine confidence in the election, and then in 2021 by the antivaccination movement to undermine the nation’s pandemic response—that employees openly challenged management in significant ways.”
In response to Haugen’s testimony, Zuckerberg tried to downplay the problem.
“I’m sure many of you have found the recent coverage hard to read because it just doesn’t reflect the company we know. We care deeply about issues like safety, well-being and mental health. It’s difficult to see coverage that misrepresents our work and our motives. At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted.”
Nonetheless, McNamee argues that, given the “evidence of the past five years, one might say that Internet platforms have launched an attack against democracy and self-determination. It is a battle they will win unless voters and policymakers join forces to reassert their power.”
Congress needs to listen to Haugen, McNamee, and Facebook employees who challenge the way the company influences attitudes and actions. With their skill and support, Congress can change social media’s business model from making money at all costs to ensuring that responsibility and accountability come first.