What Happens When Enough People Are Fed Enough False Information?

Published: December 14, 2021

By Jim Lichtman

In Michael Morell’s 2015 book, The Great War of Our Time, the former CIA deputy director begins with a stunning statement: “the threat of terrorism did not die with Bin Ladin in Abbottabad.” Left unchecked, Morell argues, it remains the greatest threat to America.

It’s an extraordinary book, with details and insight that are both thought-provoking and frightening.

However, with misinformation spreading at the speed of a Facebook post, digital domestic extremism has become an equally disturbing threat, in America and in other countries.

Listen to Frances Haugen, the whistleblower who testified before the Senate about Facebook’s lack of oversight of misinformation, and you are easily convinced.

Haugen realized that within Facebook’s algorithm lay a “sleeping dragon,” one that, as Time magazine describes, “would choose which post out of thousands of options, to rank at the top of users’ feeds.”

“One of the most dangerous things about engagement-based ranking,” Haugen told Time magazine in an interview, “is that it is much easier to inspire someone to hate than it is to compassion or empathy. Given that you have a system that hyperamplifies the most extreme content, you’re going to see people who get exposed over and over again to the idea that [for example] it’s OK to be violent to Muslims. And that destabilizes societies.”
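In code terms, the mechanism Haugen describes is straightforward to picture: score each candidate post by its predicted engagement and surface the highest scorers. A minimal Python sketch follows; the Post fields and weights are hypothetical, chosen only to illustrate the dynamic, not drawn from Facebook’s actual system:

    # Illustrative sketch of engagement-based ranking.
    # All fields and weights below are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        predicted_likes: float
        predicted_comments: float
        predicted_reshares: float

    def engagement_score(post: Post) -> float:
        # Reactions that spread content (comments, reshares) count for more
        # than a quiet like, so the posts that provoke the strongest
        # reactions, outrage included, rise to the top.
        return (1.0 * post.predicted_likes
                + 5.0 * post.predicted_comments
                + 10.0 * post.predicted_reshares)

    def rank_feed(candidates: list[Post], top_n: int = 25) -> list[Post]:
        # "Choose which post out of thousands of options" by score alone.
        return sorted(candidates, key=engagement_score, reverse=True)[:top_n]

Nothing in that scoring function asks whether a post is true, hateful, or violent; whatever maximizes predicted reactions wins the feed, which is the hyperamplification Haugen warns about.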

In a 2019 speech to Georgetown University students, Facebook CEO Mark Zuckerberg described the steps Facebook takes to reduce the harm caused by social media posts.

“We build specific systems to address each type of harmful content — from incitement of violence to child exploitation to other harms like intellectual property violations — about 20 categories in total. We judge ourselves by the prevalence of harmful content and what percent we find proactively before anyone reports it to us.”
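The two yardsticks Zuckerberg names, prevalence and proactive detection, are simple ratios. A short sketch, using the definitions from Facebook’s public transparency reports and invented numbers:

    # The two self-assessment metrics from the quote, as ratios.
    # Definitions follow Facebook's public transparency reports;
    # the example numbers are invented.

    def prevalence(violating_views: int, total_views: int) -> float:
        # Share of all content views that were of violating content.
        return violating_views / total_views

    def proactive_rate(found_proactively: int, total_actioned: int) -> float:
        # Share of actioned content Facebook found before any user report.
        return found_proactively / total_actioned

    print(f"{prevalence(5, 10_000):.2%}")       # 0.05%
    print(f"{proactive_rate(950, 1_000):.2%}")  # 95.00%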

Haugen has a different view.

Haugen says she agreed to join Facebook only if she could work on fighting misinformation on the platform. She discovered that everyone on her team had been newly hired and that the team lacked the data scientists she needed to do the job.

“I went to the engineering manager, and I said, ‘This is the inappropriate team to work on this.’ He said, ‘You shouldn’t be so negative.’”

Haugen’s team of misinformation specialists worked on the 2020 election. However, when the election ended, Facebook dissolved the team. Facebook says that the team wasn’t dissolved “but spread throughout the company to amplify its influence,” Time writes.

Once Haugen’s team was disbanded, she decided to act. After being told earlier this year by Facebook’s human resources department that she could no longer work remotely, Haugen collected documents, which she would later disclose, detailing how Facebook’s algorithms worked. She then shared her story in Senate testimony, suggesting that a federal agency be formed to oversee social media algorithms.

The similarities to tobacco insider Jeffrey Wigand are not surprising.

Wigand was hired by Brown & Williamson to create a safer cigarette, only to discover that his research results were being rewritten by B&W lawyers.

Haugen worked on a “civic integrity” team at Facebook to combat misinformation. “The reality right now,” Haugen says, “is that Facebook is not willing to invest the level of resources that would allow it to intervene sooner [in violence that was playing out in countries like] Ethiopia and other parts of the world.”

After bringing his findings to the head of Brown & Williamson, Wigand was fired but ultimately testified in a Mississippi court that the seven tobacco companies manipulated nicotine levels in their products and that a cigarette was really a delivery device for nicotine.

Facebook has become a delivery device for misinformation and hate.

“I was learning all these horrific things about Facebook, and it was really tearing me up inside,” Haugen says. “The thing that really hurts most whistleblowers is: whistleblowers live with secrets that impact the lives of other people.”

Wigand and Haugen share something else, though: a strong moral compass when facing a crisis of conscience. That’s what caused them both to act.

While Haugen appears financially prepared for life after Facebook, Wigand now consults with several countries on his research.

The key question that both Haugen and Wigand have brought to the public through their disclosures is this: will things change?

Not any time soon. Nicotine still comes in the form of cigarettes, and tobacco companies have now added vaping devices to deliver it.

Social media remains unregulated.
