Social Media’s Algorithm Problem

For too long, social media has been a haven for third-party shares of misinformation, false conspiracies, and blatant lies. As belief turns to anger, and anger to hate, some have become radically motivated, violent extremists.

While many of these companies have taken some steps to stem the tide of hate and violence, our worst impulses have rapidly overtaken those efforts, dividing us from one another and leading to the violent extremes we witnessed on January 6.

What’s to be done?

In an extraordinary commentary, Yael Eisenstat offers a withering assessment and some important ways Congress can act.

“A former CIA officer, White House advisor, and head of a global risk firm, Eisenstat is a Visiting Fellow at Cornell Tech’s Digital Life Initiative, where she works on technology’s effects on civil discourse and democracy. In 2018, she was Facebook’s Global Head of Elections Integrity Operations for political ads.”

Writing in the Harvard Business Review, Eisenstat says, “The problem with social media isn’t just what users post — it’s what the platforms decide to do with that content. Far from being neutral, social media companies are constantly making decisions about which content to amplify, elevate, and suggest to other users. Given their business model, which promotes scale above all, they’ve often actively amplified extremes. …

“It is time to define responsibility and hold these companies accountable for how they aid and abet criminal activity. …

“We need to change our approach not only because of the role these platforms have played in crises like [the January 6 attack on the Capitol] but also because of how CEOs have responded — or failed to respond. The reactionary decisions on which content to take down, which voices to downgrade, and which political ads to allow have amounted to tinkering around the margins of the bigger issue: a business model that rewards the loudest, most extreme voices.

“Yet there does not seem to be the will to reckon with that problem. Mark Zuckerberg,” Eisenstat points out, “did not choose to block Trump’s account until after the U.S. Congress certified Joe Biden as the next president of the United States. Given that timing, this decision looks more like an attempt to cozy up to power than a pivot towards a more responsible stewardship of our democracy. And while the decision by many platforms to silence Trump is an obvious response to this moment, it’s one that fails to address how millions of Americans have been drawn into conspiracy theories online and led to believe this election was stolen — an issue that has never been truly addressed by the social media leaders. …

“Guardian journalist Julia Carrie Wong wrote in June of this year about how Facebook algorithms kept recommending QAnon groups to her. Wong was one of a chorus of journalists, academics, and activists who relentlessly warned Facebook about how these conspiracy theorists and hate groups were not only thriving on the platforms, but how their own algorithms were both amplifying their content and recommending their groups to their users.

“The key point is this,” Eisenstat makes clear, “This is not about free speech and what individuals post on these platforms. It is about what the platforms choose to do with that content, which voices they decide to amplify, which groups are allowed to thrive and even grow at the hand of the platforms’ own algorithmic help.

“So where do we go from here?

“I have long advocated that governments must define responsibility for the real-world harms caused by these business models, and impose real costs for the damaging effects they are having on our public health, our public square, and our democracy. As it stands, there are no laws governing how social media companies treat political ads, hate speech, conspiracy theories, or incitement to violence. This issue is unduly complicated by Section 230 of the Communications Decency Act, which has been vastly over-interpreted to provide blanket immunity to all internet companies…

“One solution I continue to push is clarifying who should benefit from Section 230 to begin with, which often breaks down into the publisher vs. platform debate. To still categorize social media companies — who curate content, whose algorithms decide what speech to amplify, who nudge users towards the content that will keep them engaged, who connect users to hate groups, who recommend conspiracy theorists — as ‘internet intermediaries’ who should enjoy immunity from the consequences of all this is beyond absurd.

“The notion that the few tech companies who steer how more than 2 billion people communicate, find information, and consume media enjoy the same blanket immunity as a truly neutral internet company makes it clear that it is time for an upgrade to the rules. They are not just a neutral intermediary.

“By insisting on real transparency around what these recommendation engines are doing, how the curation, amplification, and targeting are happening, we could separate the idea that Facebook shouldn’t be responsible for what a user posts from their responsibility for how their own tools treat that content. I want us to hold the companies accountable not for the fact that someone posts misinformation or extreme rhetoric, but for how their recommendation engines spread it, how their algorithms steer people towards it, and how their tools are used to target people with it.

“To be clear: Creating the rules for how to govern online speech and define platforms’ responsibility is not a magic wand to fix the myriad harms emanating from the internet. This is one piece of a larger puzzle of things that will need to change if we want to foster a healthier information ecosystem….

“As long as we continue to leave it to the platforms to self-regulate, they will continue to merely tinker around the margins of content policies and moderation…”

“Democracy,” Franklin Roosevelt said, “cannot succeed unless those who express their choice are prepared to choose wisely. The real safeguard of democracy, therefore, is education.”

For too many, however, social media is the educator, and its only curriculum is whatever its algorithms steer people toward: what they already want to hear and believe.

If steps aren’t taken to rein in social media’s algorithms, and if we don’t learn to restrain our worst impulses, this democracy is in danger of falling into a chasm of hate and violent extremism.
