
From the Executive Director: ‘Don’t Shoot the Messenger’ Won’t Cut It Anymore for Social Media

November 18, 2022

David Inoue

Going into the 2022 midterm elections, one of the greatest concerns was the potential for disruption of the electoral process, especially through disinformation and misinformation spread widely across the internet.

Then, days before the election, the sale of Twitter to Elon Musk was finalized, and he assumed full control of the company without shareholder or other external accountability. It seemed the perfect storm for electoral chaos was in place.

Fortunately, our elections seem to have proceeded without widespread interference from third parties, and candidates have been patient as all the votes are counted. Twitter, however, has fallen into disarray. It remains a cautionary tale about the dangers of social media and the need for further regulation and oversight — the opposite of the direction Musk is taking the company, which seems to be toward increasing chaos.

There has been a constant push and pull over how to manage social media. The ideal was that these platforms would serve as the modern town square or debate hall where people would engage in serious debate with better ideas rising to the top and untruths quickly being left to the side of the information superhighway. A perfect forum for freedom of speech.

Instead, the algorithms employed by social media companies have created echo chambers, grouping people of like mind and allowing conspiracy theories and lies to flourish as supporters finally find others who share their misguided beliefs — such as the claim that the 2020 election was stolen. As the January 6 committee has revealed, the consequences when such disinformation spreads and is accepted as truth can be catastrophic.

The platforms have made efforts to control the spread of falsehoods through active moderation and the removal of posts deemed to violate community guidelines. However, these efforts are piecemeal and half-hearted at best.

The reality is that their engagement models — and the ad revenue that grows with time spent on the site — are predicated on people finding content they want to read and interact with. Unfortunately, for too many people, the site algorithms lead them to false information.

On Dec. 14, 2020, the Federal Trade Commission announced that it was initiating an investigation into the practices of TikTok, Discord, Facebook, Reddit, Snap, Twitter, WhatsApp and YouTube to better understand how they track and collect users' demographic data and how they use that data to direct content and advertising.

Notably missing from that list is Google, which should be included, as should other platforms that have grown in prominence in the intervening two years. To date, no report has been forthcoming from the FTC.

Congress can also take action. Section 230 of the Communications Decency Act notably shields internet platforms from liability for content posted by third parties.

Since its passage, only two significant limitations have been placed on Section 230 immunity: copyright protection and the prevention of sex trafficking. This leaves a wide swath of content that platforms can still permit and share without consequence.

I recognize that these issues fly in the face of the concepts of freedom of speech, a much-cherished right in this country. However, freedom of speech does not also mean freedom from consequence.

The likelihood that social media algorithms actively directed people to the very hate speech that fueled the mass shooting at Club Q in Colorado must be taken into account, and responsibility must be laid where appropriate.

I’m sure the response from these social media giants would be “Don’t shoot the messenger!” The problem is that it’s not the messenger being shot, it is innocent clubgoers at Club Q and the Capitol Police on Jan. 6 whose lives have been lost.

The Supreme Court may believe that these corporations are people, but it is not the corporations that are being killed, it is real people. Whether it is the social media companies that spread the hate, or the gun companies that provide the instruments of mass destruction, these “people” must take responsibility for their role in the growing epidemic of violence.

We need the administration and Congress to take action together to protect us from these corporate accomplices to violence and murder against real people.

David Inoue is executive director of the JACL. He is based in the organization’s Washington, D.C., office.