A TikTok ban isn't enough — Congress must act to get children away from social media


Social media is horrendously broken. Bad actors are exploiting it while the platforms turn a blind eye. Our children are suffering — and our leaders won’t do anything serious about it.

The House recently cooked up a legally dubious bill forcing TikTok's owner to sell the company or face delisting from U.S. app stores. It's a bad bill with good intentions, which it pursues by crushing free-speech rights and putting the government's thumb on the scale of free enterprise. Nonetheless, the bill passed with bipartisan support and President Biden signed it. Naturally, it immediately went to court, where it will languish for years.

Now there's a different bipartisan effort to repeal a 1990s-era provision of the Communications Decency Act known as "Section 230." It says, very simply, that platforms are generally not liable for content created by their users. Congress now wants to scrap Section 230 in favor of some nebulous future in which every platform is liable for anything anyone says on it.

We enjoy the ability to comment on news articles. We like sharing opinions with our friends, family and strangers online. Repealing 230 puts all of that into question, because platforms would suddenly become legally responsible for anything you say, and anyone who doesn't like your criticism could sue the platform for letting you say it.

That legal risk is too much of a burden for any corporation to bear. It also flies in the face of a First Amendment meant to protect the healthy discourse and debate that democracy depends on.

There's no question we should be wary of Chinese Communist Party influence, direct or otherwise, over an addictive app beloved by billions. There is equally no question that social media platforms should be responsible actors, not ones that hide behind their Section 230 exemption while editorializing content and churning out dopamine hits for the general public. By following the letter of the law while nakedly trampling its spirit, the platforms have allowed their worst elements to produce content that violates every standard of decency, putting the good-faith protections of Section 230 in jeopardy. The platforms know it.

Dozens of children have died completing stupid TikTok challenges such as the blackout challenge or eating laundry pods. Facebook and Instagram's algorithms, which are designed to reinforce people's biases, allegedly lead predators directly to children's content.

Suicide is the second leading cause of death among youth ages 13 to 17. Studies have found that feedback-seeking, image-based social media such as Instagram contribute directly to the depression behind those suicides, even when time spent on the platforms is limited.

Before we start legislating who owns what and who is legally liable for what and where it’s said, we need more fundamental change. Platforms should not be allowed to target children, make money off of them or algorithmically exploit them at all. They should not be able to treat children like a product to be sold to the highest bidder. 

My new platform, Hedgehog, doesn’t allow minors at all. Why shouldn’t others step up and do the same? Is it because they’re making billions of dollars a year targeting children? 

Many states are acting to restrict children's access to social media. Still, this runs into the same problem raised in the so-called NetChoice cases recently before the Supreme Court: one state can impose laws that effectively set a national or even global standard. The patchwork approach does not work; it leaves dangerous loopholes to be exploited and makes competition harder.

I believe in limited government, limited regulation and the free market. But to the extent that we're going to have a government and count on it to legislate, this is the time and the place. Congress must act to protect children by setting rational limits on age of use, usage volume, ad targeting and other ways social media companies exploit children. Focus on the real harms, not the politically popular ones.

Because once again, Congress is getting it all wrong, making campaign commercials instead of actual policy. The algorithm is harming our children even as the adults squabble.

John Matze is the founder and former CEO of Parler and is now founder and CEO of Hedgehog.
