Kids are more deeply in the grip of social media companies than we realize. So many of them have ceded their online autonomy to their phones that they balk at the very idea of searching the internet. For them, the only acceptable online environment is one curated by big tech algorithms that feed them personalized content.
As our children's leisure time and imaginations become ever more entwined with the social media they consume, we need to understand that unregulated internet access comes at a cost. Something similar is happening to adults. With the advent of artificial intelligence, a spiritual loss awaits us as we outsource countless human rituals, like exploration and trial and error, to machines. But it is not too late to change this story.
This spring, I visited with a group of high school students in suburban Connecticut for a conversation about the role social media plays in their daily lives and mental health. More children today report feeling depressed, lonely, and disconnected than ever before. More teens, especially girls and LGBTQ teens, are seriously contemplating suicide. I wanted to be honest about how social media can both help and hurt mental health. By the end of the 90-minute conversation, I was more concerned than ever about the well-being of our children, and about the society they will inherit.
There are many problems with children and teens using social media, from declining mental health to dangerous and age-inappropriate content to tech companies' lackluster efforts to enforce their own age-verification rules. But the high school students I met alerted me to a more insidious consequence of minors' growing dependence on social media: the death of exploration, trial and error, and discovery. Algorithmic recommendation now does the work of detecting and pursuing interests, finding community, and learning about the world. Kids today simply are not learning how to become curious, critical adults, and they don't seem to know what they're missing.
A week before meeting with the students, I introduced the Protecting Kids on Social Media Act with three of my fellow senators: Brian Schatz, Democrat of Hawaii, and Republicans Katie Britt of Alabama and Tom Cotton of Arkansas. The bill is a sweeping attempt to protect young people on social media, prioritizing stronger age-verification practices and banning children under 13 from using social media altogether. But one provision of the bill was particularly troubling to this group of students: a ban on social media companies using the data they collect on minors (what they watch, what they linger on) to build and power the algorithms that push individualized content back to users. These high school students had become reliant on, perhaps even addicted to, the algorithms of social media companies.
Their reliance on technology should sound familiar to most of us. Many of us can hardly remember a time when we didn't have Amazon to fall back on for a last-minute gift, or when we had to wait for the radio to play our favorite song. Today, information, entertainment, and connection are delivered to us on a conveyor belt, requiring less effort and exploration than ever before.
Losing the ritual of discovery comes at a cost. We all instinctively know that life's journeys matter as much as the destinations. In wandering, we learn what we like and what we don't. The sweat of the search makes the eventual find all the more satisfying.
Why should students go out of their way to find a song or poem they might like when an algorithm will do it for them? Why risk exploring something new when their phones serve up never-ending content about the things they already care about?
What the kids I spoke to may not have known is that these algorithms are designed in ways that make, and keep, users unhappy. According to an advisory issued by the surgeon general this year, "there are ample indicators that social media can also have a profound risk of harm to the mental health and well-being of children and adolescents." A report from the nonprofit Center for Countering Digital Hate found that users can be served suicide-related content less than three minutes after downloading TikTok. Five minutes after that, they can come across a community promoting eating disorder content. Instagram is full of softcore pornography that acts as a gateway to hardcore material on other sites (which often have equally lax age verification). And all over social media are highly curated and filtered fake lives, breeding feelings of envy and inadequacy in the developing minds of teens.
Social media companies know that content that generates negative feelings holds our attention longer than content that makes us feel good. It's the same reason the local news leads with a shooting or a house fire, not a community food drive. If you're a teenager feeling bad about yourself, your social media feed will keep serving up videos and photos that are likely to exacerbate those negative feelings.
These kids may think they need the algorithms, but the algorithms are actually making many of them miserable. It is no coincidence that rates of teenage sadness and suicide have risen just as algorithmically driven social media content has taken over the lives of children and teens.
The observations of those students in Connecticut left me more convinced than ever that this legislation is vital. By taking steps to wean young people off their dependence on social media and pushing them back toward genuine exploration in search of connection and fulfillment, we can restore the lost rituals of adolescence that, for generations, made us who we are.
The role social media has played in the decline of adolescent mental health also gives us a preview of what awaits adults as artificial intelligence and machine learning are rapidly deployed into our lives. The psychological impact of handing thousands of basic, everyday human tasks over to machines will make the influence of social media look like child's play. Today, algorithms help us find a song we like. Tomorrow, machines will not just find the song; they will create it. Just as we were unprepared for the impact social media algorithms would have on our children, we are likely unprepared for the spiritual toll of outsourcing countless human functions to computers.
Regardless of whether the Protecting Kids on Social Media Act becomes law, we need a broader conversation, among adults and children from all walks of life, about whether we will truly be happier as a species when machines and algorithms do all the work for us, or whether fulfillment comes only when humans do genuinely human work, like searching and discovering.