You might cringe at the mention of TikTok, the wildly successful Chinese social-media platform. But you’d be a fool to underestimate it. Downloaded some three billion times – more than the combined populations of China and India – TikTok brought in $34 billion in revenue last year, more than the individual GDP of half the world’s countries. The platform has risen to global popularity and made its investors rich through an advanced recommendation algorithm. This AI-enabled program is remarkably efficient – perhaps more so than those of Facebook, Instagram and other social-media platforms. And in an age when people are glued to their phones, algorithms this powerful can shape public opinion and maybe even rewire how we think.
Here, then, is the true dark side of social media. You might bemoan the death of the dinner conversation. You might be puzzled by how everyone seems to have their phone in hand at all times. You might even rail against the fact that no one knows how to read a map or give directions anymore. More ominously, social-media algorithms can make people more prone to fringe ideas about almost anything, from diet fads to racism. They have the power to transform how humans process and articulate information. If you’ve ever suspected that the world today is crazier than in your youth, you should pay attention to social-media algorithms.
Let’s start by unpacking the secret of TikTok’s success. TikTok is a video-sharing platform built around short clips that range from 15 seconds to a minute. A typical user might flip through dozens or even hundreds of videos in an hour, compared with other video-sharing platforms like YouTube, where users typically get through about 10 videos an hour.
The speed at which users cycle through these videos generates an enormous amount of data. According to a report, the most critical signal for TikTok’s algorithm is the amount of time a user spends on a video clip. To give you an idea of the scale of TikTok’s reach, its users have cumulatively spent the equivalent of 320,000 years on the platform since March 2020 – roughly the span of time separating the Stone Age from the present.
If TikTok could accelerate human civilization, we might have invented intergalactic travel by now. Instead, we have cat videos.
Using the data it collects, TikTok serves up content it believes will keep its users on the platform for as long as possible (the goal of every social-media network that sustains itself through advertising). So, if a user is interested in cat videos, TikTok’s algorithm will show them an endless stream of amusing feline antics.
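The mechanism described above – tracking how long a user lingers on each clip, then ranking new clips accordingly – can be illustrated with a toy sketch. This is not TikTok’s actual system (which is proprietary and far more sophisticated); the topic labels, field names and averaging scheme here are hypothetical, chosen only to show how watch time alone can steer a feed toward ever-narrower interests.

```python
from collections import defaultdict

def build_topic_affinity(watch_history):
    """Average seconds watched per topic, from (topic, seconds) pairs."""
    totals, counts = defaultdict(float), defaultdict(int)
    for topic, seconds in watch_history:
        totals[topic] += seconds
        counts[topic] += 1
    return {t: totals[t] / counts[t] for t in totals}

def rank_feed(candidates, affinity):
    """Order candidate videos by the user's affinity for their topic.

    Unseen topics default to zero affinity, so they sink to the
    bottom of the feed -- the siloing effect in miniature.
    """
    return sorted(candidates,
                  key=lambda v: affinity.get(v["topic"], 0.0),
                  reverse=True)

# Hypothetical history: the user lingers on cat videos, skips news.
history = [("cats", 58), ("cats", 60), ("news", 4), ("cooking", 20)]
affinity = build_topic_affinity(history)

feed = rank_feed(
    [{"id": 1, "topic": "news"},
     {"id": 2, "topic": "cats"},
     {"id": 3, "topic": "cooking"}],
    affinity,
)
# Cat videos rise to the top because the user lingered on them longest.
```

Run repeatedly, a loop like this compounds: whatever the user watches longest dominates the next feed, which shapes what they watch next.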
Now, cat videos might be harmless, but the nature of the algorithm is not. The more time you spend on platforms like TikTok, the more siloed and specific the content served to you becomes. This fundamentally restructures how we discover new information and removes serendipity from our lives.
For this reason and others, scientists are sounding the alarm. In a new paper in the journal Proceedings of the National Academy of Sciences, biologists and ecologists argue that the digital age and social media “have accelerated changes to our social systems, with poorly understood functional consequences.” Given these gaps in knowledge, the scientists contend that the study of our collective behavior under the influence of algorithms must be elevated to a “crisis discipline,” just like climate science. In other words, there should be a new focus on actionable insights that policymakers can use to safeguard social systems.
Carl Bergstrom, a biologist at the University of Washington and co-author of the paper, said in an interview that “social media in particular – as well as a broader range of internet technologies, including algorithmically driven search and click-based advertising – have changed the way that people get information and form opinions about the world. And they seem to have done so in a manner that makes people particularly vulnerable to the spread of misinformation and disinformation.”
We don’t yet fully understand how these changes affect the way we form beliefs and opinions (and thus influence decision making). Scientists are effectively in the dark about how our minds are being transformed in this new information landscape. We even lack standardized methods of managing these trends. This could have detrimental effects on our emotional wellbeing, our health, our ecosystems and how we govern ourselves. As the algorithms get more powerful, our understanding of their effects lags further behind.
A concrete example of how algorithms create real problems can be found in the growing trend of Covid-19 vaccine hesitancy. If you have concerns about Covid vaccines and use TikTok (or Facebook or Instagram), chances are high that you will fall down a rabbit hole of vaccine misinformation served up by the platform’s algorithm.
Just as worrying is the rise of racism, as social-media platforms such as Twitter offer up more of the same to reinforce, and perhaps normalize, intolerance and hate among a segment of the population. In less than a decade, we have seen the rapid evaporation of a century’s worth of accrued progress on social justice and natural rights. What might happen next?
Since the smartphone revolution made them an integral part of our lives, algorithms have grown more powerful and pervasive. At the same time, our understanding of their influence on society and the way we think has stagnated. Elevating the study of social media to a crisis discipline is the first step toward getting algorithms back under our control. Whether it’s TikTok, Facebook or Twitter, beware of the technology that enables cat videos.
Joseph Dana is the senior editor of Exponential View, a weekly newsletter about technology and its impact on society. He was formerly the editor-in-chief of emerge85, a lab exploring change in emerging markets and its global impact.