What TikTok’s U.S. Spin-off Means for Its Algorithm and Content Moderation

Will American Ownership Change What TikTok Shows You?

TikTok’s U.S. spin-off could reshape its algorithm and the way culture is curated online.

Rachel Feltman: For Scientific American’s Science Quickly, I’m Rachel Feltman.

TikTok’s algorithm, which shapes what more than a billion users see, has developed an almost mystical reputation for figuring out what people want to watch. Those powers aren’t actually magical, but they do matter. An algorithm as widely used as TikTok’s can have a huge impact on our culture by determining what information people receive and how.

As TikTok prepares to spin off a U.S.-only version of the app with majority-American ownership, plenty of questions loom about how the platform—and its almighty algorithm—might change. Will new investors reshape what kinds of content are promoted or suppressed?


Here to break down what we know about the highly anticipated TikTok sale and what it might mean for the platform’s future is Kelley Cotter, an assistant professor in the Department of Human-Centered Computing and Social Informatics at Pennsylvania State University.

Thank you so much for coming on to chat today.

Kelley Cotter: Of course, I’m glad to be here, and thanks for inviting me.

Feltman: So would you start by telling us a little bit about your background—you know, what kind of research you do?

Cotter: And then I have a book that’s under contract right now with Oxford University Press on critical algorithmic literacy, so one of the things I’m interested in is understanding how what we know about algorithms can help us govern them in a more bottom-up fashion—and also thinking about our understanding of platforms, and the practices we have around them, as contextual insights.

Feltman: What do you think is lacking in most people’s understanding of the algorithms that power the social media they use?

Cotter: So when I started researching this, maybe almost 10 years ago, there was still a large portion of the population who weren’t even really aware that these processes existed to sort and filter content online. Now I think that has changed quite a bit: most people probably have some awareness of these processes happening. They have some awareness that what they see in their feeds isn’t everything that they could possibly see. And I think they also have a basic understanding of how that works, so they know that this depends upon their activity on the sites: the things that they engage with, the things they watch, the things they share, the things they comment on, all that kind of stuff.

I think anything higher level than that—the more complex technical understanding—is more out of reach. But people’s awareness of the impacts or consequences of algorithms is also limited. People are often aware of their own encounters with algorithms because we learn a lot about them through our own experiences, but there’s not a broad understanding of the ways algorithms might be reshaping broader societal processes.

Feltman: Mm. So you recently wrote a piece for The Conversation about the TikTok sale and how it relates to the kind of infamous TikTok algorithm. To start us off, what do we know about the TikTok sale? What’s going on there?

Cotter: So we have some details at this point—not a full picture, but we have some details. We know that the deal is going to create a new U.S.-only app, spun off from the original app, and that it’s going to be majority-owned by American companies—about 80 percent—with less than 20 percent held by Chinese investors, including ByteDance, TikTok’s parent company.

And the main driver of creating this deal originally had to do with concerns about the app being under Chinese control. One of the key focal points was the algorithm, because there were concerns about the ways it could be manipulated to shape the content that users see in their feeds in ways that U.S. lawmakers found concerning. So the algorithm would be licensed to this new American company, and they would retrain it and rebuild it for the U.S.-only app.

Feltman: Yeah, and why is the fate of TikTok’s algorithm such a big part of this conversation, you know, even now that it wouldn’t be in the hands of a foreign power?

Cotter: The algorithm is at the heart of everything that TikTok does. Every social media platform really revolves around the functions that its algorithms perform. Algorithms are designed to tailor content to user preferences—to make users’ experiences meaningful and valuable; that’s the goal. But it also means that they play a central role in shaping culture by making certain kinds of content more or less visible.

So they sort and filter content for folks and also enforce some of the community guidelines that social media companies set—to make sure that the content people see in their feeds isn’t excessively gory or doesn’t promote violence; historically, there was also concern about minimizing misinformation. So there are different ways the algorithm is supposed to optimize feeds to lift up the best content, and the best content for the individual user.

Feltman: As someone who’s studied social media algorithms for nearly a decade, what’s unique about the one that powers the TikTok “For You” page—both algorithmically and in the ways people feel that it works, if that makes sense?

Cotter: Yeah, the TikTok algorithm is perceived to be especially good at tailoring content for users. There’s a popular conception of it as knowing people better than they know themselves. And some of my research with colleagues has investigated those kinds of beliefs and the ways that they converge in this really curious mixture of spiritual beliefs and conspiracy theorizing, where there’s sometimes a perception that what people see in their feeds is somehow cosmically destined for them; it’s meant for them specifically. So there are perceptions of the algorithm as being very powerful and good at its intended purpose.

In many ways the algorithm isn’t especially different from other social media algorithms. It’s designed in the same way, where the goal is to keep users on the site and keep them coming back; that’s what it’s optimized for. And like other social media algorithms, it relies on signals from people’s behavior on the site—again, the things that they like, the things they comment on, the things they share—as signals of interest.

One Wall Street Journal investigation suggested that watch time on TikTok is an especially strong signal of interest used by the algorithm to rank content. One reason the TikTok algorithm might be better at tailoring content is the short-video format: it’s easier to get a read on what interests people from how long they spend watching any given piece of content relative to everything else.

It also has unique features that promote more connections between creators and users. There’s the Stitch function, where people respond to different videos—they’ll splice in a video from another creator and respond to it with their own video. There are sounds, where people can use similar sounds to create memes and different conversations or to promote similar ideas. So there are ways that connections across users are facilitated by platform features, and that could be helpful for understanding user preferences.

But it’s not entirely clear why it is—or at least is perceived as—especially good at tailoring content. We have some information about how it works, but it’s hard to pin down any one reason.

Feltman: So given what we know about the proposed buyers for TikTok and the potency of the TikTok algorithm, what are the implications if the sale goes through?

Cotter: Yeah, because the algorithm is so central to life on the platform—to what it is—it matters whose hands it’s in, because it will directly shape what the platform, this new American app, will look like.

Part of this has to do not with the ownership but with the new app itself, because its users will be American only—they say global content will still be visible on the platform, but the users for this app will be American. So if this new algorithm, as licensed from ByteDance, is retrained on U.S.-only users, then American values, preferences and behaviors will inform the curation of content on the site, and we might expect to see some subtle shifts just by nature of that different dataset it’s being built on.

And if users perceive the new app to be in the hands of Trump allies or to be more conservative-leaning, and have concerns that those investors might exert influence on the content in the app, we might expect to see some users leave. So it could result in a situation where the app is composed not only of people based in the U.S. but of only a subset of American users—particularly ones that might be right-leaning—which would also have a very big impact on the kinds of content that you see there.

So ultimately, the new app might look drastically different than it does right now, depending on what happens with decisions made by the investors, decisions by users, by who stays and who goes, and all that.

Feltman: Well, thank you so much for coming on to talk through this with us. We’ll definitely be reaching out to chat more if this sale goes through.

Cotter: Yeah, I’d be happy to chat more. Thanks again for having me.

Feltman: That’s all for today’s episode. We’ll be back on Friday to find out how Halloween treats can play tricks with our gut microbes.

For Scientific American, this is Rachel Feltman. See you next time!

Rachel Feltman is former executive editor of Popular Science and forever host of the podcast The Weirdest Thing I Learned This Week. She previously founded the blog Speaking of Science for the Washington Post.

Fonda Mwangi is a multimedia editor at Scientific American and producer of Science Quickly. She previously worked at Axios, the Recount and WTOP News. She holds a master’s degree in journalism and public affairs from American University in Washington, D.C.

Alex Sugiura is a Peabody and Pulitzer Prize–winning composer, editor and podcast producer based in Brooklyn, N.Y. He has worked on projects for Bloomberg, Axios, Crooked Media and Spotify, among others.
