This Tuesday, following its recent charm offensive in Washington, DC, TikTok hosted journalists at its Los Angeles headquarters to unveil a new center it has created to woo American policymakers, regulators, and civil society leaders.
“How much of a national security threat is it to join the wifi network here?” NPR technology reporter Bobby Allyn joked as he waited with me and other attendees for executive presentations to start. TikTok staffers looked unsure of what to say until Allyn reassured them he was just kidding.
The exchange revealed the tension underlying the friendly press invitation: TikTok, an increasingly influential social media app used by over 130 million Americans, is facing intense political scrutiny in the US over its parent company’s ties to China. A little less than three years after President Donald Trump tried to ban it, the company’s negotiations with US regulators have stalled and it’s facing renewed calls for a national ban. Already, 17 US states have banned the app from government-issued devices.
TikTok’s new Los Angeles Transparency and Accountability Center offers a behind-the-scenes view into TikTok’s algorithms and content moderation practices, which have attracted controversy because of concerns that the wildly popular app could be weaponized to promote pro-Chinese government messaging or misinformation.
The information TikTok offered about its algorithms and content moderation wasn’t particularly illuminating, but what stood out were the details it shared about its plan to split parts of its US operations from China, while still being owned by a Chinese company. The event also presented a rare opportunity for reporters to ask questions of a broad cross section of TikTok’s staff about its content policies and algorithms.
In her opening remarks to reporters, TikTok COO Vanessa Pappas acknowledged general skepticism around the power social media platforms have over parts of our digital lives — without mentioning any specific political concerns with TikTok.
“We really do understand the critique,” said Pappas about the role Big Tech has in controlling “how algorithms work, how moderation policies work, and the data flows of the systems.”
But, Pappas said, TikTok is meeting this concern by offering what she calls “unprecedented levels of transparency” through efforts like its new center and plans to open TikTok’s API to researchers.
The elephant in the room
There’s one big reason we were all at TikTok’s offices: China. But Pappas and the company’s other leaders never actually said “China” in their on-the-record remarks.
TikTok is owned by a Chinese company, ByteDance, which operates its own version of TikTok’s app, called Douyin, in China.
Critics have long argued that any Chinese-owned company is beholden to China’s national security laws, meaning ByteDance employees could be compelled to surveil Americans or manipulate TikTok’s recommendation algorithms in service to the Chinese government. While there’s no evidence that the Chinese government has directly demanded American user data from TikTok or its parent company, investigative reporting by BuzzFeed News revealed that as recently as June 2022, Chinese TikTok employees could access US users’ data.
At Tuesday’s event, TikTok shared more on how it plans to reassure the public that it won’t be influenced by the Chinese government. Its “Project Texas” is a major partnership with the Texas-based tech giant Oracle to move all US data that was previously stored on TikTok’s foreign servers to the US. The project also entails inviting a team of outsiders, including from Oracle, to audit its algorithms.
Another part of the project will create a new subsidiary called TikTok US Data Security (USDS) that will oversee the app’s content moderation policies, train TikTok’s recommendation engine with US user data, and authorize editorial decisions. Under TikTok’s plan, USDS employees will report to a yet-to-be-finalized independent board of directors with strong national security and cybersecurity credentials.
This all comes about a month after TikTok was found to be spying on Forbes journalist Emily Baker-White, who was covering leaked details about the project. TikTok acknowledged that several of its employees improperly accessed Baker-White’s private user data, along with that of several other journalists, in an attempt to identify and track down their private sources. The company fired the employees involved in the surveillance and said they had “misused their authority” to obtain user data, but the incident only fueled suspicions about the company.
These suspicions could be one reason TikTok’s negotiations with the Committee on Foreign Investment in the United States, or CFIUS, are dragging on. CFIUS is an interagency government committee that reviews whether business deals threaten US national security. It has been reviewing ByteDance’s 2017 acquisition of Musical.ly, which ByteDance later merged into TikTok, giving the committee the power to unwind the deal and force ByteDance to sell TikTok to a US company. TikTok and CFIUS were reportedly close to reaching an agreement to avoid that scenario, but negotiations have stalled.
It’s widely acknowledged that political escalations between China and the US have played a role in the delay. It’s not a good time for political agencies or elected officials — including President Biden, who would need to sign off on the deal — to support anything seen as pro-China.
“TikTok has realized that this is truly a political matter. It’s less about convincing national security authorities and more about convincing politicians,” said Anupam Chander, a professor of law and technology at Georgetown University.
Chander was part of a small group of academics, lobbyists, and data privacy experts that TikTok briefed about Project Texas in Washington, DC, a few weeks ago. The challenge, Chander said, is that “today, in certain political circles, any ties to China are poison.”
That might explain why TikTok executives steered clear of mentioning China on Tuesday.
Going under the hood
TikTok’s new Transparency and Accountability Center offered reporters details on its elusive recommendation algorithm and some tangible examples of how the app moderates content, but fell short of anything revelatory.
One tutorial in the center, called the “code simulator,” was all about TikTok’s recommendation algorithm. It explained that the first time you open the app, you’re shown eight videos on trending topics that TikTok thinks you might be interested in. The app then refines its understanding of your interests based on which videos you’ve liked, viewed, and shared, which accounts you follow, and what people in similar demographics are interested in. The tutorial showed snippets of the code used to program the machine learning models that recommend that content.
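To make the cold-start-then-refine flow the tutorial describes concrete, here is a deliberately simplified sketch in Python. Every function name, signal weight, and data structure below is hypothetical; TikTok’s actual system relies on machine learning models over far richer signals than this toy tag-counting approach.

```python
from collections import Counter

def cold_start_feed(trending_videos, n=8):
    """First session: show the top trending videos (the tutorial says eight)."""
    return trending_videos[:n]

def refine_feed(candidates, liked, viewed, shared, followed_tags, n=8):
    """Later sessions: rank candidate videos by accumulated interest signals.

    The weights here (shares and likes count more than passive views) are
    made up purely for illustration, not taken from TikTok.
    """
    interests = Counter()
    for tags, weight in ((liked, 3), (shared, 3), (followed_tags, 2), (viewed, 1)):
        for tag in tags:
            interests[tag] += weight
    # Score each candidate by how much its tags overlap with the user's interests.
    def score(video):
        return sum(interests[tag] for tag in video["tags"])
    return sorted(candidates, key=score, reverse=True)[:n]

# Example: a user who liked a dance video sees dance content ranked first.
candidates = [{"id": 1, "tags": ["cooking"]}, {"id": 2, "tags": ["dance"]}]
feed = refine_feed(candidates, liked=["dance"], viewed=["cooking"],
                   shared=[], followed_tags=[])
```

In this sketch, the “dance” video outranks the “cooking” one because a like is weighted more heavily than a passive view, which mirrors the general idea (though not the specifics) of the refinement process the tutorial walks through.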
The second — and more engaging — educational exercise was a simulation of what it’s like to moderate controversial content on TikTok. One video showed a man making jittery movements with his arms with a caption saying he had just received a dose of a vaccine — set to a laugh track. Next to the video, a screen detailed TikTok’s misinformation policies. (The video wasn’t violating them since it was considered humor and not actual health misinformation.)
The exercise gave me a better understanding of the tough calls that the more than 10,000 people TikTok has working on trust and safety worldwide have to make every day. But I wanted to know more about how TikTok writes its guidelines and designs its algorithm: Who decides what content gets seen by more people on TikTok, and how does the app decide when to boost or demote certain content?
TikTok staffers told me the app manually promotes only 0.002 percent of the videos on its platform, and that those decisions are made by its content programming team, which identifies videos with the potential to trend. One example they gave: the company manually boosted the Rolling Stones when the band first joined TikTok.
TikTok said it’s giving some outside experts access to more detailed under-the-hood specifics, including its entire source code and the exceptions it makes to manually promote certain trending content, in a top-secret room in Maryland (you have to sign an NDA to enter). The company also said Oracle employees have been reviewing TikTok’s code at a dedicated transparency center in Maryland.
While TikTok’s transparency center does give a little more insight into how the company and its app operate, there’s a lot we still don’t know about exactly how content, data, and moderation decisions are made inside the company.
On the other hand, TikTok is taking some novel approaches to shed light on its data practices and algorithms. Under TikTok’s USDS plan, a group of Oracle employees and security experts is supposed to monitor the proprietary algorithms that dictate what millions of people see every day when they log in to the app. We don’t have that level of outside accountability for Facebook or YouTube. Companies like Meta and Google also track massive amounts of our personal information online but don’t attract the same national security concerns as TikTok because they’re American companies. Even if TikTok is now sharing information out of political necessity, it’s a net positive for society that it’s sharing any information at all.
It remains to be seen whether TikTok will manage to change minds on Capitol Hill. These latest initiatives are a first step, but persuading TikTok’s strongest skeptics will take a lot more, along with validation from outside partners and experts.