“Nothing quite like casually reviewing horrific anti-Black hate speech to jolt you awake before 9:30 a.m. on a Monday,” I tweeted from shaky fingers.
Back then, in 2021, Twitter was still Twitter, I was still on Twitter, and I had spent years working inside Trust & Safety (T&S) departments at Twitter and Twitch as a senior policy expert in free expression.
That role put me in a position of immense power because inside technology companies, T&S departments have quietly acted as the governing bodies for our daily digital interactions. These departments are charged with keeping internet platforms and services as usable and safe as possible. Depending on the company, this usually means developing and enforcing rules or community guidelines (content moderation); engineering certain new products and trying to convince other departments not to engineer other products; conducting research; monitoring user accounts for coordinated, manipulated, or inauthentic behaviors; and dealing with various takedown requests.
In recent years, journalists and academics have sought to bring more transparency to the often opaque field of internet governance. While the resulting literature has offered useful insights, insufficient attention has been paid to the unique experiences of Black people working in T&S. It is in this context that I dedicated my time as a CCSRE Technology & Racial Equity Practitioner Fellow at Stanford University to conducting research to start addressing this gap.
My fellowship and research for this article ended up being anything but ordinary. I knew that I would be asking individuals to take enormous personal risks simply by speaking with me about their everyday experiences. I also knew that I would be asking them about some of the most traumatic experiences of their working lives.
These concerns were not taken lightly. Drawing on my own experience of fielding inappropriate and/or inconsiderate interview requests from academics and journalists, I was conscious that I would need to take a different approach. The Black T&S workers I wanted to speak with expressed concerns about remaining safe and protecting their identities. I assured them I would treat their stories with the utmost level of respect and confidentiality.
I understood my sources’ concerns about keeping their secrets more than they knew. During the time I was conducting interviews, I had a big secret of my own. I had been subpoenaed by Congress and was blowing the whistle on Twitter to the United States House Select Committee on the January 6 attack.
After my testimony was first used pseudonymously in a Congressional hearing, and then I publicly revealed my identity, I was thrust into a new spotlight. With this came attention from the right-wing corners of the internet. Soon my name, picture, and old internal Slack messages from my work at Twitter were misconstrued and published to the world as part of what became known as the Twitter Files.
The Twitter Files created an unimaginable new vector of abuse for people who had already seen the worst sides of the internet and were keenly aware of online harassment. While T&S workers had feared targeting by an embittered colleague or an external hack, no one had considered that their employers would go to those lengths. Yet because of the Twitter Files, T&S workers faced death threats, rape threats, and doxxing that required them to flee their homes.
This all also happened at a time of industry-wide layoffs and market uncertainty. Elon Musk paved the way for T&S roles to be eliminated en masse by firing 80 percent of the staff at the company formerly known as Twitter. Other tech companies like Meta, Amazon, and Alphabet followed suit and decimated their T&S teams.
Thus, as I did this research, the risk calculation changed for T&S workers across the industry. Along with this, people I had previously interviewed changed their minds about being included in this story.
This article is based on interviews with six individuals who identify as Black Americans and who worked within T&S departments of tech companies. Their roles varied from building training curriculum to leading policy teams. They collectively worked at 12 technology companies ranging from legacy organizations like Google, Twitter, and Meta to financial and education technology start-ups. I’ve given them the pseudonyms Avery, Carter, Edison, Frankie, Kai, and Parker, and used gender-neutral pronouns to refer to them all.
What I found from my interviews is evidence that people are experiencing a wild combination of circumstances by simply existing as a Black person within a T&S department. They feel what it’s like to be othered within a supposedly progressive environment. They encounter aggressions and slurs at work. They feel the weight of specific labor necessary to fight for the digital rights of individuals around the globe who look like them. And all of this takes a heavy personal toll.
This has serious implications for humanity. While it may seem like ghost work, the work from these seemingly invisible hands forms the bedrock of how our internet was built and is regulated. These individuals’ brains have created our modern society and developed the architecture for our digital human rights. As one of the T&S workers I spoke with told me, “By the end of my career, I think I will have had a profound effect on the way that the internet works, especially social media platforms.”
However, at some point in our conversations, almost every interviewee would become cautious. Speaking in hushed tones, they would describe a harrowing work experience on a T&S team that they had never previously shared. After this happened a couple of times, I decided to ask why they were sharing these secrets with me.
“No one has ever asked,” I was told.
The six stories I highlight within this article are not meant to be a representative sample to generalize the experiences of all Black T&S workers or all T&S departments. Rather, these interviews serve as a first inquiry and foundational exploration into an area ripe for future in-depth research. Each theme and finding drawn from this article deserves to be delved into with dedicated research using quantitative and qualitative methods. These areas of further study include discrimination in pay and leveling in T&S departments, a lack of diversity within senior T&S leadership, inadequate warnings about the content workers will encounter, a very specific type of labor extracted from marginalized employees that I’m calling Compelled Identity Labor, and insufficient institutional support for the lasting impact of their work.
Recording the experiences highlighted within this article and from additional future research is essential. We are in a moment of critical reflection about technology, and with external regulation for internet governance on the horizon, it’s vital that we amplify these untold stories. As we envision new ways to create safety in our future digital lives, it’s important to listen to and honor these human experiences. By drawing upon the experiences shared with me, I also provide four recommendations for how to begin to create more equitable regulatory environments and develop a healthier digital experience for all.
Diversity, Inequity, and Exclusion
Whistleblowers Ifeoma Ozoma at Pinterest and Charlotte Newman at Amazon gave the public the first real view into the modern environment for Black employees inside technology companies. These Black women, who both worked on policy teams (which are typically housed inside of T&S departments), reported in 2021 that their companies and departments were rife with discrimination in pay and leveling based on race.
Ozoma and Newman’s experiences resonated with the Black T&S workers I interviewed for this article. Most said the first obstacles in their careers presented themselves during the recruiting and hiring process.
Kai bluntly stated that on their first job in T&S, “it was very clear there was a huge disparity in my level of education and what I was being paid.”
“It feels pretty systemic that [Black] people coming in are downleveled, compared to their experience and their education,” Parker said.
In addition to stories about Black employees being overqualified and underleveled, interviewees said they were often unable to negotiate starting salaries like their non-Black colleagues and were often paid less.
Avery said that when they were hired, they were offered a certain salary that seemed low to them. “I counteroffered,” they said. “That final number was the first number somebody else [who wasn’t Black] got. They didn’t have to negotiate for the same amount.”
Parker added, “I thought I’d done a good job negotiating, but more people came on after and ended up with way more money.”
Every interviewee said that the companies for which they’ve worked, like most in the technology industry, presented an outwardly progressive stance toward diversity, equity, and inclusion (DEI). Notably, though, all said that their onboarding experiences portrayed a very different reality.
Avery compared tech companies, often heralded for their innovation, to “traditional work environments” like law firms, noting that while tech industry jobs are marketed to be different, “there are still things to watch out for.”
Kai said tech companies “do a good job of marketing the legal floor as a philosophical or moral stance of the company: ‘We do not believe in discrimination.’ Well, that’s also illegal.” They continued, “However, they made this seem like it’s a moral sort of imperative.”
Avery also felt tech companies’ PR around DEI rang hollow. “The sentiment is that it’s not a serious attempt at creating a more equitable environment,” they said. “Or that it’s just something to appease people, or something to write down on paper.”
Frankie outlined a detail about where diversity occurred that was a repeated theme across my interviews. Companies “say they are pro-diversity. But companies show who they care about. Who is at higher levels?” Frankie asked. “Who are the vice presidents [of T&S]?”
Carter added, “There is a problem with T&S as far as Black leadership goes.”
Parker summed it up this way: “I lost hope of tech companies being this environment where there’s sincere intentions about diversity.”
Exposure Warning and Macro Aggressions
The unique function of self-regulating the content posted to technology platforms and services falls to the employees of T&S departments. For my interviewees, exposure to hate speech was omnipresent in these jobs, regardless of whether they were directly responsible for moderating hate speech content. This included graphic examples of anti-Black hate speech, gore, and violence.
I asked the interviewees whether they felt properly prepared for — or at least warned about — the type of content to which they would routinely be exposed. All said that while there were warnings that they would likely encounter sensitive content, most were not made aware of the extremity of the material or the frequency with which it would be seen.
Avery said that prior to taking their job in T&S, they were not told about the types of imagery they would be exposed to. “There was no specific, ‘We’re going to show you examples,’” they said.
“I wasn’t aware,” Kai said. “I know I was told we would see some pretty ‘gnarly’ stuff, but it wasn’t particularly explicit on what ‘gnarly’ would be.”
“It’s traumatizing,” Avery said. “You are going to see charred human bodies.”
Kai said that the warnings they received were inadequate. “Which is not to say I wouldn’t have taken the job,” they added. “I just should have been allowed to consent to a larger degree than I was actually allowed to.”
Once Black employees were hired and working inside T&S departments, their daily experience and environment could be challenging. As well as being exposed to hate speech on the platforms they worked for, they described casual usage of anti-Black slurs within their own workplaces, including being on the receiving end of such slurs from colleagues.
“Sometimes the micro feels macro,” Frankie said. “T&S talks about sensitive things bluntly. But people still need a certain level of decorum.”
“There has been a large usage of slurs unnecessarily because of the workplace I am in,” Kai reported. “People think they need to use the actual slur instead of some censored version of the slur.”
Frankie added, “I’ve asked non-Black colleagues to not say the N-word and not tag Black colleagues when using it.”
Kai said that “the N-word with the hard ‘r’ is often used” in T&S: “You do feel it in your body, like physically, when someone is using that word, even if someone is repeating it because it’s words on a page,” they said.
The experiences are often worse. Frankie described an incident when they joined a new company and a colleague posted a photo of the new team on the service they worked for.
Frankie said, “One of the comments directed at me said, ‘Who’s that N-word in the pic?’”
When Frankie brought it up with the colleague who posted the image, their colleague replied, “It could be worse.”
Compelled Identity Labor
In her seminal book, Behind the Screen: Content Moderation in the Shadows of Social Media, Sarah T. Roberts describes a conversation she had with a commercial content moderator contracted by a third party. The moderator discussed how the technology company’s content moderation rules seemed to vary across geographical regions, and surmised that “one person on the [internal T&S] team is just passionate about the issues in the Middle East.”
This is not a subject I had previously seen discussed. When I read it, I immediately thought of the numerous Palestinian people I worked with on T&S teams who had been thrust into determining exceedingly complex and personal geopolitical situations.
The experience is something that also resonated with my interviewees. This is a unique type of work that I am calling Compelled Identity Labor. It refers to situations where T&S departments default to treating employees with marginalized identities as the voices and/or representatives of entire online communities who share their identities. This can force Black T&S workers into a situation where they are tasked with being the sole voice advocating on behalf of the human rights of all those people who happen to look like them.
This forced labor creates a sort of cognitive dissonance where people have to divorce themselves from their identity in order to meaningfully advocate on behalf of themselves. Kai invoked the philosophies of Frantz Fanon to describe the experience. “There’s very different masks you have to put on,” they said.
“It gets very different when you’re the face of the conversation piece and you also have to intellectualize it,” Kai continued. “I had to do a lot more emotional labor to get through those moments.”
None of my interviewees consider themselves experts on race. All, however, described instances where non-Black colleagues had treated them as experts on anti-Black hate speech or slurs. Multiple interviewees described how higher-ups within their T&S departments had specifically asked for their input on whether the company should decide that the N-word ending with an “a” should be treated the same as the word ending in “er.”
“Having these conversations makes you look at people funny,” Frankie said. “I don’t want to always have to be that person.”
My interviewees said that this Compelled Identity Labor often multiplied during times of high-profile civil unrest. Many noted that George Floyd’s death in 2020 had brought a particularly challenging spike in the volume of this type of labor within T&S teams, on top of their personal processing of the events.
For Frankie, those months of social upheaval felt exhausting and relentless both professionally and personally. After working for hours during the day moderating the anti-Black hate speech content on the service they worked for, they would close their laptop at night and then go out to join a protest. They said they felt like they couldn’t take a break from being surrounded by hatred.
Kai also recalled challenging experiences in their T&S department following Floyd’s death. They said their department struggled to establish a policy for handling the influx of hateful content posted to the platform. In addition, the company was deciding how to publicly position itself in relation to the growing conversations about racial justice in the wake of the #BlackLivesMatter movement. Kai was asked to help.
Kai said the “well-meaning white people I worked with” had a “reliance on me to cosign or help them get to a decision, or help them through the decision-making process so that we as a platform could navigate the space, and they could navigate the field.” Kai added that their T&S leadership “didn’t want to make a decision that felt like they would offend me.”
Kai also described a conversation that happened around the same time when they were placed in the position of making an argument for the Confederate flag to be designated a hate symbol. A white colleague made a counterargument that the flag was reclaimed language and not hateful.
“That is the kind of stuff where you’re like, ‘This is a joke, right?’” Kai said. “But they’re not kidding, they’re very serious, because they fundamentally believe in this idea of ‘freedom of expression’ — and what they don’t see is the harm that some expression can cause people actual physical harm.”
While all the interviewees spoke of the responsibility they felt in carrying their entire community on their backs through their daily work, they also said that despite their discomfort in these situations, they felt it was important for them to be in these rooms.
“When the culture becomes a part of the conversation, it’s important to have the right people in those conversations,” Edison said.
The experience of working in these environments, performing this Compelled Identity Labor, and being exposed to hate speech content was profound for the people I spoke with. Given what these humans described as seeing, doing, and experiencing every day, I wanted to know what kind of impact they believed the work of T&S was having on them.
What I heard from my interviewees aligned with reports from nearly a decade ago about secondary and vicarious trauma experienced by moderators and journalists whose work requires them to view graphic content on the internet. It also mirrored the “psychological strain” described by whistleblower Joël Carter, who came forward in September 2023 with claims of discrimination and retaliation occurring in the T&S department of TikTok’s parent company ByteDance.
Kai expressed that the trauma and strain is almost inescapable if your job requires viewing hateful or graphic content online. It’s “not quite a question” of whether you feel it, they said, but rather the extent to which you feel it.
“You learn very early on that you don’t need to be married to the argument in an emotional way, but what you do need is to make the argument,” Kai continued. “In the moment, it feels fine. But afterward, it is awful. It’s like I can’t believe I had to do all of that because people don’t believe that my humanity or identity should be disallowing assembly.”
Avery said, “Balance is doing what I’m supposed to, in terms of enforcing accurately the community standards and finishing that work. But after, I have to take time to process what I’ve seen.”
“We’re professionals, but we’re also individuals. There are moments where I don’t want to take part in this at all,” Parker said. “That has been really hard for me.”
Kai said it was difficult when colleagues “did not think about me or people who look like me to want to save them from potential violence.” They added, “That’s a lot to carry with you.”
Parker said that the way the T&S working environment affected people was “a very individual thing.” They explained, “You have to hold a lot of things, and it’s just tiring, honestly.”
Frankie described the physical and psychological impacts the job has had on them as “jarring.”
“Sometimes I’m not able to sleep so well. I wake up seeing images, and I react to loud noises,” they said. “I have less patience, I cry a lot more, and I have gotten a darker sense of humor.”
And, Frankie added, they lost 30 pounds due to the stress of their work.
Carter said that over time, the work has chipped away at their confidence.
For Avery, it has made them more wary of the world and caused them to experience hopelessness. They said, “I just have to walk around knowing … people who are plotting to harm folks on the basis of skin color … are really common enough for me to be worried that they are closer to me in a social setting than I would like them to be.” They added, “They’re your middlemen; they’re your police officers; they’re your people next door.”
The impact of T&S work has been so severe on the people I spoke with that they feared it had caused permanent psychological damage. It has also made them reevaluate their careers.
“My work has a more profound impact on me,” Kai said. “I’ll never be the same, and that is disturbing to me — even saying that is highly disturbing to me. I will never be the same, and I can’t unsee the things that I have seen.”
They continued, “I have had to get out of the direct adjudication of content that involves large amounts of hate speech. I can no longer make those arguments; I can’t divorce myself from making those arguments. So I’ve had to change my career trajectory.”
Glimpses of Hope
After listening to each individual’s stories and hearing the impact they have experienced, I ended all of my interviews by asking what advice they would give their former selves or any other Black person considering working in T&S.
The responses shocked me. Despite the challenging circumstances they had all outlined, none of my interviewees said they wouldn’t take the job again; nor would they discourage Black folks from taking T&S jobs. Instead, they offered measured and cautious advice.
Given the nature of the labor, Parker stressed the importance of creating a separation between personal life and work life. They also noted the importance of not attempting to do this work alone. “It took me a long time to understand that I needed support communities,” Parker said. “It’s super important to have that group of people who create a safe space for you. Learn how to do that.”
Kai echoed the importance of support and community. They said it was necessary to have these systems firmly in place while doing T&S work: “Not feeling great and having to do this job on top of that is not the move.”
My interviewees also cautioned that while the job may feel like the weight of the world at times, it’s important to remember that they have options.
Carter encouraged Black people interested in the tech industry to take the leap: “Figure out what about the work drives you, and then just go for it.” They added, “But don’t stay around too long.”
Edison agreed. “Double down on what you can bring to the table and nurture whatever that is. And if it doesn’t go right, move on,” they said. “These are all just jobs, in the end.”
Kai said the most important thing for Black people in T&S roles is to manage their expectations, and “have an understanding of what you can and cannot do.”
Then Kai shared a piece of advice they said had taken them a while to realize. It was something I wish someone had told me a decade ago when I started on my career path.
“You are not going to fix racism by doing policy. So that will give you a leg up.”
Findings and Recommendations
The experiences of the six individuals I spoke with paint a picture of internet governance that is not only fragile but places a heavy human toll on Black T&S employees. While these stories are not a representative sample meant to generalize the full experiences of Black T&S employees, the findings from these interviews serve as a first inquiry into numerous areas that are ripe for future research.
Findings from these interviews with Black T&S employees illuminate a uniquely challenging — and at times hostile — working environment. Among the many things contributing to this situation are discrimination in pay and leveling that begin with recruitment and hiring; a lack of diversity in senior leadership roles within T&S departments; an environment reverberating with racial slurs; inadequate preparation for exposure to traumatizing content — sometimes even in roles where it is not a job requirement — and insufficient support for coping with exposure to such material; Compelled Identity Labor that requires employees from marginalized identities to speak on behalf of their identities; and deep psychological impacts from the entire experience of such work.
Without change and further research to find solutions, the tech industry risks failing to attract and retain the Black T&S workers it so urgently needs. In their absence, a gap will be felt in both the online and the real-world protection of marginalized communities. Based on my interview findings and personal experiences, I have four recommendations to tackle the litany of challenges to creating environments where Black T&S employees are not only treated equitably, but are also able to thrive.
First, T&S departments need to eliminate disparities in pay and leveling. As Kai pointed out, discrimination based on protected characteristics is illegal. Employees who see it happening should discuss it with their company’s internal employment lawyers, who are typically housed in employee relations departments.
To proactively ensure equity, T&S heads should also bring in external auditors to conduct assessments of leveling and pay across their departments. If there are discrepancies, leadership should immediately rectify any inconsistencies, realign current compensation, and offer affected individuals back pay and requisite stock options.
Second, T&S departments need to develop more robust warning and boundary systems for employees who will be exposed to harmful content. Before hiring, companies should provide more thorough waivers that detail the types of content employees are expected to assess. As Frankie told me, “Companies need to make sure they are specific about what ‘sensitive content’ means.”
Companies should also acknowledge that T&S employees are humans, and humans have limitations. This means providing and maintaining a list of “no-go” boundaries that employees can set for content they prefer not to review. T&S heads should also enforce workplace standards that discourage the uncensored use of racial slurs.
Third, T&S needs to hire experts who specifically focus on race, ethnicity, and national origin. These are complex concepts that vary across the globe, and companies must end their reliance on single employees or groups who hold marginalized identities to speak on behalf of their own identities. Compelled Identity Labor is not a healthy or sustainable solution.
Once hired, these new experts, in addition to their Black T&S colleagues, should not have to rely on their own wellness practices to manage the impacts of their work. Companies should provide expanded resources for T&S workers to support their mental resilience and psychological well-being.
Fourth, tech companies should continue to invest in developing and supporting diverse T&S teams and leadership. This includes the incentive of tying executive compensation to diversity goals. In a time in the United States when diversity, equity, and inclusion programs are under attack, this could be difficult.
However, given the unique function of T&S in providing safety for humans, Edison explained to me why it is vital that these teams reflect the diversity of humanity. “We need the awareness shown by those who lived it. It keeps us honest and open and realistic about what we’re doing, the impact that we have, and makes sure that we don’t harm anybody.” They added, “We can take the bias out of certain behaviors now.”
Despite the challenges they face at work, my interviewees understood this deeply. They remained adamant that their experiences — and this research — should not deter people from entering the field of T&S and doing the vital work of internet governance. Rather, they wanted Black folks thinking about joining the industry to know that they are essential to a functioning, healthy internet.
Frankie said, “Your thoughts, your impact, and your empathy are needed.”
“You have more to give than you think,” Edison reiterated. “It’s important to realize you’re there for a reason.”
Anika Collier Navaroli is currently a Senior Fellow at the Tow Center for Digital Journalism at Columbia University and a Public Voices Fellow on Technology in the Public Interest with The OpEd Project. She was previously a CCSRE Technology & Racial Equity Practitioner Fellow at Stanford University. Anika also held senior policy official positions within the T&S departments at Twitter and Twitch. In 2022, she blew the whistle about her warnings to Twitter that went unheeded in the lead-up to the January 6th attack on the Capitol, and about the platform’s ultimate decision to suspend former President Donald Trump.