I’m a journalist and an urban planner, both service professions with an ethical obligation to the public. The work of planners and journalists is deeply embedded within power structures that perpetuate racial injustice and social inequality—and unfortunately, neither profession has engaged in the deep self-reflection required to change all that.
My interest in the confluence of journalism and urban planning brought me to the work of Chris Gilliard, who, in his critical engagement with the media, expands on redlining, the discriminatory practice of excluding people in certain, often racially defined, neighborhoods from access to goods or services, whether outright or through selectively raised prices. Gilliard’s work identifies what he and coauthor Hugh Culik refer to as “digital redlining”: algorithmic practices of online discrimination against Black communities.
I also talked to Marcus Gilroy-Ware, whose work looks at how journalism responds to the challenge of fake news and the competition for both attention and action in an unregulated online media environment. How does the truth circulate? How is it even determined? I’m fascinated by Gilroy-Ware’s examination of the political economy of information, how social media posts and news stories organize their audiences into different groups, be they communities, consumers, or trolls.
Chris, Marcus, and I talked about journalism versus content production and how data is employed to monetize our attention. We discussed the social inequities that are built into the foundations of digital media and journalism, and how journalists might envision a new online architecture to house public information. —Nehal El-Hadi
Nehal El-Hadi: Marcus, if I could start with you. You wrote After the Fact? The Truth About Fake News, a book about the circulation of fake news within this environment. Could you talk about that a little bit, please?
Marcus Gilroy-Ware: I tried to take a slightly provocative approach to the idea of fake news. The book is motivated by some of the problems I found in the ways that misinformation and disinformation are talked about. The way in which we talk about this is rather selective—and blind to the overall ways in which we can be misinformed by cultural and ideological processes, as well as by much narrower and more specific forms of factual inaccuracy, whether inadvertent or deliberate. Part of that is considering the relationship between journalism and neoliberal tables of value in government policy, a question that, for me, has not been engaged with enough. So as part of that overall inquiry, one of the things I wanted to do was look at the journalism industry, an industry I’ve been involved in and worked in. I feel a certain disappointment, because there has always been this sense of incredible potential, and I really believe in the value of what great journalists do. And yet, as part of our overall cultural complacency around this slide into neoliberalism that has occurred in the last thirty or forty years, a lot of the things that we say we want journalism to do haven’t occurred.
At a certain point we started to use a word that troubled me, which was content. This is content, or creating content. This emphasizes the wrong part of what we’re doing. Journalism isn’t just content—it may be content for somebody, but that’s somebody who only cares about very specific aspects of its presence and its exchange value. Neoliberalism has reduced journalism to content to the point that it now has to compete with all the other forms of content out there on the internet.
El-Hadi: Chris, can you talk about content and information in data and the ways that they are being used?
Chris Gilliard: What Marcus said about content leads me to think about the ways in which internet platforms—or internet companies or tech companies, however you want to think about them—have helped to structure and push a narrative about what content is or what it should be. One of the immediate examples that comes to mind is the infamous “pivot to video” claim: that, eventually, everything was going to become video. Journalism poured a lot of resources into following Facebook down that path, and the pivot never materialized as Facebook predicted.
El-Hadi: At the same time, I also want to bring up the events of the past year. The anti-police-brutality protests, this push towards inclusion and diversity and representation—the internet has worked hand in hand with this. That can’t be taken out of this paradigm shift within journalism.
Gilroy-Ware: Do you mean in relation to social media companies and their amplification effects?
El-Hadi: That and also the ways in which journalism is facing a “reckoning” again.
Gilroy-Ware: The first thing to say is, as much as I’m among the most vehement critics of Facebook and Twitter and their peers, I think it’s important to remember that it’s not as simple as saying that everything they do is bad. If we look at the enormous increase in interest and in traffic with Black Lives Matter as a long-term movement spiking in the summer, we can see that there are effects and possibilities enabled by these platforms that would have been probably more difficult without them.
We have to be able to find solutions that don’t ask us to give up the ability to talk to one another. I’m interested in imagining what that looks like. If you look at Black Lives Matter in the summer, the technical possibilities for how that could have been dealt with journalistically were all there. But editorially, time and time and time again, the stories that were written, the headlines that were written—the ways that the destruction of Black lives at the hands of police and the outpouring of anger in relation to that was handled in newsrooms—were extremely problematic. It’s not that it’s necessarily hostile; I’m reminded of Dr. King’s white moderate [from the “Letter from Birmingham Jail”]: “Shallow understanding from people of good will is more frustrating than absolute misunderstanding from people of ill will.”
El-Hadi: I’m thinking about who controls which data, which information, and then this idea that once it’s out there, it is uncontrollable. Chris, you’ve talked about the ways in which information is employed in digital redlining.
Gilliard: A lot of the work that I do is about the ways in which Facebook and Twitter are very much invested in promoting themselves as neutral companies, or as just the middleman, or as a mirror that reflects society. They use a lot of different metaphors. But one of the things that is undoubtedly true—but that they try to get away from—is that they exist to promote and amplify. They know that certain kinds of information, certain kinds of data, certain kinds of activities drive people to consume more. For instance, Facebook’s own internal research found that something like 64 percent of the people who joined extremist groups on Facebook did so because Facebook’s recommendation tools suggested those groups to them. Facebook promoted this activity to them.
Gilroy-Ware: I very much agree with the analysis that Chris has just offered, and I think it’s important to say that however much we might acknowledge that some of the things happening on social platforms are useful or good, that is almost the unintended consequence of us being able to find little areas—hiding places, if you like—within these platforms where we can practice mutual aid and other things like that.
We always talk about how it’s the data, the data, the data, that we want the data. But actually the commodification is even more cynical than that. It’s our attention that they want. The data is used simply to refine and better target our attention. You cannot be neutral and have that be your business model, because human beings are least interested in that which is neutral. People are much more interested in the things that either make them happy or distract them from the problems in their life, or the things that make them angry. And really, Facebook and Twitter don’t care that much one way or the other. What they want is your attention, whether it’s things you are angry about or things that you are happy about. These companies study our relationship to both quite carefully. That’s well documented in the emotional-contagion study Facebook published in 2014 and in a number of other experiments they’ve done with timeline algorithms and so forth. That’s what the algorithm of the timeline and the whole practice of continually scrolling and liking is intended to garner. It’s really important that we unpick and criticize that.
El-Hadi: What does the conflation of content and journalism mean in this context?
Gilroy-Ware: In theory, journalism is part of the antidote to ignorance. Supposedly, at least in the ideal sense, the more we are learning about what’s happening in our world and what experts might be saying about it, the more empowered we are to survive in that world and able to challenge the conditions of our oppression and able to make things better for others as well. When you start to conflate journalism and content, and this attention-driven model starts to drive us towards “Twenty-One Zany Pictures of Goats on Cliffs” or whatever, then civic engagement and literacy and the ability to challenge your own oppression starts to be eroded. If that happens for one year, it’s a problem. If it happens for twenty years, it’s a huge problem.
Meanwhile, there’s a hundred other things that you can be looking at that don’t give you civic engagement, but that can keep your boredom at bay and keep you from remembering how alienated or oppressed or belittled you otherwise feel.
Like many people, I’ve joined TikTok in the past year to have a look at what is happening there. It’s interesting because in the beginning, maybe ten months ago, it was all dance routines—nothing very exciting. And now there are a number of people who are starting to conflate content and journalism—or at least content and public knowledge—but in a much more sensible way: people with advanced training in some field or another who are trying to explain to us, for instance, the historical definition of fascism, why Trump was a fascist in his presidency, and other quite useful things.
There might be a conflation of journalism and content in a bad way in the sense that the two things are occupying the same timeline, and we’ll always go for the junk food rather than the healthy food. But we can also mix those things back together in a way that will lead to a more informed and empowered population. I do think there’s some possibility there.
El-Hadi: That’s a very techno-optimistic approach.
Gilroy-Ware: Interesting, because I consider myself a real techno-pessimist, generally speaking.
El-Hadi: Which is what I’ve gotten after reading your books, but you also have this hope.
Gilroy-Ware: I think it’s what Gramsci said: it’s the “pessimism of the intellect, optimism of the will.” If we don’t have some answers, some possibility in the future, then we really should just shut the whole internet down. There’s got to be some way that we can at least start chipping away at the structures that are producing these problems.
El-Hadi: What I appreciate about your work, Marcus and Chris, is that you complicate our relationships to these products in ways that don’t say “This is good, and this is bad.” But what you also do is point out where the harm is. The reality is that what we’re talking about—in terms of content, attention, power, concentration, information—has very real, tangible, material, harmful effects.
Gilliard: I think that much of what these companies do should be illegal. There are no significant federal privacy laws preventing these companies from doing many of the things that lead to harms. What I’m referring to is the constant extraction of people’s data. In ways that I think most people don’t fully understand, we’re followed around and our data is taken from the moment we wake up. Actually, even when we are asleep—from the moment we go to bed.
One of the ways that this produces tremendous harms is that companies are able to use that targeting to deny people certain opportunities. I often refer to this as digital redlining. There have been numerous documented cases; a noted ProPublica story showed how Facebook allowed advertisers to place an ad for an apartment or a job and choose which groups would see it and which would not. So you could conceivably post an ad for housing and say, “I don’t want any Black folks to see this.” Part of the specific harm with that, too, is that the average person who uses Facebook has no means of knowing that this is happening to them. Not only are they being harmed, but they’re being harmed in insidious and invisible ways. The other side of the coin of being able to target and recruit people and speak to their insecurities or their hatreds is being able to deny people opportunities.
Since last March, a lot of people have seen how essential it is to have internet access. I’ve long argued that internet access might be considered a public utility as important as water and electricity, and people often scoffed at that. But the last year has shown us that there are lots of ways in which the kind of education you get, the kind of job you can have, the kind of medical treatment you can get are all tied to your internet access. What kind of information comes to you and what kind of information you have to pay for and what kind of information is free—all of these things are so very closely tied to what kind of life you can live. There are a lot of harms involved in the extent to which Facebook, Twitter, Google, Instagram, and TikTok have data practices that look at people as resources in the most extractive way.
Gilroy-Ware: Chris made a point about the extractive relationship to human beings that social media companies have. Again, I think this is a hugely important point. There’s something about the way in which neoliberalism narrows down the question of value and what is valuable and what is not, the things that can be extracted from a person—I think Wendy Brown would call it a table of value. Attention, or something else that can be traded as a commodity, becomes extremely important to Facebook and Twitter, but other things about people—for instance, negative effects on mental health or on the public spheres in which they participate or on their education—are treated as collateral or externalities or just things that are not factored in. Facebook in particular, over the years, has embraced a business model that is centered on that choice of things to value—or not.
I’ve used the work of the historian Peter Linebaugh to try to think about this as a form of enclosure. Look at the way that the countryside and the commons were enclosed in prior centuries and in the twentieth century. Social media companies have slowly built a wall around what happens in journalism, around political discourse and debate. They built a wall around our interactions with our family. They’ve put themselves in the middle of all of those relationships and all of those spheres of conduct. There’s just something incredibly harmful about the extractive relationship; it’s a parasitic relationship that they’ve developed. The harms are not for them, but for us, not least in what happens to journalism. As long as we fail to address that, the same cycle will keep going around and around again: the harms will always be ours, and we will rightly raise an objection, and Facebook and Mark Zuckerberg will always be able to say, Well, we’re looking at this, and we’re sorry. But because the underlying problem in the business model never changes, nothing is ever really settled, and we just come back to a point where we’re outraged again at their invasions of privacy or their overall contempt for their own users.
When I set out to write my first book, Filling the Void, my goal was that if everyone reads this, Facebook is done tomorrow. But unfortunately, that’s not the way books work and that’s not the way discourse works. But I do hope for all users of social media platforms to have more literacy about what these companies really represent.
El-Hadi: I think that part of dealing with it is also naming it and being very specific about who the human beings who are being harmed are, and who are the ones that profit. Chris, when you talk about digital redlining, you are talking about marginalized Black communities in the United States. People don’t get harmed equally.
Gilliard: Absolutely. One of the maxims of privacy and surveillance work is that many of the harms, although they may eventually come down on everyone, are deployed against marginalized people in the earliest and in the most oppressive ways.
These companies have a massive investment in hiding what they do. I think about what Sarah Roberts calls commercial content moderation. There’s so much hate that gets through. There are people who sit in rooms for hours at a time and sort through terrible, terrible things, images and video, to make Facebook and YouTube somewhat palatable, to remove some of the hate.
Journalists have also done this work to some extent—to be the custodians of platforms and say, Hey, we found this on here when you didn’t or when your algorithm didn’t. You said that you got rid of “Stop the Steal” groups. You said you got rid of militias. And we still found them on your platforms. It’s very hard for the average person to know how they’re being harmed, because so much investment is made by these companies in occluding that fact.
El-Hadi: For the last portion of this conversation, I want to shift to a more, dare I say, optimistic approach to all of this. I want to go back to two points that each of you made. Chris, first, you had mentioned shifts you would like to see in the ways in which information circulates online. What are those shifts?
Gilliard: One of the things I’d really love to see at the intersection of technology and journalism is greater discussion of the privacy and surveillance harms of technologies. To give you an example, I’m a vocal opponent of the Ring doorbell and its partnerships with police departments. I’d really love to see journalists bring a more cynical and critical eye to reviewing technology products, and to talk about the implications of having a web of surveillance that essentially turns everybody into cops and snitches, or that functions as a gentrifying agent by letting people report Black and brown people doing mundane activities. Overall, I’d like to see a move away from the glowing review of all things tech, and away from accepting the narrative proposed by tech companies as the essential or main narrative.
El-Hadi: Marcus, you mentioned earlier how digital journalism could have evolved differently. What ideas or thoughts have you had around the ways in which digital journalism could move away from being content or product and back to what journalism is supposed to be about?
Gilroy-Ware: I’ve had the experience of teaching journalism in two different institutions. In the first one, journalism was in with the professions. It was in with nursing and accounting, which is not a problem, but I felt that in that setting, one of the things that we lacked was an exposure to creativity. At the institution where I now work, we are on the arts campus, in with the screen-printing students and the fashion students and the painting students. I feel that’s on some level very important, because one of the things that’s often been overlooked in the technological script is that making digital media well should be like making art well. I readily accept that it’s more expensive to make it this way. But given that, generally, as a sector, journalism is massively overproducing—because there’s lots of throwaway news articles that are all the same on every website—actually we would do better to use our resources to produce a more interactive and more creative digital journalism. One that thinks about that tension around our attention and thinks about the responsibilities that are incumbent on journalists to help people understand the world better, and that tries to combine that with interactivity, creativity, and even some considerations like color and layout. There’s so much bad design still in journalism. I can’t help but feel if you want to make a product that people are willing to pay for and engage with and learn from, then you have to make it as well-designed and beautiful as possible. Technologies can be developed to make that faster and easier for journalists than it currently is. It’s quite expensive to do, but it can still be done.
It was so interesting to me that when the “Snow Fall” feature was published in the New York Times some years ago, it was a huge thing. Everyone in journalism classrooms was looking at “Snow Fall,” even if it was basically a non-story as far as actual journalism goes. The way that it was put together and the animation and all of this stuff, everybody was blown away by it. That was years ago. Why hasn’t there been more of that? Why hasn’t BBC News started to do that on a regular basis? Why don’t they have a big desk of people employed just to do that, especially given the way that they’re subsidized? I think it comes down to priority, in the end. We’re in a capitalist world, and everything is going to be driven by an economy of money, first and foremost, including journalism, sadly. All too often this isn’t realized. We need to remember that our value lies in a different quadrant than simply profit and loss. Creativity and engagement and making something that’s appealing to people is equally important—in fact, more important.
El-Hadi: I’ve been thinking a lot about my own time in journalism school as I’ve been teaching critical engagements and responses to journalism and its future. The two things that end up coming up over and over again in these classroom discussions are notions of objectivity when it relates to data and race. You know, how do we establish or determine objectivity, and what does it even mean? Which is, I think, the discussion that journalism schools have been having since the dawn of time. But the other is about the ways in which journalism education needs to shift and move, not just in terms of what the students are taught, but including who the students are, who the faculty is—what kind of landscape or marketplace are they moving into? How do they not only survive but also live up to the ethics of a profession that, they are taught from the very beginning, is in service to the public? How do you teach that to journalists? The journalism industry ideally would make these changes by itself. But more realistically, it’s going to end up being individual journalists who push for that shift.
Gilliard: There’s been some tremendous work in just the last few years. So I don’t want to be all gloom and doom. I’ve also been really impressed, really spurred on, by the willingness of journalists to talk not only to people who are perceived experts in the field, but also to the communities that these decisions affect far more directly. For example, if you want to talk about how powerful people have been affected by being banned on Facebook, I think it’s important to also talk to people who are far less powerful, who have been banned on platforms with often little or no recourse. There’s a lot of talk about “Facebook jail” and what that means if you run a small business or if you are a community activist. I’ve seen more willingness on the part of journalists to have that discussion with people who typically wouldn’t be seen as experts but who are very much affected by these things in ways that bring important perspective.
El-Hadi: I’d like to thank both of you. I think with and through both of your work quite a bit and use it with my students. And so I really appreciate this opportunity to put both of you in conversation with each other.
Gilroy-Ware: It’s been a pleasure speaking with you.