cover story

Look at Me!

A writer’s search for journalism in the age of branding
May 18, 2010

When I was nineteen and chose to accept the creeping suspicion that I would turn out to be a writer and, by extension, chronically deficient of funds, I made the fiscally prudent decision to drop out of school. I still worked on the college newspaper to which I had sacrificed so much of my grade-point average, writing a weekly gossip column until a brother in the Zeta Beta Tau fraternity threatened to sue over an item I’d written about his alleged screening for his fraternity brothers of a video he’d filmed of himself having sex with his girlfriend. The threats spooked the editors of The Daily Pennsylvanian into suspending the column entirely. This did not bother me, as I thought I had more substantial work to do.

On the other side of town, a virulent heroin epidemic needed to be investigated. It was 1998 and Philadelphia still nurtured a robust-in-hindsight tabloid newspaper called the Philadelphia Daily News (motto: “The People Paper”), where I was an intern on the city desk. The summer had required the city desk’s near-daily attendance at some photo-op or announcement in a particularly lawless swath of eastern North Philadelphia known as the Badlands. It was the most syringe-blanketed, zombie-infested, bombed-out neighborhood in a town in which achieving a superlative in such categories really meant something, and the city’s new celebrity police commissioner, John Timoney, had thrown himself into an exotic—or quixotic—quest to finally Do Something About It, via a multi-agency siege he called Operation Sunrise.

For all of Timoney’s messianic zeal, his efforts instilled little faith in the loose confederation of addiction counselors and rehab providers I met in the Badlands. Their budgets had been gutted by some technicality of welfare reform, the heroin seemed to be getting purer and more noxious every week, and they could not handle the drastic influx of court dates and bail demands they faced as a result of Operation Sunrise’s indiscriminate sweeps. A distressing new book on the drug war called The Fix illuminated their struggle; although numerous studies had estimated that every dollar spent in the attempt to constrain the demand for drugs—especially if those efforts focused on drugs’ most conspicuous consumers—was worth ten spent trying to stamp out their supply, the supply-siders had won the debate again and again.

I wanted to alert “the people” of Philadelphia to the misconceptions clouding our heroin problem, so I called the author of The Fix. He humored me, and then casually asked if I was aware that John Timoney’s daughter, Christine, was a drug addict. 

This was tragic, of course, but also a fascinating story. Why was the police chief of an impoverished city with a famously overcrowded prison system and no shortage of rapists and murderers on the loose making it his first order of business to round up and jail a bunch of pathetic heroin addicts . . . when his own daughter was addicted to the stuff? Was he trying to track her down? Was it a macho thing? What was it like to fight the drug war on two such vastly different fronts? I scheduled an interview for the next week, telling his press officer I wanted to address concerns about the city’s “drug treatment infrastructure.”

But in the fluorescent glare of Timoney’s office, armed with my tape recorder, I felt like an asshole. The murder rate had already dropped drastically in his first few months on the job, and that year it would plunge below 300 after breaking 400 in every year of the previous decade. Who the hell was I? “I’ve known people who have gone into treatment,” he offered, shaking his head and giving me an opening to lamely and awkwardly mention his daughter. When I did, his expression hardened in a way that spooked me. “I don’t want to talk about my daughter,” he said. I left soon thereafter.


And that was it. My editors instructed me to drop the story, and I left the paper the next month in a routine round of Knight-Ridder budget cuts. I ended up in Hong Kong, where I’d lived as a kid and where, for the time being, there was some money.

Hong Kong and Philadelphia had little in common save for the fact that both cities were deeply conscious of having passed their prime and were vulnerable to cheap highs. In any case, the way Philly took to smack, Hong Kong was falling prey to a particularly deranged case of the global Internet-stock addiction. The fact that Hong Kong had no Internet companies—or software companies, hardware companies, engineers, etc.—was a technicality. Opportunistic moneymen easily sidestepped this obstacle thanks to the rich supply of toy companies and trading firms listed on the Hong Kong Stock Exchange. A wealthy investor would simply buy out the small business owner, “inject” some “Internet assets” into the company—some URLs, maybe a server or two—pay himself a fee, and watch the stock explode. The fault lines of irrational exuberance were running through most of the mature economies of the world; Hong Kong’s Internet craze just seemed a few orders of magnitude more parodic than anything Silicon Valley ever came up with. Small-time shopkeepers who still did their bookkeeping with abacuses lined up by the thousands outside brokerage houses for shares in the latest “Internet” offerings, even as most of the city’s residents had, as of 1999, never used the Internet (though many learned how when they realized it could be used to trade Internet stocks). A local newspaper, the Hong Kong Standard, rebranded itself the Hong Kong iMail.

I worked at the Asia headquarters of Time magazine, where I wrote a daily Internet column chronicling the lunacy of the Asian markets—a subject about which I had no expertise, but that clearly was not stopping anyone. I should state here that my own company, Time Warner, which owned some of the world’s biggest magazines and HBO and one of America’s most venerable movie studios, was during this time acquired by AOL, a dial-up chat-room business that the market had, in its wisdom, decided was worth $163 billion, in large part because it had so few clunky “old economy” assets weighing down its stratospheric prospects for growth. That transaction seemed the platonic ideal of the rational market at work next to the epically shameless charlatanism I had to write about.

Within a year I had developed enough of a “following” to warrant being offered a gig at double my salary with an Internet startup, and a two-page profile in the aforementioned iMail soon followed. My friend Stephen, a British film writer, was quoted marveling over the “minor celebrity-cult status” I had built up in such a short time.

I left Hong Kong shortly after the story ran, for no particular reason other than a psychic nausea over how easy it had been to achieve as much as I had wanted there. It took a few months, but by the spring of 2001, I found a reporting spot in the Los Angeles bureau of The Wall Street Journal.

By this point, I had begun to develop a theory, partly by virtue of having experienced that one meaningful failure and one meaningless success, about generally what was wrong with the world and increasingly with the industry—journalism—that was attempting to convey it. I just didn’t know yet what I knew, and so this story stretches on for another nine years.

What I sensed was that while the laws of supply and demand governed everything on earth, the easy money was in demand—manufacturing it, manipulating it, sending it forth to multiply, etc. As a rule of thumb (and with some notable exceptions), the profit margins you could achieve selling a good or service were directly correlated to the total idiocy and/or moral bankruptcy of the demand you drummed up for it.

This was easier to grasp if you were in the business of peddling heroin, Internet stocks, or celebrity gossip; journalists, on the other hand, were at a conspicuous disadvantage when it came to understanding their role in this equation. In the past, newspapers had made respectable margins selling a non-inane product largely because people had little choice but to herald their sublets and white sales alongside the journalists’ tales of human suffering/corporate corruption/government ineptitude. The times were prosperous enough that much of the print media even chose to abstain from taking a share of the demand-creation campaigns of liquor and tobacco brands in the seventies and eighties. Indeed, journalism, it went without saying, was about delivering important information about the world—information people (and democracy!) needed, whether they knew it or not. That journalism’s ability to deliver that information—to fill that need—ultimately depended, to an unsettling degree, on the ability to create artificial demand for a lot of stuff that people didn’t actually need—luxury condos, ergonomically correct airplane seats, the latest celebrity-endorsed scent—was an afterthought at best, at least in the newsroom.

Journalists, by and large, had so little appreciation for their dependence on the larger engine of artificial demand that they were mostly blindsided when the Internet happened and they lost the benefits of that engine. A lot of them seemed to take it personally. They got insecure. Some started writing “trend” stories and giving over their column inches to celebrity newswires and sincerely talking about bylines (and politicians and everything else) as “brands.” They sold Time Warner to an absurdly overinflated dot-com. It’s not fair, of course, to blame only the journalists; there were mostly avowed capitalists in the corner offices of these places, and it is the fiduciary responsibility of capitalists to be as cowardly and uncreative as possible in times of fear and change.

This existential angst tormented even the commerce-savvy staff of the Journal, where I was assigned to the “youth” beat—which is to say, and it very much went without saying, youthful consumption trends. I was too young to realize that this was one of the few subjects about which young reporters, particularly the female ones, were trusted to cover with any measure of authority—because really, who gives a shit? I embedded myself on the front lines of the brand wars as if posterity really cared whether a popular new celebrity-endorsed offering from Nike or Adidas or Mattel or Urban Outfitters had yielded a noticeable market-share loss or gain that quarter. (To be fair, some hedge funds cared about this.)

Despite the superficiality of this beat, the people who inhabited it—the brands that were in demand—had money, power, and an attendant sense of entitlement that could be intimidating. At twenty-three, I felt sufficiently ancient and uncool to be consistently alarmed when, say, a sixteen-year-old small forward from Akron wrote me an angry two-way pager message when I respectfully declined his invitation to party in his hotel suite following a high school basketball tournament, or I awoke from a minor sneaker brand’s after-party to find a nineteen-year-old San Bernardino skateboarder attempting nonconsensual sex with me, or even when young celebrity stylists seemed sincerely to want to be my friend. I did not really identify with the cool-hunting, brand-building, sneaker-collecting generation of professional consumers I worked over for trend-story ideas, but neither did my colleagues in the bureau seem to identify with the megalomaniacal talent agents and casino magnates or the disgruntled aerospace engineers and short sellers they talked to all day.

So it wasn’t a total surprise that, amid the horror and sadness of September 11, I also felt a sense of professional relief. I got to drive to San Diego to track down acquaintances of two Flight 77 hijackers who’d lived there, and generally conduct research on the local Muslim community. For the next six months, the paper was buoyed by a freak surge in demand for real journalism and its dusty byproducts—like collaboration, curiosity, a common sense of purpose. Of course, looking back, I also remember a lot of hysterical turf-warring, baseless speculating, and an overall atmosphere of humorlessness. (When I was dispatched to New Jersey to assist the “anthrax team” in attending the daily round of alarmist press briefings, for instance, a joking inquiry as to what sort of gas mask I ought to bring drew an earnest e-mail advising me that a preemptive course of Cipro might be more comfortable.) And so when the time came to resume the regimen of inquisitions into whether Barbie dolls could reclaim supremacy from the insurgent Bratz, or rappers could be convinced to switch sneaker brand allegiances from Nike to Reebok, and was the preeminent patron saint of pre-adolescent sartorial taste Britney Spears or Avril Lavigne . . . well, that was something of a relief, too. The biggest relief, though, would come when I was fired.

There were real stories on my beat, of course. It alarmed me, for instance, to learn that one of the companies in my “youth” sector, the mall chain Abercrombie & Fitch, made a weekly practice of purging its stores of hourly sales associates it deemed to be less than, in corporate parlance, “brand positive.”

The purgees were identified, a former regional manager explained, every week at corporate headquarters in New Albany, Ohio, during a conference call held specifically to critique photographs taken that week by the chain’s hundred or so district managers of all the “brand representatives” they had encountered in visits to their stores. The photos were uploaded onto some sort of company intranet, but my source told me his boss preferred printing them out on paper, so he could circle flaws, draw mustaches, scrawl racist epithets, etc. The source said braces, minor breakouts, the faintest possibility of weight gain, showing up to work in a prior season’s ensemble, wearing shoes that had not appeared on the list of authorized footwear for that season, and/or belonging to an ethnic minority could all be grounds for immediate dismissal from the ranks of Abercrombie & Fitch’s minimum-wage cadre of demand creators.  

I went to great lengths to corroborate the facts, which is where I fucked up; I e-mailed a draft of the piece (a decision inspired by a respected journalist I’d read about who said he did this all the time) to a trusted source, and he e-mailed it to someone else, and eventually it made its way to Abercrombie’s corporate offices and in turn to the company’s fearsome New York “crisis PR” firm. And because Wall Street Journal investigations are the sort of thing that affects the stock prices of companies, this was a fire-able offense. In retrospect, as much as I felt like a failure and a fuckup, I didn’t actually mind being liberated from the constant, insane pressure not to fuck up. All year I’d been variously accused of being “in the pocket” of one company or its rival by analysts, money managers, publicists, lawyers, etc., and I’d found it preposterous. What did I care who prevailed in the sneaker wars or the doll wars or the Japanese-hipster-credibility-halo-effect wars?

What I couldn’t understand, though, was why they killed the story. Sure, it wasn’t Blackwater, but this was a store that at least half our readers’ kids would have killed to work for, and it was being run by some racist, frat-boy cult, and the suburban teenagers it hired and fired so mercurially were going to grow into adults who thought this was . . . normal? That in the modern American workplace, this sort of Lord-of-the-Flies management strategy was just par for the fucking course?

I ended up handing over my notes to a civil-rights lawyer who was leading a class-action race-discrimination suit against Abercrombie. A few years later, more than ten thousand former brand representatives got checks in the mail as part of the $40 million settlement.

In 2004, I was again living in Philadelphia. A guy for whom I had transcribed some interviews at Philadelphia magazine back in college had been named editor-in-chief, and he offered me a chance at journalistic salvation. He had room in his budget for a young staff writer, but I had to freelance something first. I snagged a job at a downtown phone-sex call center, and six weeks later I had my piece—and another insight about journalism. “Phone sex,” I wrote,

is not so unlike being a reporter. A central challenge of success at both is keeping random strangers—horny guys, hostile hedge-fund managers—on the phone, talking to you, confessing to you, growing fond of you, resolving to talk to you again. And at all times, phone-sex operators, like reporters, are expected to remain detached, wise to “The Game,” objective—but in a way, that’s crap. It’s not easy to become beloved by strangers if not a single part of you truly yearns for that love.

The stranger thing about phone sex, though, was that the training program was more rigorous and extensive than any I’d encountered in journalism. There was a day and a half in a classroom learning such phone-sex fundamentals as the “hot statement” and the “ego stroke,” daily feedback sessions with supervisors who listened in on calls, a mandatory creative-writing contest for the best Halloween-themed fantasy scenario, refresher courses to hone fluency in more exotic proclivities, individual binders in which we recorded our progress in this stuff and collected, as per instruction, magazine clippings—Penthouse letters, perfume advertisements, etc.—whatever we found erotically inspiring. When my supervisor’s boss learned I was writing a story, he unfurled all the usual legal threats, but when it was published, the company ordered hundreds of reprints to dispense to new hires at orientation. They did not expect you to be some innate phone-sex genius, but they had full faith that you could get immeasurably better, especially if you wanted to, and they genuinely seemed to take it as a given that people wanted to become better at things they did.

For me, an enduring frustration of traditional journalism is that what training you do get centers on the imperative to discount and dismiss your own experiences in pursuit of some objective ideal, even as journalism simultaneously exposes you to an unusually large variety of experiences. The idea that it might be a good thing to attempt to apply insights gleaned from those experiences to future stories—let alone synthesize it all into any sort of coherent narrative—rarely comes up, unless you’re a columnist. This can be an especially torturous dilemma during the inevitable low point at which the journalist—this one, anyway—comes to believe that the only feasible course of action (given the state of journalism) is to secure a six-figure book deal, and commences filling her off-hours with feeble attempts to “write what you know.” I know a lot of things, taunts the endless negative feedback loop, but none of them is how to make six figures.

With my journalistic redemption under way, and finding that redemption alone doesn’t necessarily pay the rent, I started a book proposal about something I termed “The Nothing-Based Economy.” The argument was pretty simple: the American economy had become so enthralled with the endless cultivation and expansion of demand that it had become totally divorced from the reality of need. This was not an inevitability that Marx and Mao and the movie Idiocracy hadn’t grappled with already, but I was just a journalist and those were just the facts. Drug companies founded to cure diseases had a duty to shareholders to never cure anything so long as tens of millions of Americans reliably spent hundreds of dollars a month on the nebulous array of chronic maladies pharmaceutical companies had invented to treat. Bankers who still (incredibly) claimed to facilitate “efficient allocation of capital” were in actuality beholden to the trading-desk arbitrageurs who couldn’t make money unless their corporate finance departments concocted a steady stream of “innovations” by which to render markets more inefficient. Every last function of government was being outsourced to some contractor with the fiduciary obligation to ensure that taxpayers wasted as much money as possible.

Abercrombie and LeBron James and AOL informed this observation, as had some stories I’d written for Philadelphia magazine: the year I spent shadowing a Wharton MBA class, for instance, on its punishing schedule of leadership classes and campfire retreats and networking events that seemed deliberately designed to impart no ideas, hone no skills, and prepare the students for nothing beyond spending an inordinate amount of time in the company of people very similar to themselves; or the investigation into Donald Trump’s resurgence as a “virtual developer” who licensed his name to the sort of luxury-condo projects where the deeds would change hands five times before the thing was even built.

This was all well and good, at least as underlying theories went, but of the fourteen distinct genera of profitable nonfiction books my agent had identified in his many years of sales analysis, he said my idea sounded most like an “I’m Right And You’re Totally Wrong” book. The appeal of such a book rests on the author having achieved a degree of personal-brand credibility, and since neither of us could remember a blurb along the lines of “regional magazine staffer calls bullshit on the American economy” following an entry on The New York Times best-seller list, I complied when my agent suggested I pursue instead an “exposé,” in which the author “draws on inside knowledge”—possibly acquired “as a reporter willing to live through a terrible experience”—to “regale us with stories about how much more awful things are than they appear on the surface.”

So in 2006, I took a job at American Apparel. You are probably aware that American Apparel had (and has) two primary reasons for notoriety: that it actually manufactures (in downtown Los Angeles, no less) the clothes it sells, and that its controversial founder and CEO, Dov Charney, decided to open hundreds of conspicuously located urban stores at the peak of the real-estate bubble, and staff them mostly with a revolving cast of underage-looking girls who were willing to work management-consultant hours for $9 an hour and a shot at being invited to pose for one of the trademark semipornographic employee photo shoots the company uses to advertise, if not its clothes, its “brand.”

American Apparel seemed to me a tweaked-out metaphor for the country itself, the way it had strategically shifted the hero of its exceptionalism narrative from its factory to its cast of disposable young people who are endowed with little besides their looks and the desire to broadcast their youthful insouciance to a wider audience.

My agent, though, hated the American Apparel stories. “It’s like one big contest for who can be the most vapid,” he wrote in a withering takedown of my sample chapter. “None of the characters you draw are remotely likable, or entertaining; nor do they illuminate anything.” I didn’t disagree about the vapid part, but—hello—that was sort of the point.

The next year, 2007, I took a job at Gawker Media helping launch a sister blog targeted at Gawker’s demographically attractive female readership—a property that was named Jezebel, at the insistence of Gawker CEO and founder Nick Denton and against my vociferous objections, after the blasphemous Old Testament whore who was eventually eaten alive by dogs. Gawker was in the business of gossip-blogging, an insidious racket that I and most members of my profession held partially responsible for the destruction of journalism.

But I also saw Gawker as American Apparel’s journalistic equivalent, and I justified taking the job by thinking of it as the next chapter in my immersion in the nothing-based economy, in which I would make the natural transition from creating demand for someone else’s brand to creating demand for my own. In hindsight, though, it seems obvious that Gawker had subconsciously inspired the whole book project in the first place.

I had started reading Gawker’s flagship site around the time it was founded, in 2002, because it was a media gossip blog and I was in the media. Back then, the comically bland Jim Romenesko had cornered the market on this sort of inside baseball, and Gawker, by contrast, was puerile, funny, and refreshing. Gawker writers covered the media and publishing industries as if it were all your typical inane celebrity bullshit, and padded their media and publishing coverage with actual inane celebrity bullshit—and padded that further by identifying (or inventing) a sort of pseudo-celebrity vortex of New York unknowns who wanted so badly to achieve some measure of what one of them called “microfame” that they would say or do almost anything to warrant another post on Gawker. Muddling these things together on one sarcastic Web site was popular with readers, but over time whatever I had found refreshing about it began to feel psychically draining.

I finally quit reading Gawker’s flagship site altogether after a post about the heated jockeying among New York Times reporters over which stories landed on the “Most E-mailed” list. I didn’t know why anyone in the nation’s most-respected newsroom would compete for the pro-bono, viral marketing services of a group of readers who demonstrably only care about a story if it concerns food, weight loss, or admittance into an Ivy League college—and I didn’t want to know. I had a sort of not-in-my-backyard unease about the nothing-based economy. While journalism had not exactly rewarded me in any quantifiable way, it had exposed me to a large number of people who had taken this vow of poverty for a lot of reasons other than the opportunity to endlessly debate the relative merits of carbohydrates and get their photos taken at parties.

But I also stopped reading it, probably, because it was 2004 and Gawker had just launched another diversion on which I happily lavished attention: the politics blog Wonkette.

Wonkette was written by a journalist in her early thirties named Ana Marie Cox who covered D.C. with a dry and cutting wit that I was sure would be lost on the sort of people who control the Most E-mailed list. But then she landed her first big “scoop,” about the existence of a blog called Washingtonienne kept by an anonymous Capitol Hill staffer who supplemented her income by sleeping with older, married, power-broker types. This being precisely the sort of self-promotional scheme New York’s great unabashed masses were increasingly obliging Gawker to blog about, I would have totally ignored it had Cox not taken the opportunity to post some photos of herself posing with the Washingtonienne at a club. The Washingtonienne looked sort of damaged next to the elder Wonkette, who looked like she had spent an inordinate amount of time practicing in front of the mirror for this moment. Within months Cox’s image would appear in many bigger media outlets, including on the cover of The New York Times Magazine, which expended numerous paragraphs corroborating my suspicions by chronicling her childhood in Lincoln, Nebraska, spent watching Breakfast at Tiffany’s and dreaming of fame, her bliss over being offered a gig on MTV News, and dejection when said gig did not produce a full-time job. “I couldn’t figure it out,” the author, Matthew Klam, wrote. “Why was she so excited about working for MTV? MTV is for nine-year-olds. It’s so 1992. It was as if her sense of what was cool and what was stupid, so unerring on her blog, had abandoned her.”

Thank the deities, I remember thinking at the time, someone called her out on it. This virulent new self-obsessed model for journalistic success needs to be stopped.


Organizationally, Gawker could not have been a purer embodiment of nothing-based dystopia at work in the media. For most of my time there, bloggers earned bonuses that were tied to the page views their posts received, so the leisurely three minutes required to download a haggard image of Amy Winehouse from a celebrity photo agency and post it with a five-word caption was rewarded as generously as the frenzied hour and a half spent compiling the daily roundup of celebrity gossip, and at least twice as generously as anything I actually wanted to spend an hour and a half writing about. Beyond that, awarding page-view bonuses clearly encouraged bloggers to fight over tips and news items that fell into the realm of “obvious traffic getters,” and discouraged us from collaborating in any effort more substantial than the odd round of company-subsidized drinks.

When hiring female bloggers, the company also maintained a bias toward the young and photogenic, and by the time I got there, it occasionally posted on its sites softly lit pictures of its female employees, much in the way American Apparel had done. During my first few months at the company, Emily Gould, a blogger for the flagship site, even posted a photo of herself wearing an American Apparel swimsuit and giving the finger. Anyone who worked for Gawker Media in the summer of 2007 attaches the swimsuit image to a “phase” our colleague was then in the throes of, one that depleted a tremendous amount of our collective attention via instant message and that one veteran blogger likened to the phase Cox went through in her Wonkette era. Gould graced the cover of the Times Magazine the next spring, four years after Cox, lying on her bed in a tank top gazing sleepily at the camera.

The photos, along with Gould’s essay about life as a blogger, elicited a deluge of vicious Internet commentary, often from other bloggers who felt Gould had given blogging a bad name—“Some bloggers are able to write about things other than themselves. Seriously,” huffed New York magazine’s Daily Intel blog, a Gawker competitor. And following numerous demands from Jezebel readers that we somehow “weigh in,” I obliged with a post in which I jokingly advanced a theory that Denton had created Gawker with the intention of destroying journalism by infecting its practitioners with a lethal addiction to a kind of reality-TV version of the media, in which “mundane trivialities” and “the ceaseless trade of imaginary currency” kept them impervious to the alarming shortage of real currency—both pay and prestige—in the business by supplanting any underlying theoretical purpose journalism might initially have been invented to serve. That afternoon I ran into Denton at the office.

“I liked your post,” he said, which was his typical response to negative attention.

“Yeah, I mean, I don’t know what all the fuss is over,” I said. “They’re not even particularly hot photos, for Emily.”

“Well, and why does anyone become a writer in the first place?” he asked, stressing the first syllable of “writer,” as if the word itself could only ever be uttered with implied air quotes. “The same reason they start playing guitar in high school and try forming bands. To draw attention to themselves.”

Considering all this, it seems odd to tell you that working for Gawker Media was probably the least-demoralizing media job I’ve ever held. The principal reason is that I eventually blundered into an unexpected intimacy with readers on the dreaded “demand” side of the equation, who turned out to want something other than, or in addition to, what everyone and their algorithms suggested.

Producing a Web site that targets women requires engaging with the topics that have always been the focus of media that target women. But since for me this was mostly an experiment in personal brand-building, I did not feel compelled to conceal my contempt for these topics, and for the reprobate economic forces that, I reasoned, had forced me to write about them. Contempt would just have to be part of the “Moe Tkacik brand” (which was not to be confused with the body of mostly respectable journalism produced by Maureen Tkacik).

Of all the resentments I had accumulated before coming to Jezebel, I had never much dwelled on the misfortune of being born a woman. But women, who so disproportionately bear the nothing-based economy’s unrelenting fusillade of invented insecurities and predatory sales pitches, were ideally positioned to share my list of grievances. It makes sense, in retrospect, that a readership so universally practiced in the faking of things—orgasms, hair color, age, disinterest in men one was actually interested in, etc.—would humor the intolerance for fakery that helped define the “Moe Tkacik brand,” which was basically an angrier, more recklessly confessional, and more contemptuous version of myself.

This point about fakery was driven home for me by a (pretty brilliant) idea that Nick Denton had—to offer a cash reward to whichever turncoat from a women’s magazine slipped us the most egregious example of a retouched cover image. The winner submitted the original version of a ludicrously altered Redbook cover featuring the country singer Faith Hill, which I posted, along with the published cover and a fake art-department memo, under the tagline, “Photoshop of Horrors.” The thing paid for itself with a deluge of traffic and all manner of “mainstream” media attention.

But the real revelation, to me at least, was that the readers who came for Faith Hill returned for posts about the Iranian insurgency, the foreclosure crisis, military contracting, campaign finance, corporate malfeasance, the global food crisis—essentially whatever I found outrageous or absurd or interesting on a given day.

When I realized I could be more honest and funnier about a wider array of topics than any other job had allowed—let alone demanded—I felt I owed it to the readers to become something more than the scornful persona that was Gawker’s trademark. When the timeless dilemmas of dating and dieting and “having it all” invariably cropped up, I felt both liberated and obligated to “overshare,” as they say, copping to all manner of offenses I would have elided in earlier jobs: unprotected sex, a history of eating disorders, a newfound dependence on attention-deficit-disorder drugs, belief in God, etc. This enabled me to more honestly confront feminist pieties and hypocrisies, write more vividly and confidently, and perhaps even challenge the stereotypes about “women who write about shit that happened to them.”

From a commercial perspective, “branding” has consistently bestowed its greatest rewards on those capable of projecting a kind of elusive authority that turns consumers’ fears, insecurities, aspirations, unarticulated dreams, etc. into healthy profit margins. But a sense of humanity is also a kind of authority. And maybe the best policy for our beaten-down population of journalists just naturally involves letting down the old guard of objectivity and letting go of illusions of unimpeachability. Rather than train journalists to dismiss their own experiences, what if we trained them to use those experiences to help them explain the news to their audience? Allow their humanity to shape their journalism? This isn’t some radically profound notion—it only seems that way in the context of the ridiculous zero-sum debate over the relative merits of “straight” news versus the self-absorbed nature of blogs. Maybe there is a way to combine the best of both.

If journalism’s more vital traditions of investigating corruption and synthesizing complex topics are going to be restored, it will never be at the expense of the personal, the sexual, the venal, or the sensational, but rather through mastering the kind of storytelling that understands that none of those things exists in a vacuum. For instance, perhaps the latest political sex scandal is not simply another installment of the unrelenting narcissism and sense of invincibility of people in power. Most of the journalists writing about it have—as we all do—some understanding of the internal conflicts that lead to personal failure. By humanizing journalism, we maybe can begin to develop a mutual trust between reader and writer that would benefit both.

What I’m talking about is, of course, a lot easier to do with the creative liberties afforded a blog—one’s humanity is inescapable when one commits to blogging all day for a living. I don’t think it’s a coincidence that Andrew Sullivan, one of journalism’s preeminent blogging brands, is one of very few journalists to have endured his own sordid sex scandal. Or that Josh Marshall, the studiously wonky founder of Talking Points Memo, reacted to the adultery-provoked downfall of South Carolina Governor Mark Sanford, Marshall’s ideological foe, by entreating Sanford, whom Marshall described as seeming “deeply in love” with his mistress, to “Just Go Be With Her!”

Last year, Michael Massing, the guy who originally gave me the tip about John Timoney more than a decade ago (and who is a contributing editor to this magazine), sent an e-mail to Talking Points Memo, where I was doing a stint covering the financial crisis. He wanted to drop by the office for a New York Review of Books think piece on “the future of journalism.” I wrote back and suggested we first meet for a drink. Over Bloody Marys I told him that I’d work for Goldman Sachs in a second if they’d have me. “Don’t say that!” he replied, as if he would have censored the very thought if he could. So I had to explain just how depressing it was to look ahead while my own future remained so inextricable from the future of journalism.

Which is how I came to write this. One day I was casually telling Massing how an old friend of mine from the Journal, a sweet, respectable thirty-year-old with a husband and no personality disorders or history of substance abuse, had recently quit full-time journalism and started freelancing so she could also write poetry—“I can’t imagine what the twenty-two-year-old J-school me would think,” she told me, “but I just couldn’t see how I could get any better without branching out from journalism”—and a month (and several more conversations) later Massing e-mailed a suggestion that I should write about “the life of a young urban writer now.”

So I wrote what I know, or rather what I’ve learned, which could be summed up this way: when the Internet forced journalism to compete economically after years of monopoly, journalism panicked and adopted some of the worst examples of the nothing-based economy, in which success depends on the continued infantilization of both supply and demand. At the same time, journalism clung to its myths of objectivity and detachment, using them to dismiss the emerging blogger threat as something unserious and fundamentally parasitic, even as it produced a steady stream of obsessive but sneering trend stories on the blogosphere.

Consider the breathless (and stylishly photographed) April 1 piece in The New York Times that spotlighted the “notable scoops” broken by the latest microgeneration of up-and-coming gossip bloggers—two had involved sub-sub-subplots of the lives of reality-show starlets, one was about NBC’s Black History Month cafeteria menu, another was referred to as “an ‘investigation’ into the White House budget director Peter Orszag’s hair,” and the rest were arguably less meaty than those.

Yet one of the featured “next big thing” bloggers was twenty-six-year-old Bess Levin of the Wall Street blog Dealbreaker. The Times had listed as Levin’s “notable scoop” her procurement of an embarrassing party invitation sent out by the “prominent but discreet” hedge-fund manager Steven A. Cohen. Tellingly, the Times failed to mention several much more notable “scoops” Levin had published about Cohen’s hedge fund, such as the one about the portfolio manager who was sued by one of his (male) former traders in what was perhaps the most disturbing sexual harassment complaint in Wall Street history. (And I have read at least fifty of them.)

More troubling was that the Times listed as Levin’s “memorable gaffe” a post in late 2008 in which she reported what she termed an “unfounded rumor” that a major hedge fund’s prime brokers were threatening liquidation. “It turns out the rumor was indeed ‘unfounded,’ so she quickly removed it under pressure,” the Times explained, in a condescending sentence that could only have been written by someone who didn’t understand that in 2008 virtually all hedge funds were so massively leveraged—in an effort to amp returns—that the threat of liquidation was perpetually on the table.

This passage lays bare an old-media strategy of dismissing bloggers by policing the permeable walls between news and gossip, analysis and opinion, perspective and attention-seeking. Hedge funds are largely unregulated; “news” about them is often inextricable from “rumor.” It is this disturbing reality that has helped make Bess Levin’s “gossip blog” an important source of information about the financial industry.

So, in this context, her pulling this “rumor” can be seen either as a “gaffe” or, given the obvious power imbalances, as an example of a hedge-fund manager trying to save his ass and/or blow off steam by intimidating some kid two years out of college.

Back in 2008, during that week after Lehman Brothers declared bankruptcy and raised the curtain on the global credit crisis that would in short order serve as Nick Denton’s rationale for firing me and eighteen other Gawker staffers, Denton had asked me to write a post blaming journalists for the financial crisis. This idea bordered on lunacy, and I refused, even when he explained the foundation of his argument, which basically amounted to: he had worked at the Financial Times in the late nineties, and he said he’d tried repeatedly to write stories probing the potential dangers of unregulated derivatives and the stunning amount of leverage that went along with their use, etc., but his bosses invariably told him, in so many words, to bugger off.

“I am telling you,” he insisted. “I tried so many times to write those stories. It was always, ‘No, no, no. Don’t you understand? That’s innovation.’”

And he was right: about derivatives, but also maybe about journalists, many of whom I had also seen over the years apply their well-honed skepticism to just about everything but the age-old imperative to “follow the money,” as so many trillions of dollars re-appropriated themselves in the tax shelters and tropical holding companies of the super-rich. Maybe Denton’s editors assumed he was just trying to draw attention to himself, like all those photogenic, gaffe-prone gossip bloggers. And to that end, given Gawker’s success, he has certainly gotten the last laugh. Although I think he might even agree with me that it’s not much of an end.

Maureen Tkacik is (still) a writer who lives in New York.