David Foster Wallace’s suicide, and why his career trajectory will likely not be repeated

Interesting post: “Your Book Review: The Pale King.”

For a long time I, like so many others, romanticized David Foster Wallace’s suicide. It crystallized him as a tragic figure, eternally 46, handsome, at the height of his powers—we never saw him go gray and saggy, grow uncool, post cringey takes on Twitter. His death preserved him in his youth, and elevated him to the almost angelic.

A recurring question: why hasn’t another author assumed the literary throne or taken up the mantle as the next David Foster Wallace? Why has no subsequent author been able to replicate the attributes that made him special? The cheap, ephemeral entertainment he deplored became the downfall of the ‘great American novelist’, as did the competition from social media. It’s not so much that he was uniquely special as that he occupied a special time, one in which novelists could still thrive.

David Foster Wallace died in 2008, a little over a year after Twitter launched. What makes this period special? It was a time, around the early to mid 2000s, when anyone with some credentials could make a name or carve out a brand for themselves as an intellectual or ‘man of ideas’, before social media came along and diluted and commodified the intellectual scene. Many famous podcasters, authors, and journalists today got their start during this period: Joe Rogan, Malcolm Gladwell, Tim Ferriss, Naval Ravikant, and Nassim Taleb, among others.

Michael Pollan, in a 2007 essay later expanded into a 2008 book, uttered seven words, “Eat food. Not too much. Mostly plants”, that started a plant-food revolution. Gary Taubes, in a 2002 New York Times Magazine article, famously proclaimed that fat doesn’t make you fat, seemingly upending the ’70s carb-heavy food orthodoxy typified by the food pyramid and launching his own career. Gladwell popularized the term “tipping point” with his 2000 book of the same name. Tim Ferriss in 2007 sparked a productivity revolution with the best-selling The 4-Hour Workweek.

But the past decade has seen an oversupply of commentary, thanks to social media. Anyone can create an account on Twitter, write a Substack blog, upload a YouTube video, or post as an ‘anon’ on 4chan and throw their 2 cents into what Habermas called ‘the public sphere’, no gatekeepers required. Pre-Twitter, mass media acted as attention funnels, such as TV appearances or best-seller lists, that could launch someone’s career overnight as a ‘foremost expert’. But today’s would-be ‘thought leaders’ have to compete against far more fragmented, shorter attention spans and a public no longer so easily shocked.

By comparison, in 1996, when Infinite Jest was published, the following things did not exist (or were far smaller than they are today):

-YouTube
-smartphones
-online gambling and fantasy sports
-streaming media (e.g. Netflix, Hulu)
-social media (e.g. Instagram, Twitter, Facebook, and TikTok)
-e-zines, blogs, and long-form digital content (e.g. Substack, Salon, Slate, Medium)
-Amazon e-book publishing that favors quantity over quality
-broadband/high-speed internet
-ChatGPT/LLMs
-podcasts (e.g. iTunes, Spotify)
-MMOGs (massively multiplayer online games)
-satellite radio

Growing up in the ’90s, I had no social media or Netflix. Music was limited to expensive, infrequently purchased CDs. Movies were also expensive relative to the entertainment value they provided. Even video games were expensive, as each game had to be purchased individually (no Steam). For a college student on a budget in a pre-broadband era, a famously difficult 1,079-page tome was suddenly a good way to kill time, if nothing else.

Although technology creates new ways to consume media, it also leads to the creation of more total media. But the number of hours in a day doesn’t scale accordingly, nor can people read and consume this content any faster, so content either gets shorter or gets ignored. Despite ‘booktok’ (yes, this is a thing), one must still find time to read the books. Yes, there are audiobooks, but these still have to compete against podcasts. As a commenter notes, we’re in something of a ‘content obesity crisis’, which makes it harder for authors to stand out:

But I do understand that Hemingway’s writing style was unique and original at the time, and that he was doing something new and interesting that influenced American literature for a long time. But these days, given the flood of content, it feels like most attempts at doing something “new and interesting” are not only forced, but nearly impossible given that there are a million other people also trying to do new and interesting things that now have the means to disseminate them. I don’t think a book like The Sun Also Rises, where I believe the main impact was the style of writing/dialogue vs the actual story, could ever break through today.

Malcolm Gladwell in the early 2000s had the funnel of The New Yorker and his publisher, Little, Brown and Company. Someone today trying to copy Gladwell’s career trajectory would have to rely on social media for self-promotion, competing against blogs and Twitter for attention. Moreover, his ideas would appear much more trite or banal next to the thousands of ‘anons’ and other unpaid contributors who also post interesting, if uncredited, ideas on social media. ‘Expert’ is no longer an exclusive club; anyone can assume the role.

Does this contradict my earlier posts about the rise and importance of legacy media? I don’t think so. Legacy media is thriving because there is too much content and a corresponding need for authority and curation, but this is not the same as legacy media being an artistic authority or dictating tastes. People turn to legacy media as a final authority on what is objectively true.

Had David Foster Wallace not gone off his meds, he would not have hanged himself. He would have finished The Pale King, which, as he feared, likely would not have reached the lofty heights of Infinite Jest. Fast-forward to the ‘Trump era’ and he would be just another ‘blue check’, among thousands, on Twitter. ‘Me too’ transgressions would also have come to light (attempts have been made to cancel him posthumously as a misogynist). Subsequent books would not have done as well, again due to the crowding-out effect of social media and the public’s drift from books toward more immersive media. He picked an opportune time to die, if such a thing is possible, just as social media was on the cusp of exploding.

We already have a case study of an author whose career followed a similar trajectory and genre as David Foster Wallace and who isn’t dead: Jonathan Franzen. His career arguably peaked in 2001 with the release of The Corrections. By comparison, Purity (2015), his third book to follow this hyper-realist coming-of-age template, saw a marked decline in sales despite positive reviews:

In a June 2018 profile of Franzen in The New York Times Magazine, Purity was revealed to have been a relative commercial disappointment compared to Franzen’s two previous novels. According to the article, Purity has only sold 255,476 copies to date since its release in 2015, compared to 1.15 million copies of Freedom sold since its publication in 2010, and 1.6 million copies of The Corrections sold since its publication in 2001.

Other authors who had success in the late ’90s and early 2000s, such as Zadie Smith and Elizabeth Wurtzel, became columnists. Columns are much easier to write than doorstop-sized books and are more topical, and thus better able to adapt to a fast-moving news cycle.

Also, that genre, dubbed ‘hysterical realism’ in 2000 by the literary critic James Wood, served as a counterbalance to the post-Cold War optimism of the same period. Social commentary, as an art form, works by holding a mirror to our worst selves or painting an alternate reality, but now we inhabit such a reality. A former president came within an inch of being assassinated at a rally; three years earlier, the U.S. Capitol was stormed; and a year before that, the world was shut down by a pandemic. How does art possibly top that? With the rise of AI, it’s more a case of art imitating life: art cannot keep up with life, rather than the other way around.