As someone with extensive experience in writing and analyzing opinions, I’d like to share the characteristics that distinguish successful opinions from less effective ones. In an online world oversaturated with content, now amplified by AI-generated material, the ability to express opinions clearly and persuasively is more critical than ever for standing out from the noise. While crafting successful opinions isn’t an exact science, the odds of success can be improved by studying the qualities and phrasing of well-received opinions, as well as recognizing the pitfalls. At over 6,000 words, this is one of the longer posts here. However, given the broad nature of the topic (the reception of opinions in the context of online discussion), I believe this level of detail is necessary to explore it thoroughly.
One of the biggest challenges of expressing opinions is the inherent unpredictability of how they will be received—a factor largely outside your control. Have you ever carefully crafted what you thought was the perfect Reddit or Hacker News comment—relevant, polite, and backed by evidence—only to see it ignored or, worse, downvoted without explanation? That’s the ‘marketplace of ideas’ effectively telling you your opinion doesn’t hold up. I’ve faced this many times: sharing well-intentioned opinions only to watch them falter. I was convinced they were strong points, yet something was off. Sometimes, it became clear I had overlooked a critical perspective, or perhaps I was blind to an important nuance that others noticed. Other times, the issue was simply poor communication. Regardless of the cause, the result was the same: failure.
On Twitter, where downvotes don’t exist, the equivalent of a bad opinion is the infamous ‘ratio’. This occurs when a tweet receives an unusually high number of comments compared to likes: typically, as I’ve observed, at least 3–5 comments for every 10 likes. This often signals an opinion that is unpopular, polarizing, or perceived as incorrect. On Reddit, I refer to this phenomenon as the ‘kill shot rebuttal’, where a reply or child comment significantly outperforms its parent in upvotes, effectively dismantling the original post. That said, not all ratios are inherently negative. A tweet that fosters meaningful or productive discourse can result in a high comment-to-like ratio without reflecting poorly on the original opinion. For instance, polls and open-ended questions are designed to encourage engagement and naturally invite a higher ratio by design.
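The ratio heuristic above is simple enough to express in code. Here is a minimal sketch; the function name and the 0.3 default threshold are my own illustrative choices, derived from the rough ‘at least 3–5 comments for every 10 likes’ observation, not any official metric:

```python
def is_ratioed(comments: int, likes: int, threshold: float = 0.3) -> bool:
    """Flag a tweet as 'ratioed' when replies outpace likes.

    The 0.3 default reflects the rough heuristic of at least
    3 comments per 10 likes; it is illustrative, not definitive.
    """
    if likes == 0:
        # Replies with zero likes is the starkest ratio of all.
        return comments > 0
    return comments / likes >= threshold

# 45 comments against 100 likes clears the 3-per-10 threshold;
# 12 comments against 300 likes is healthy engagement.
print(is_ratioed(comments=45, likes=100))  # True
print(is_ratioed(comments=12, likes=300))  # False
```

Of course, as noted above, a raw threshold can’t distinguish a pile-on from a genuinely productive discussion, which is why polls and open-ended questions would be false positives under any such rule.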
I’ve encountered this situation enough times—and I’m sure others have too—that it inspired me to write this post, along with a guide on the often-overlooked value of deleting comments. There’s rarely anything to gain from stubbornly defending a bad or unpopular opinion, even if you believe you’re right. Once the ‘marketplace of ideas’ has decided you’re wrong, the verdict is effectively final. It’s akin to those viral videos advising you not to argue with the police: once the decision has been made, no amount of explaining your innocence will change the outcome. Thus, the delete button is your friend. It exists for a reason but is surprisingly underused. Deleting a comment isn’t cowardice, nor is it avoiding accountability. Instead, it’s a simple way to make your social media experience more positive by reducing unnecessary negativity at your own discretion. Sometimes, it’s just not worth dying on that hill.
When diagnosing the problem of bad opinions, a common thread is that they often come across as both arrogant and ignorant. These opinions project unearned confidence, creating a sense of certainty that isn’t backed by substance. I’ve observed this dynamic when looking at heavily downvoted YouTube videos (before YouTube removed the public downvote count) and Reddit posts. The combination of arrogance and ignorance is almost always at the root of the backlash. That said, arrogance by itself isn’t inherently bad. It can be tempered by redeeming qualities like charisma, likability, or social proof. For instance, figures like Andrew Tate are polarizing but manage to maintain a strong following because of these traits. The worst-case scenario, however, is being both ignorant and unlikable, a position that invites near-universal rejection. In fiction, screenwriters strive to humanize the most flawed or evil characters by giving them compelling motives or relatable traits. This effort to create multidimensionality contrasts starkly with the one-dimensionality of those who espouse bad opinions without self-awareness, nuance, or empathy.
The following opinions tend to do especially poorly:
First: Getting the facts wrong in a way that undermines the premise of the opinion or destroys credibility. This means getting dates, names, or other easily verifiable information wrong. It’s harder for matters in which epistemological certainty is not possible or where the facts conflict, such as mask efficacy in the context of Covid. Also related is the ‘woosh’ meme, or having the point go over your head: being wrong because you misread the context or drew the wrong inference. In this case, correctness works against you because something was taken too literally. I observed this as early as 2016 in the post I Can Tolerate Anything Except Factual Inaccuracies. I was half joking, but the importance of accuracy has only increased in the intervening years among the smart/intellectual-web. It shows in footnote-laden posts in which the author spends two paragraphs equivocating or trying to anticipate every possible objection to a seemingly inoffensive opinion.
Second: Downplaying or minimizing something which is complicated or difficult because you fail to understand why the complexity exists in the first place. This is related to unearned epistemological confidence. For example, “Why do doctors need so much training? I can read WebMD or Wikipedia for free.” Yes you can, but being a doctor is much more than that. In contrast, silly or unserious observations about low-stakes matters tend to fare better. For example: “Why don’t Pringles make a square can?” Unlike the doctor scenario, this question is lighthearted and doesn’t trivialize something of real importance. Similarly, opinions like “I could beat a lightweight MMA fighter” often fail because they combine ignorance with a lack of humility (spoiler: no, you can’t).
Third: The impact of a vibe shift. There’s a reason intellectual movements and schools of thought are often described with terms borrowed from fashion, like “trendy” or “in vogue.” When the vibe shifts, an opinion that was once well-received can suddenly feel obsolete in the ever-fickle marketplace of ideas. Unlike the stock market, which—despite its volatility—has some degree of predictability, the trajectory of public sentiment is far harder to forecast. For instance, I can be reasonably confident that Walmart stock will perform well in five years, but I have no idea which opinions will remain popular or what new vibe shifts may occur by then. While factual inaccuracies can often be mitigated through research and editing, vibe shifts are inherently unpredictable. Adapting to them requires both a sense of timing and an ability to read the room. Opinions must sometimes be phrased with hedging or nuance to remain palatable, especially if you find yourself on the wrong side of an emerging shift.
What was trendy or widely accepted a decade ago often doesn’t hold up today. (Imagine trying to convince someone in 2014 that Donald Trump would become president.) Social norms and cultural attitudes are constantly evolving. For example, cynicism toward technology and elite colleges has grown significantly over the past decade. The former stems from fears of job loss and automation, while the latter relates to controversies over affirmative action, cancel culture, and wokeness. There is a widespread perception that these institutions are not only antithetical to academic freedom and un-meritocratic, but also act as unelected gatekeepers to the middle class. While the politicization of issues is nothing new, it seems to have intensified in recent years.
I suspect this stems from elites overplaying their hand during events like COVID-19 and the George Floyd protests, leading to greater public distrust of credentialed experts and policymakers. Many now perceive these figures as self-serving. For instance, people can accept some level of inconvenience during a crisis in the name of public safety. However, when exceptions are made for rioters under the banner of fighting social injustice—while businesses are forced to shut down and the rest of the population is told to stay indoors—it smacks of hypocrisy. Similarly, calls to “trust the science” lose credibility when the science or narrative shifts, or when debate is stifled under the guise of scientific consensus. This kind of inconsistency fosters deep cynicism and leaves many questioning not just individual policies but the integrity of the entire system.
Fourth: A false equivalency based on inexperience, or falsely laying claim to the lived experience on which the equivalency is based. An example is likening something comparatively small or trivial to going to war, even if it’s presumed to be hyperbole. Unless you have actually served in combat, you cannot make such a comparison. On Reddit, someone likened a glazed cherry pie to drugs and, having not actually taken drugs, was downvoted and rebuked, as drugs are much stronger:
Note the clown emoji, which means two things: the opinion is not only wrong, but the person is not to be taken seriously, so all future opinions from this individual can be disregarded, which is the worst possible first impression you can make.
This is why accusations of stolen valor are taken so seriously, as during the 2004 and 2024 U.S. presidential elections. Or during the 2008 election, when Hillary Clinton claimed to have come under sniper fire during a ’90s trip to Bosnia, a claim she recanted when footage showed she was in no danger. Military service is just a job with a pay grade, like any other job, but stolen valor is an attempt to appropriate, undeservedly, the lived experience and heroism attached to the profession. Interestingly, this does not apply to the health and wellness industry, in which hucksters frequently make unsubstantiated claims or lay claim to unearned expertise, yet no one cares that much. It’s not as if society is totally opposed to pretenders, but the military has been put on a pedestal, unlike other jobs.
Fifth: Randomness. Sometimes, opinions are poorly received for reasons entirely beyond one’s control—due to chance or sheer bad luck. I’ve experienced this myself when commenting on Trump’s tariffs. Previously, I had made similar remarks that were well-received, so I expected a similar response. Instead, my comment backfired, leaving me to claw my way out of a karma hole. This is where the delete button becomes invaluable and, in my opinion, underappreciated. It’s the social media equivalent of an ejector seat, allowing you to escape a bad situation before it spirals further. This phenomenon often arises with polarizing figures like Elon Musk or Donald Trump.
In communities divided in their support, the reception to your comment can hinge on sheer timing. For instance, posting a pro-Trump opinion in a community that happens to lean anti-Trump at that moment—whether by chance or current mood—can result in a barrage of downvotes, even if the same comment might have been well-received earlier in the day. It’s akin to getting a ‘bad batch’ of readers. One way to mitigate this is through the concept of a shared narrative. By framing your opinion in a way that acknowledges common ground, you can reduce the likelihood of triggering knee-jerk reactions from an audience whose views may not align with yours at that moment.
As far as I know, I was the originator of the shared narrative concept. I find it useful for its explanatory power. For example, in finance and investing communities, I have observed that being too bearish, too bullish, or too enthusiastic about any particular strategy or stock invites inevitable rebuke, as if people want to prove you wrong. As discussed in the above link about factual inaccuracies, there are certain communities in which ingroup praise is disfavored relative to outgroup skepticism. This was observed in ‘rationalist’ communities years ago, where posts that criticized rationalism were more highly upvoted than posts that praised it, by the very rationalists doing the voting. This seems counterintuitive (why would people join a community devoted to something just to criticize it, or to read criticisms of it?), but that is just the nature of how ‘smart’ communities work, in which the skeptical outsider is deemed more credible than the enthusiastic insider. [This also ties into WEIRD societies, which show similar counterintuitive ingroup/outgroup patterns, in which outsiders are welcomed with open arms.] A shared narrative, by contrast, is to assail the financial media for being sensationalist, which is true and which everyone can agree with, as opposed to taking a side, such as being bullish or bearish on the market or defending a specific investment, which carries a much higher burden of proof.
Sixth: Going off the rails. This is where an otherwise good opinion veers into crazy land. For example, the opinion “the Covid vaccines are good and people should be encouraged to be vaccinated” is benign, and many people would agree. Or that “people should be reminded to get vaccinated.” But saying that “the unvaccinated ought to be denied medical treatment, as they are demonstrating that they do not value their health or others’ health” goes off the rails and crosses into being too extreme. There is always some line that, if crossed, turns a good opinion into a bad one. The role of editors is to help identify when this happens, as the author may be oblivious to it due to his own biases.
There are others, but those are the most common mistakes from my own observations. They also do not involve politics, nor are they logical fallacies (although the first can be considered a formal fallacy). For the fourth, although a false equivalency is an example of an informal fallacy, it’s not the false equivalency which makes the opinion bad, but how it’s attached to a lived experience that was not actually lived.
Conversely, successful opinions tend to blend accuracy, intellectual humility, strong credentials (or credibility), and a keen ability to read the room. Those who excel at persuasion don’t construct fortresses of epistemic certainty. Instead, they openly acknowledge their limitations, and this combination of humility and vulnerability makes them more relatable and likable. Take Joe Rogan, for example. He never positions himself as an intellectual, knowing this approach wouldn’t resonate with his audience due to his lack of formal credentials. Instead, he starts interviews with the implicit premise that he knows very little. By asking questions and breaking down complex concepts, he not only learns alongside his audience but also comes across as genuine and approachable.
Bad opinions, however, aren’t confined to politics or polarizing, high-stakes topics. Even seemingly trivial matters can provoke surprising backlash if poorly phrased or unpopular. This is especially evident in discussions about mathematics, which is why I’ve stopped engaging in them altogether. Someone might dismiss a mathematical result as “uninteresting,” only to be flooded with downvotes and negative replies—either for failing to appreciate the significance of the result or for misunderstanding a core concept.
At least with political topics, you can often read the room and anticipate where people stand. With subjects like math—or other discussions that don’t fall into a clear “left vs. right” or “good vs. bad” dichotomy—this is far more challenging. Similarly, debates about new obesity drugs illustrate this complexity. Arguments that lean too heavily in favor of these medications might be criticized for dismissing their side effects, while arguments favoring “diet and exercise” risk being downvoted for appearing overly judgmental or lacking empathy.
The best people in the world at this never have bad opinions. They could somehow get away with posting comments defending Trump or Alex Jones on /r/politics and not be attacked or downvoted at all (individual comments may be downvoted, but the net karma score would remain above 0). A handful of such individuals come to mind, like bloggers Dan Luu, Gwern, and Scott Alexander, and, at the other extreme from prolixity to brevity, podcaster and comedian Joe Rogan. I read those bloggers not only because they have interesting insights, but also because they are effective communicators. Same for neoliberal-leaning bloggers Matt Yglesias and Noah Smith. Even if I don’t agree with their politics, admittedly they are among the best at communicating potentially polarizing ideas to a broad readership. Ideas poorly communicated are worthless, so learning how to communicate better, whether in long-form like Scott or extemporaneously in short-form like Rogan, is necessary, and I hope to apply it to my own writing. But like trying to copy Tiger Woods’s golf swing, it’s not the sort of thing that is easily reproduced. Experts have a way of making things seem deceptively easy.
To put it another way: a good, well-received opinion is like sculpting a key that fits a lock without seeing the inside of the lock beforehand. Or like hitting a moving target blindfolded, only to have the target, itself still moving, revealed later. This is not to say it’s random or luck; there is skill involved, or else some people could not be so consistently successful at it. But it’s really hard.
Such writers are masters at reading the room and structuring potentially unpopular opinions in a way that is palatable. The first rule of writing is to not make the reader hate you; this is easier said than done. They may also have invaluable attributes that help convey credibility to a skeptical audience, like a high IQ or advanced degrees. Among the most important variables for how an opinion is received is not the soundness of the opinion, but the credentials or credibility of whoever espouses it, followed by the fit or suitability of the audience to the message. Because most people do not have exceptional credentials or characteristics that lend credibility, or a built-in audience of supporters to provide social proof, they start at a major disadvantage.
Likability matters a lot and can help steer the crowd in your favor. It is hard to quantify in a scientific sense, yet invaluable. Joe Rogan is the master of taking even seemingly banal or prosaic observations and making them interesting to a lot of people, not through any pretense of intellectualism, but through the authenticity he conveys of being an ordinary guy who is just looking for answers, and who happens to also interview celebrities.
To be likable it’s commonly advised to ‘be a good listener’ and to be unpretentious and humble, yet this does not explain the apparent success and likability, or aura, of people who embody the exact opposite of those traits (e.g. Donald Trump and Andrew Tate), who have tons of followers and evidently many people who do like them. Elon Musk has the most popular account on Twitter/X, yet ‘humble’ or ‘good listener’ are not among the adjectives one reaches for when describing him. I think it’s more like this: if you have not yet built a reputation, then you have to be humble. Once you gain a reputation for results, you can skirt these rules to some extent, as Elon does. Or if you are incontrovertibly correct (in the context of disputing a factual inaccuracy), then you can be more assertive about it. But if you’re neither right nor likable, then you’re in the worst possible place to be, as discussed earlier. This goes to show the difficulty or imprecision of understanding likability, yet it’s invaluable. I think some people also exude certain negative vibes that are repellent, and it’s hard to quantify why this is, which also makes the problem hard to fix.
On the other extreme, Paul Krugman is an example of a poor communicator of ideas: he is so unlikable that it detracts from the substance of his arguments. The very mention of his name elicits a sort of mental block or dismissal (e.g. memes or jokes about the internet being as relevant as the fax machine) that makes it impossible to address his ideas at any depth, and I believe this was his own doing. It’s one thing to believe you’re right based on the evidence brought forth, but it’s entirely another to lay claim to an unearned moral superiority by caricaturing your opponents in the worst possible light. Perhaps Dr. Krugman is more successful if we go by objective accomplishments or accolades, but the likes of Thomas Sowell, Glenn Loury, and John McWhorter are far more effective communicators of their ideas. Colin Powell was not the smartest person in the room, but, being an effective communicator, he was chosen to present the case for the Iraq War to the United Nations Security Council.
Or in politics, likability was Michael Bloomberg’s problem: really smart, experienced and accomplished guy, but doomed by having the charisma of a wet towel. Same for Jeb Bush. Compare to Obama or Trump, whose campaigns relied more on grassroots activism than throwing money at ads and hoping for something to stick. I think I have this problem as well. I’m pretty successful as far as investing and other business is concerned, but sometimes come off as annoying to others, yet am unable to diagnose or put my finger on why this is–it just is.
Tied in importance with likability is accuracy or correctness. Correctness is necessary, as factual inaccuracies not only undermine the premise of the opinion but also erode credibility in the eyes of the recipient. Inaccuracies can also divert attention from the rest of the message, which may otherwise be correct and reasonable, yet is doomed by the inaccuracy and the poor first impression it makes. It’s not uncommon on Reddit (though this probably applies to any community with a smart, discriminating audience) to see a comment or post that tries to make a broader point but contains some inaccuracies, and replies that correct the inaccuracies but otherwise add nothing to the discussion get more upvotes than the original comment (the kill-shot rebuttal).
Above is an example from Reddit’s /r/math, in which a correction that complex numbers preceded complex analysis has three times more upvotes than the broader (and also correct) point that complex analysis didn’t exist. This pattern of voting seems pedantic, as the broader point still stands, but it shows the necessity of accuracy. An otherwise correct point is doomed to be overlooked or rebuked if it contains an inaccuracy, so you have to get it 100% right, 100% of the time.
The ‘akshually guy’ meme is wrong in that being that guy does confer status and peer approval in terms of votes or karma, rather than ridicule. That is why he persists and why authors have to write defensively, with footnotes, in anticipation of his arrival. If people saw the akshually guy for the annoying pedant he is, the problem would take care of itself, but this does not seem to be the case. Instead, the nitpick or objection will be upvoted and promoted to the top of the thread, derailing the conversation (although it’s possible there is a critical error that does invalidate the author’s premise/thesis). If possible, deleting the comment is an option, not to suppress debate, but just for the sake of keeping the discussion on track. Another option is lots of footnotes to anticipate every possible reader objection in the comments (Scott Alexander does this a lot):
This requires effectively having to write a second companion article that addresses all the possible objections of the first article. I think this trend has gotten out of control, both in terms of excessive fact-checking and the need to punch holes in arguments, and the overuse of footnotes. But such defensive writing is how writers have adapted to being held to very high standards of accuracy for short-form non-fiction. I cannot blame them, and this is one of the downsides of having a comment section, where self-anointed fact-checkers will not only derail the conversation, but be awarded ‘likes’ and other signifiers of status for doing so, instead of being shown the door, which will invite the inevitable cries or accusations of censorship.
Or on Hacker News, an inaccuracy regarding the number of annual mass shootings overshadows the rest of the comment:
Another conversation was derailed by the parent commentator misquoting a biblical verse. The correct quote is: It is easier for a camel to go through the eye of a needle than for a rich man to enter the Kingdom of God. In this case, eye refers to a narrow opening, not a literal sewing needle:
This goes to show the importance of accuracy and how colloquial usage can often be wrong. Most people will not notice or care, but for a more discriminating audience, such details matter. The incorrect version spreads through common parlance; like the game of telephone, the final version may bear no likeness to the original. Another example is the phrase “The meek shall inherit the earth.” The correct version is “Blessed are the meek, for they shall inherit the earth.” Meek in this context refers to individuals who possess strength but opt for restraint, not weak or harmless individuals.
One can argue that the internet is not real life and that being accurate one hundred percent of the time is not necessary in the ‘real world’. I have always found this logic to be a cope. It’s unconvincing to say that the internet is not real life when the internet is consuming an ever-greater share of the totality of life. Given that humans use the internet and that digital avatars are disembodied representations of humans, the internet is an extension of the human experience. Second, people who have good and correct opinions have more social status in ‘real life’ too, and are more successful at work and earn more, not just more successful online. It’s people with high-paying careers who tend to be more accurate, as employers, especially for high-paying jobs, value correctness. From my own observation, success and status online correlate with success offline.
To tie this back to the section on likability: just because someone like Trump or Tate flagrantly lies or spreads misinformation yet has high status and success does not mean it’s reproducible for average people. The formula works for them, but for everyone else, accuracy matters. Without their platform and branding, you’re stuck adhering to the guidelines of accuracy. It’s the same way that celebrities get preferential treatment; expecting that by dressing like Johnny Depp you will get similar treatment is a mistake.
Accuracy is easier said than done, as it’s hard enough getting the facts right, let alone having correct opinions. For example, during Covid, there was no consensus about the efficacy of masks or social distancing. There was conflicting or inconclusive data either way. We’re not even talking about opinions about masks, but just the facts and data if they work or not. Or if vaccinated people needed to wear masks or not. Or the effectiveness of vaccines at stopping the spread versus reducing mortality. The initial argument was that vaccines would reduce the spread, hence making masks unnecessary among the vaccinated. When case counts continued to rise, the goalposts were moved to reducing mortality. Recall the CDC initially advised (in a deleted tweet) against wearing masks. Or debate about the origins of Covid. The lab leak hypothesis was dismissed by the media and labeled on social media as ‘misinformation’, until 2021 when new information came to light and major media outlets took the possibility of a leak seriously, instead of just dismissing it as racist or a conspiracy.
Consider the minimum wage debate, which is a hotly contested topic. Republicans can cite studies showing how raising the minimum wage is harmful, yet other studies show no effect or even positive effects. Who’s right? Then you’d have to delve deeper; perhaps some of the studies have methodological errors, so they can be excluded. This process can continue ad infinitum, but despite having more information, no one is actually closer to answering the question of whether raising the minimum wage is ultimately harmful or helpful. Whether it’s gun laws, global warming, or masks, this is seen with many issues, in which having more information does not lend itself to epistemological clarity, which is why such issues remain controversial and debated. Having more information is only useful if it can soundly refute a false prevailing notion or silence critics, like the fact that handwashing prevents infections, which is incontrovertibly true but was at one time controversial.
Asymmetry is also working against you: people are much more likely to be offended by wrong, incorrect, or bad opinions than to express approval of good ones. If your opinion sucks, it’s assured you will know about it, either by negative votes or by people telling you how and where you are wrong. It’s sorta like the opposite of Amazon reviews, in which all products seem to have 4–5 stars. It’s too bad Amazon cannot bring the honesty and transparency of online discussion to its reviews. It’s not uncommon for a reader to latch on to a word or phrase that rubbed them the wrong way, ignoring the rest of the piece, or to misconstrue something in the most negative light or put words in one’s mouth. It also does not help when some people are determined to not get the point.
So this again comes back to being able to phrase the opinion in such a way as to minimize ambiguity and misconstruction as much as possible. This is why editors are so important; their job is to assume the role of the most critical reader and find any possible objection, logical inconsistency, or other mistake. It’s also hard to tell if someone is intentionally being obtuse to conceal an objection. You’d be amazed, from my own experience, how people seem to lose dozens of IQ points when confronted with something they disagree with. It’s like, “Your bio says you work for a tech company, yet I thought I made my point pretty clear. Let me break it down even more.” I dunno why this is. Maybe it’s trolling. But it’s another reason to block the person or just disengage if you sense someone is not disagreeing in good faith.
Improving the odds of correctness requires considerable research weighing both sides of the issue (unless you’re Joe Rogan, in which little research is necessary), which for most people is not worth the effort. It’s easier to just shoot off a tweet based on a hunch or a feeling, without having to do any effort to substantiate it. If you get called out on it, because it’s Twitter, you can just delete the tweet or block the person. On certain Reddit subs and forums, this can get you a ban, as you are expected to defend your positions with evidence.
I mentioned IQ and credentials earlier. Credentials and other signifiers of status and intellect matter more than whether people agree with the opinion per se, or the suitability of the audience. So if someone has solid credentials, their opinions are likely to be well-received even by people who disagree. An example is the popular writer Erik Hoel, whose articles tend to express skepticism of the capabilities of AI to replace writers and artists, yet are well-received by people who hold the opposite view. In a 2022 article he described AI art as only an imitation of art, writing “It is pareidolia, an illusion of art, and if culture falls for that illusion we will lose something irreplaceable.” Same for Ted Gioia, another technology and AI skeptic and critic of Tesla and Elon Musk. The rightness or wrongness of the opinion does not matter so much as the perceived intellectual credibility of whoever espouses it. Or the size of the platform, in the case of less intellectually-inclined commentators or audiences. A big platform and brand mean more social proof even if the opinion is bad (e.g. wrong, overly offensive, or unoriginal).
Professional pundits who write for major media publications such as World Net Daily, National Review, or Forbes do not need to be accurate or humble; they only need to check the necessary boxes (defense spending and Israel = good, student loan debt forgiveness = bad, etc.). The problem for everyone else, who doesn’t have the backing of a multi-billion-dollar publisher or conglomerate and the large built-in audience, connections, and imperviousness to criticism that come with it, is that it’s sink or swim by the quality of the writing alone. Without such connections and a built-in audience, going viral requires assiduous attentiveness to accuracy, or else the content will be torn to pieces or ignored by the very ‘influencers’ who hold the keys to virality.
In contrast to the brevity of Rogan, someone like Scott Alexander excels at composing detailed, well-researched opinions that are well-received by a large, ideologically diverse audience. The same opinions, if composed by someone less deft with words or less attentive to accuracy and the ‘opposing side’, would likely be met with considerably more objection or even outright derision. Scott can write an essay praising capitalism, and even Marxists who disagree with the premise can still respect his willingness to entertain, rather than dismiss, criticisms of capitalism.
So this means one must occupy one of two extremes: writing lengthy, air-tight arguments backed by considerable data, or having a knack for composing terse opinions that are incisive and pack a punch, like Rogan or Ben Shapiro. The latter is related to the so-called ‘hot take’. The idea is to express an opinion that many people on either side of the aisle can relate to, that is not obvious (if it’s too obvious, it’s not funny or clever), and that exposes the logical inconsistencies of whoever the target is, without moralizing about the issue. Here is an example:
People will dismiss Joe Rogan's health advice because 'hE's NoT a dOcTor' and then go get medical advice from Bill Gates. 🤣
— ZUBY: (@ZubyMusic) May 1, 2021
Even liberals have to concede he has a point, or at least that he is not wrong.
Farther to the right of the IQ distribution, for technology- or rationalist-inclined audiences, is someone like Roon, whose tweets always do well. I have yet to see anything he posts get ratioed. Like Rogan or Zuby, his takes are consistently well-received, but by a generally smarter audience. And like Ross Douthat, Erik Hoel, and other pundits, he possesses the invaluable intellectual credibility for his content to do well with a large audience, including those who may disagree with him, albeit in short-form instead of long-form.
On the opposite extreme are so-called ‘stale takes’, which state the obvious or are unoriginal. The quintessential example is the hackneyed line, “dems are the real racists,” which is almost never voiced in sincerity anymore. Even as recently as 2016, careers were made from repeating this line, or variations of it, or from weaving themes of perceived Democratic hypocrisy into commentary. At the time it seemed so transgressive to portray the left, which aggressively positions itself as anti-racist, as either having racist ulterior motives or being inadvertently racist.
Survivorship bias is a huge problem in regard to having opinions. We only see those whose opinions succeeded in the marketplace of ideas. Stand-up comedy has this problem: the jokes you hear are the ones that survived the social filter. This is why authenticity and ‘being yourself’ are overrated. Sure, this advice works great if your ‘authentic self’ happens to align with what society deems valuable or confers status on; not so much if it doesn’t. Prisons are full of people whose authentic values were incompatible with society. Or you get fired: that joke you thought was funny not only didn’t land, but ruined your career. It’s more that society likes the idea of creative or authentic people who are already successful, so there is less risk of ostracization in supporting those individuals’ creative endeavors or outlets. As it’s said, nothing succeeds like success.
Survivorship bias can also distort one’s perception of what types of opinions succeed in the marketplace of ideas. Seeing someone on Twitter who already has a huge brand tweet about conspiracies or other fringe or populist stuff does not mean you should emulate that; you will almost certainly fail, as that niche is likely saturated. By contrast, Richard Hanania found rapid success starting in 2021, without connections, by being an anti-populist and appealing to smart people who were receptive to his center-right elitist contrarianism yet skeptical of mainstream right- and left-wing narratives; for example, being pro-vaccine and pro-immigration but anti-lockdown and anti-woke. This niche was greatly underserved compared to the MAGA-types and the ‘wokes’ who otherwise dominated after Covid, up until around 2022.
Similar to ‘dems are the real racists’, there was a time in the early 2000s, with Bush v. Gore and the September 11 attacks still fresh in people’s minds, when there was a huge, untapped market for political hacks who could phone in 500 words for World Net Daily. The overly partisan content of the likes of Krugman, Ben Shapiro, or Ann Coulter only seems viable today because of survivorship bias: these individuals built large brands and readerships (e.g. a long-running New York Times column) back when it was easier to make a living from mass-produced partisan agitprop, and that platform and readership have carried over to today.
For many millennials and gen-Xers, The Daily Show with Jon Stewart was a portal to politics and a cornerstone of the proverbial watercooler. The 13 minutes of occasionally funny material between commercials, plus the concluding and oft-skipped guest segment in which Mr. Stewart tried, with varying levels of success, to establish some rapport with his subject, were seen as the high-water mark of political satire at the time, along with The Colbert Report, a highly successful 2005 spinoff with a similar format. Jon Stewart’s post-politics style seemed transgressive at the time, but this was in large part due to Comedy Central, a subsidiary of the entertainment conglomerate Viacom, which ensured a captive audience of millions and the consequent social proof. But unless you’re him, anyone can now do it on Twitter for free. What I mean is, Jon Stewart was so successful because of his timing, by not having to compete against the masses of unpaid contributors on social media we see today, in addition to the platform of Comedy Central, not because of anything uniquely special on his part. It’s not as if he pioneered the genre of being cynical about politics, or was the only one doing it, or had the best jokes.
Academic-style writing can get content to go viral without a built-in audience or huge platform, because well-educated people with large social networks are receptive to this type of content and writing style and are more inclined to share it. This is how Richard Hanania found success without many connections or a big platform in an otherwise crowded marketplace of pundits. By comparison, the conspiratorial, partisan, populist stuff is ignored by said influencers, so it does not go viral, even if it seems viral when others post it; those pundits (e.g. Candace Owens, Ben Shapiro, and Matt Walsh) already have huge platforms due to the backing of major media companies. As with Jon Stewart, it’s the platform and the associated social proof of having thousands of ‘likes’ and ‘retweets’ that make the content seem good, not that it’s actually any good. Even the worst Taylor Swift song will be popular owing to the size of her platform. Or, back to Trump and Tate: the arrogant persona works if you have the social proof, platform, and survivorship bias on your side, which the vast majority of people don’t, so emulating those attributes is certain failure.
As the scope of this essay shows, having good opinions is not easy. If nothing else, the main takeaway is that accuracy is extremely important unless you happen to be among the lucky handful of individuals who already have a large platform and branding. Likability matters a lot too. This is followed by perceived intellectual credibility. Or on the other extreme, the Rogan approach of having no pretension of epistemic certainty or intellectual credentials can also work.
I can personally attest to never having quite good enough opinions. I am always omitting some key detail or overlooking something, and I see it in others as well. Trying to diagnose my problem of having poor opinions motivated me to write this post, in the hope it can help others, or at least explain why it’s so hard to have good opinions; you shouldn’t feel bad if you don’t succeed, because few do. Of course, one can say, “who cares…they are just words,” but I disagree. As de-platforming and cancel culture have shown, bad opinions can cause lasting reputational and professional harm. Professionals who espouse opinions for a living (e.g. comedians, late-night talk show hosts, and pundits) and have avoided accidentally self-destructing make it look like second nature, but it’s anything but easy.