Inaction and Indifference as Rebellion, and the Decline of the Culture Wars

Activism includes but is not limited to telling people what to do or what to believe. By that definition, mainstream liberalism and conservatism are both activist. There is an authoritarian and conformist tone to each that implores the subject to do something; for the left, for example, ‘you must spread your wealth and check your privilege’ as part of a collective ‘good’. But, especially since 2013, both the ‘left’ and the ‘right’, particularly millennials, are tired of having to ‘do’ things, to ‘believe’ things, or to hold strong convictions about things. With the exception of SJWs and, to a lesser extent, the alt-right and Trump, millennials are tired of action and dogma, preferring inaction and indifference. Decades ago, young people rebelled through action (protests, Woodstock, drugs, cross-country motorcycle rides), but now ‘rebellion’ is through inaction: staying home and watching Netflix instead of partying, going MGTOW, abstaining from drugs and alcohol, minimalism, personal finance, learning coding, and eschewing careerism.

As part of the post-2013 rise of ‘introspection culture’ (which is related to intellectualism culture), where ‘boring’ has become the new ‘hip’, navel-gazing and introspective articles, such as How I Got My Attention Back, a widely shared personal account of someone disconnecting from the internet for a month, frequently go viral, as every personal observation, no matter how small, has suddenly found a captive audience. In terms of clicks and virality, even attention-grabbing headlines about major public figures such as Donald Trump find it hard to compete against seemingly mundane and contemplative topics, such as the articles In Defense Of A Boring, Comfortable Life and Why are Adults so busy?, both of which went viral.

With the decline of activism and the rise of introspection, which is individualistic and to some extent self-absorbed, the ‘culture wars’ are dying, as far as millennials are concerned. The post-2009 bull market (which is officially the longest ever), the post-2009 economic expansion (also the longest ever, although GDP growth is still sluggish), as well as a culture, economy, and society that celebrates and prizes individualism (such as taking pictures on Instagram), have also made culture wars less relevant. People see headlines about surging stock prices, stratospheric web 2.0 valuations, and Chinese buyers snapping up all the expensive real estate in America, and we want a piece of the action instead of missing out (FOMO)–but also headlines about social security dwindling, the bad labor market, or how deficit spending threatens social programs, and millennials realize that while culture wars may be a sort of ‘bonding experience’ between like-minded people, no amount of chest-thumping about social issues will change anything as far as policy is concerned, nor provide financial peace of mind in increasingly uncertain economic times (such as saving for retirement, paying for healthcare and education, covering the mortgage, or getting a job). Because of the aforementioned social, cultural, and economic factors, the culture wars ‘lost’ in the ‘court of public opinion’ or the ‘marketplace of ideas’, because the ‘generals’ failed to provide a sufficiently compelling case for why people should keep fighting when other issues seem more pressing.

As further evidence of this capitulation, particularly among the millennial-right, in 2016, Peter Thiel’s RNC speech, in which he proudly proclaimed being gay, was met with raucous applause. Such an ebullient response would have been inconceivable even as recently as a generation ago. Additionally, Thiel implored the ‘right’ to focus less on culture war issues (such as the controversy over same-sex bathrooms) and more on entrepreneurship and innovation. However, conservatism in the individualistic, Randian sense (capitalism, private property, ‘ownership society’) is thriving, which is why Peter Thiel, who is a business and investing genius, not a culture warrior, is beloved by many millennials on the right. Same for Elon Musk.

With the exception of condoning obvious criminality that violates the non-aggression principle, such as the exploitation of minors, taking a moral ‘high ground’ has, in recent years, become an untenable position in our era of moral ambiguity. For one, it’s a lost cause. For decades, spanning four presidential administrations, as well as talk radio and TV, the ‘right’ has nothing to show for its efforts, as American culture and society have inexorably moved ‘left’. Also, wrapping yourself in a cloak of moral sanctity and piousness leaves you exposed to charges of hypocrisy should your own indiscretions come to light. Rather than passing judgment, it’s easier, but also more robust, to just not care. Moralizing, which includes SJW-activism, is sometimes an unwanted imposition that goes against one’s capacity for self-determination and self-regulation. ‘Our’ values, as in the ‘right’, are the bedrock of civilization, and leftist values are anathema to this. But having strong values at all, whether because of throwing in the towel on the culture wars or because of the rise of centrism, is, in and of itself, becoming an anachronism.

Idiocracy in America? Probably not

Anatoly Karlin’s article A Short History of the Third Millennium went massively viral, being read by thousands and getting almost 200 comments. Online, especially, there is considerable interest in ‘weird’, speculative topics such as futurology and existentialism, and these are issues that may have dramatic ramifications for the future of humanity: is radical life extension possible? How about whole brain emulation? Or creating super-human intelligence through gene modification and embryo selection? Will artificial intelligence render all jobs obsolete, or possibly even threaten to enslave us? Will humanity see the singularity and the transition to a type-1 and beyond civilization, or will it kill itself first? Are we destined for greatness or doomed to perish under a dysgenic dystopia?

Human genetic editing is banned by government edict around the world, to “protect human dignity” in the religious countries and “prevent inequality” in the religiously progressive ones. The 1% predictably flout these regulations at will, improving their progeny while keeping the rest of the human biomass down where they believe it belongs, but the elites do not have the demographic weight to compensate for plummeting average IQs as dysgenics decisively overtakes the FLynn Effect.

The good news is, historically, the trend has been towards the expanded use and adoption of new technologies, not restrictions. It cost $3 billion and over a decade just to sequence the human genome, let alone do much with it. It doesn’t make any economic sense for companies to spend so much money and time developing technologies, only to intentionally restrict their usage to an ‘elite’. Making technology readily available lowers costs and spurs further innovation. Now it only costs $1,000 to sequence a human genome.

It’s like the belief that elites have secret cancer cures that they are keeping to themselves. Again, this is bad economics considering that there is huge demand (millions of people get cancer) and cancer drugs cost hundreds of millions, even billions, of dollars to develop, creating an economic need to make these treatments available to as many people as possible in order to recoup the costs. In a free market economy, if a company or entity were to hoard a technology, another entity will eventually develop a cheaper and better version and make it available to more people, likely putting the first entity out of business.

An obvious counter-example is sports cars and private planes, which are still only available to elites. This is because the technology doesn’t exist to make private planes as cheap as a Honda. Another factor is branding, which is why Nike shoes and Rolex watches are so expensive even though their underlying technologies are not revolutionary. The reason why cancer treatments, which can cost hundreds of thousands of dollars, are made available regardless of ability to pay is that the government has deemed them a ‘public good’. It’s possible genomic modification will become another luxury item and not a public good. It’s possible offshore embryo modification labs will be created for ultra-high-net-worth clientele who want their children to be endowed with traits that augur well for socioeconomic success, such as a high IQ.

FLynn effect of environmental IQ increases is petering out across the world, especially in the high IQ nations responsible for most technological progress in the first place (Dutton, Van Der Linden, & Lynn, 2016). In the longterm “business as usual” scenario, this will result in an Idiocracy incapable of any further technological progress and at permanent risk of a Malthusian population crash should average IQ fall below the level necessary to sustain technological civilization.

However, even if current trends persist, the movie Idiocracy becoming reality is unlikely, as I discuss in more detail here.

Even if the FLynn effect is tapering off, that doesn’t mean it will reverse. Another possibility is that early gains in IQ were attributable to environment, and now that essentials such as food, shelter, sanitation, clean water, electricity, and literacy are much more common, the ‘low-hanging’ fruit has been picked, shifting the remaining variance onto genetic factors, which are much slower to change than environmental ones; this is why it may seem like the FLynn effect is reversing.

For global IQs to keep falling without a bottom, there has to be some sort of environmental selection pressure to favor increasingly low IQs.

Just as the human population rose tenfold from 1 billion in 1800 to 10 billion by 2100, so it will rise by yet another order of magnitude in the next two or three centuries. But this demographic expansion is highly dysgenic, so global average IQ falls by a standard deviation and technology stagnates.

Even if this happens, the growing world population will mean more total smart people, which seems to be the case right now. Russia, Europe, and East and South Asia have billions of people and produce thousands, if not millions, of geniuses each year by virtue of the normal distribution of IQs. Even populations with a mean IQ of less than 100 still produce geniuses. Furthermore, smart people are more likely to procreate with other smart people (assortative mating), resulting in ‘enclaves’ of high IQ, even as the rest of the world regresses.
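The point about the normal distribution can be sanity-checked directly: even with a mean IQ well below 100, a large enough population still contains millions of people above 130. A minimal sketch (the billion-person populations and the mean of 85 are illustrative assumptions, not census figures; IQ is assumed normal with SD 15):

```python
import math

def fraction_above(threshold, mean, sd=15.0):
    """Fraction of a normal(mean, sd) population scoring above `threshold`."""
    z = (threshold - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))  # upper-tail probability

# Two hypothetical billion-person populations with different mean IQs
for mean_iq in (100, 85):
    frac = fraction_above(130, mean_iq)
    count = frac * 1_000_000_000
    print(f"mean IQ {mean_iq}: {frac:.4%} above 130, about {count:,.0f} people")
```

With a mean of 100, roughly 2.3% clear 130 (tens of millions per billion people); even at a mean of 85, about 0.13% do, which is still over a million per billion.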

The best and brightest from all over the world flock to America’s most prestigious universities and companies, which is why I’m skeptical of the American ‘Idiocracy’ scenario. As further evidence against ‘dumbing-down’, the number of research publications on Arxiv, a pre-print repository that specializes in physics and math papers, has surged in recent years. There is no evidence yet of technological stagnation, either (for example, genome sequencing is becoming cheaper). Like a fantasy sports team that gets the best players from all the teams, America is getting the best and brightest from all over the world. This is a major reason why the US economy has proven so resilient and strong in recent years, whereas other economies have struggled with falling currencies, high inflation, falling stock markets, corruption, and slow growth. It’s a testament to these smart people that America is as functional as it is given all the forces of decay wrought by leftism.

What matters most, however, is the total number of genetically ‘smart people’, not the proportion of smart people relative to the overall population size. Similar to how a single human can oversee an entire ant colony, you don’t need many smart people to manage large populations.

By ‘genetically smart’, I mean people who have non-adjusted IQs above 130 or so. If the world were to become less intelligent, IQ tests would be adjusted to be easier in order to keep the mean IQ score at 100. A century from now, an IQ of 130 may only be equivalent to a score of 115 today. But it’s still possible to have many people with biological IQs of 130.

As the world population swells, it’s imperative that the population of smart people at least remain constant and undiluted. As far as America is concerned, the obvious answer is eugenics and restricting immigration by IQ and country, to keep such enclaves from becoming contaminated. Silicon Valley, perhaps the greatest IQ enclave ever, needs to be made aware of the threat of low IQs to its homogeneity and stability. But if not for the sake of boosting IQ, we need restrictions for the sake of preserving civilization and civility against the hordes that threaten it, for without civilization, high IQ is useless.

Never Apologize to Liberals

It’s been two months since Tim Urban updated Wait But Why, his last post being a mea culpa of sorts for not being sufficiently aware of his ‘white male privilege’. Either he’s working on another epic post or he’s still shell-shocked from the backlash wrought by his last two posts.

From the comments of Tim’s November 9th ‘It’s Going to Be Okay’ post, on the morning of Trump’s win:

He was right…there were incidents of racism and violence following Trump’s win–against white Trump supporters perpetrated by blacks. Of course, we can’t let such details get in the way of a narrative that Trump voters are literally the KKK.

But anyway, Tim’s biggest mistake was believing that liberals would respond to facts, logic, and reason–but that’s not how far-left liberalism works–it never has and never will. Far-left liberals care about winning and ideological purity, above all else. Like a cancer, virus, zerglings, or termites, they are ruthless and single-minded and will stop at nothing in their pursuit of power and control, destroying their host in the process if need be. You think you are in control, but one wrong move and it’s over, like a tiger turning against his hapless trainer.

SJWs swarming a comments section

Cancer dies when it’s deprived of glucose. Likewise, the way you defeat SJW-liberalism is to starve it of the attention and chaos it seeks–ignore their pleas for you to check your privilege. They want your contrition–never capitulate and give it to them. First, it won’t work: it’s too late. The scarlet letter is irrevocable. Second, by caving in, you’re only emboldening them. Third, you’re not alone: the rest of the internet (such as Reddit and 4chan), including even other liberals (as the example above shows, being a liberal does NOT make one immune to the SJW piety mob–not by any stretch of the imagination), hates the SJW-left, so you have many allies that you can turn to for support. Fourth, of course, is that being a white heterosexual male is nothing to apologize for.

Why Choose Traditional Publishing Over Self-Publishing

The Two Choices, by M.T. White

If you have the talent, something to say, and sufficient persistence but don’t have a large brand, traditional publishing (which includes indie publishers) is almost always the way to go. Just do a Google search for almost any traditionally published fiction title and you’ll see hundreds or even thousands of ‘Good Reads’ reviews, indicating thousands of sales. You cannot get that kind of volume self-publishing unless you are very lucky, have a large personal brand, or are extremely good at networking. Yes, traditional publishers take a big cut, but they can also bring in big volume. Stephen King would not be worth as much as he is had he done it alone. The odds of success at publishing increase dramatically if you have a high IQ (yes, IQ is important for success at writing; there is no way around this), are persistent and put in the necessary work, and if you write how-to books instead of fiction. (STEM books, despite the difficulty of the subject matter, sell surprisingly many copies and command a much higher price than fiction. This is because traditionally published physics and math textbooks are quite expensive ($50-150), so if you can sell your own textbook for half the price and undercut the major competitors, you can still make a lot of money per sale.)

To get an idea of how substantial traditionally published book sales are, here is a list of the bestsellers of 2015, with the number of copies sold for each book:

Top 15 best-selling books of 2015 revealed – how many of these blockbusters have you read?

#1-3 sold a combined 2.08 million copies.

M.T. White writes:

Notice, you’re not writing your book to appeal to readers. You’re writing your book to appeal to agents and acquisition’s editors. Naturally, they are looking for product that will sell, but to them it is just product—to you it will be a book written full of compromise…maybe. No matter what, you have to appeal to THEM first before your book hits shelves. And THEY might have very different tastes than you. They live in a different city (probably New York), while you might live in rural Texas. They might have polite sensibilities, while you have vulgar ones. You might find someone who is in alignment with you but as I stated above, there is a 99.9% you won’t. You have to appeal to THEM before your book even makes it to the press.

Not sure how much of this is true. Although with traditional publishers you relinquish some creative control, imho, publishers care more about selling books than political correctness, and I imagine well-written books tend to sell better than poorly-written ones. If the author is already famous (such as by being a celebrity), quality may be secondary. Consider Milo Yiannopoulos’ book, Dangerous, which is obviously not politically correct but is still being published by a major publisher, Simon & Schuster (specifically, Threshold Editions, a subsidiary of Simon & Schuster). Major publishers will take chances if they can get good sales volume. For example, Ann Coulter’s latest book In Trump We Trust: E Pluribus Awesome! is published by Sentinel, a subsidiary of Penguin that specializes in conservative books.

But the reality for self-publishers is pretty bleak when you crunch the data. The median number of sales for the typical self-published book is zero, literally indicating no copies sold. This means that if the author paid for editing, formatting, and cover design, he or she lost money. The mean is higher, due to the couple hundred or so outliers that sell thousands of copies, inflating the overall average. Or to put it another way: if Bill Gates walks into a bar with 50 patrons, the mean wealth per patron jumps to around $1 billion, but the median wealth stays around zero.
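The Bill Gates analogy is just the skew of mean versus median under an extreme outlier; a toy sketch (the wealth figures are made-up placeholders, not real data):

```python
from statistics import mean, median

# 50 ordinary patrons plus one extreme outlier (figures are illustrative)
wealths = [100_000] * 50 + [50_000_000_000]

print(f"mean:   ${mean(wealths):,.0f}")    # dragged up toward ~$1 billion by the outlier
print(f"median: ${median(wealths):,.0f}")  # still the typical patron's wealth
```

The same mechanism explains self-publishing statistics: a few hundred breakout authors inflate the mean while the median book sells nothing.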

Only 40 Amazon self-publishers have sold a million e-book copies in the past five years:

40 self-published authors “make money”, all the others, and they number in the hundreds of thousands, don’t. This interesting statistic, recently revealed in a New York Times article, applies to the Kindle Store, but since Amazon is in fact the largest digital publishing platform in the world, it is a safe bet that self-published authors are not doing any better elsewhere.

“Making money” here means selling more than one million e-book copies in the last five years. Yes, 40 authors have managed that, and have even gone on to establishing their own publishing house, like Meredith Wild. Her story is fully reported in the New York Times, here, and well worth pondering over.

The number of books on the Kindle marketplace meanwhile has exploded:

The digital market is indeed scary, primarily because of its dimension: over 4 million titles today in the Kindle Store, compared with 600,000 six years ago (again, the data is from the same article). This means “book discovery” has become the number one problem. How can your book stand out in such a vast crowd?

The somewhat depressing reality is, as discussed in Pencil Pushers and the Miracle of Capitalism, the creative arts, which include writing, don’t pay well relative to the rarity of talent and IQ required to succeed at them. Same for much of entertainment, such as acting and singing. The top athletes, musicians, and actors ‘only’ make about $100 million (yeah, pity them lol), but the top businessmen make hundreds of millions or even billions. Only one musician has ever made a billion dollars, Sir Paul McCartney, I think.

But back to writing: a ‘success’ at publishing, whether traditional or self-published, is making $30,000 a year pre-tax. A six-figure contract is considered a ‘big success’. For example, if you sell 3,000 books a year on Amazon at $15 each and keep $10 per sale, you’ll earn $30,000/year, which puts you in the top 1% of all authors (traditional, indie, and self-published).
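The arithmetic behind that benchmark is worth making explicit; a quick sketch using the figures above (3,000 copies at $15 list, keeping $10 per sale, i.e. a two-thirds royalty share):

```python
def annual_royalties(copies_sold, price, royalty_share):
    """Gross annual royalty income: copies sold x list price x author's share."""
    return copies_sold * price * royalty_share

# 3,000 copies/year at $15 each, author keeps $10 per sale (2/3 share)
income = annual_royalties(3_000, 15.0, 10.0 / 15.0)
print(f"${income:,.0f} per year")  # $30,000
```

Run the same function with the median self-published outcome (zero copies) and the result is, of course, zero, which is the point of the preceding statistics.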

According to a research report, by my guesstimate, only 100 Kindle publishers make over $50k a year:

Unfortunately, despite an exhaustive Google search I was unable to determine the total number of Kindle publishers, so these numbers are hard to statistically quantify.

Indie publishing has similar barriers to entry and quality control as the big-5, but sales volume is predictably lower. Indie publishing caters to a near-infinite number of genres, from math books and youth fiction to history.

But such success likely also requires top-1% IQ and talent. According to the US Census Bureau, the median annual personal income for all workers over age 15 is $30,240, with a mean personal income of $44,510–incomes which only require top-50% talent and IQ to attain, versus top-1% talent. You have to have been the smartest in your class (straight A’s on all the writing assignments and a near-perfect verbal score on the SAT) to have a shot at making $30-50k a year with fiction writing. If not IQ, then top-1% work ethic and determination. Or top-1% social networking. No matter how you look at it, in the arts, you need to be exceptional to make an unexceptional income. Selling books is hard. Like painting, you can’t do it for the money, but rather out of a love for the process, and any windfalls should be treated as surprises. Milo’s $250,000 advance is such a big deal because, for authors, that’s a large amount of money, even though it’s the same annual salary as a mid-level nobody at a medium-sized firm. Milo needs top-.001% recognition (a Google search for his name returns over 4.3 million results, making him one of the most visible people alive) to make a salary obtainable by 2% of the general population.

Some say you have to build a brand to self-publish, but what does that entail? Writing a blog post or a book at least follows a typical formula: plot, characters, theme, protagonists, antagonists, and a beginning, a middle, and an end. But how does one even begin to self-promote? The advice is vague, and its brevity belies the great difficulty of applying it. Going the traditionally published route seems easier, because then you can focus most of your energy on writing the best possible book you can, rather than on promotion and building a brand, which is the job of the publisher. The major problem is that there are too many self-published books, and major influencers are inundated with paid review requests, meaning that the typical book will get very little exposure and the turnaround time will be very long. The odds that anyone, myself included, can build a personal brand big enough to sell at least 100 books are close to nil, even with hard work.

But overall, compared to self-publishing, traditional publishing offers a bigger average and median payout (both in terms of the advance and, if the book sells enough copies, the royalties), but there is also more time and work involved. If you get traditionally published (which includes smaller indie firms), even if the book doesn’t sell more than a few thousand copies, it opens up a lot of opportunities and recognition that self-publishing typically does not provide, making the slog worthwhile. You’ll start getting a trickle of ‘Good Reads’ reviews…people will start ‘Googling’ your name, and after maybe three or so books, you’ll develop a decent-sized following, which increases the odds of subsequent books being a success, as well as larger advances. Self-publishing: less time and work (although still a lot), smaller average payout (most books sell fewer than a dozen copies), and less recognition.

Related: The Stark Realities of Self-Publishing

Black Lives Commit More Crime

From Marginal Revolution: Black Lives Matter

As many have already noted in the comments, Tyler ignores how blacks also commit disproportionately more crimes than whites and are more likely to use deadly force against officers.

Related: There is No Systemic ‘War on Blacks’ by Police

From the comments:

Do White Police Officers Unfairly Target Black Suspects?

2015 statistics on murders and police killings in USA: Police killings: 986. Of these, 495 of those killed were white (50%), 258 were black (26%), 172 were Hispanic (17%). Murders: 15,696 total, of which 52% of victims were black, and in these cases over 90% of the perpetrators were black.

“Do White Police Officers Unfairly Target Black Suspects?” Abstract: “Using a unique data set we link the race of police officers who kill suspects with the race of those who are killed across the United States. We have data on a total of 2,699 fatal police killings for the years 2013 to 2015. This is 1,333 more killings by police than is provided by the FBI data on justifiable police homicides. When either the violent crime rate or the demographics of a city are accounted for, we find that white police officers are not significantly more likely to kill a black suspect. For the estimates where we know the race of the officer who killed the suspect, the ratio of the rate that blacks are killed by black versus white officers is large — ranging from 3 to 5 times larger. However, because the media may under report the officer’s race when black officers are involved, other results that account for the fact that a disproportionate number of the unknown race officers may be more reliable. They indicate no statistically significant difference between killings of black suspects by black and white officers. Our panel data analysis that looks at killings at the police department level confirms this. These findings are inconsistent with taste-based racial discrimination against blacks by white police officers. Our estimates examining the killings of white and Hispanic suspects found no differences with respect to the races of police officers. If the police are engaged in discrimination, such discriminatory behavior should also be more difficult when body or other cameras are recording their actions. We find no evidence that body cameras affect either the number of police killings or the racial composition of those killings.”

And also:

There were 6,095 black homicide deaths in 2014—the most recent year for which such data are available—compared with 5,397 homicide deaths for whites and Hispanics combined. Almost all of those black homicide victims had black killers. Police officers—of all races—are also disproportionately endangered by black assailants. Over the past decade, according to FBI data, 40% of cop killers have been black. Officers are killed by blacks at a rate 2.5 times higher than the rate at which blacks are killed by police. Some may find evidence of police bias in the fact that blacks make up 26% of the police-shooting victims, compared with their 13% representation in the national population. But as residents of poor black neighborhoods know too well, violent crimes are disproportionately committed by blacks. According to the Bureau of Justice Statistics, blacks were charged with 62% of all robberies, 57% of murders and 45% of assaults in the 75 largest U.S. counties in 2009, though they made up roughly 15% of the population there.

The Sweet, Boring Middle

I don’t read Marginal Revolution much anymore; I find it kind of boring (too much economics minutiae and trivia), but that reflects a deficiency of my own attention span and intelligence to appreciate it, not an inability of Tyler Cowen to be interesting. And evidently, his website is very interesting judging by the immense amount of traffic it gets, so my opinion is obviously an outlier. And again and again, as I discussed a month ago regarding Scott (both Scott Adams and Scott Alexander), the greatest growth is in the ‘intellectual middle’. The middle is the ‘sweet spot’: it courts both sides without opinions so ‘extreme’ as to dissuade too many people from reading or sharing. For those on the ‘extreme’, how many would compromise some principle for a lot of traffic? I imagine many would, as that is the economically rational thing to do. Contrary to Daniel Kahneman and Michael Lewis, who insist everyone is irrational, most people become rational decision makers when faced with easily quantifiable choices, like choosing more money (or clicks, social status, etc.) versus less.

As part of the post-2008 ‘Cambrian explosion’ of intellectualism, Tyler’s blog is much more popular now (along with other ‘smart’ sites such as Slate Star Codex, Bryan Caplan’s Econ Log, Ribbonfarm, Wait But Why, Scott Adams’ Dilbert Blog, and many, many more) than as recently as a few years ago, in agreement with this trend. But it gets even better–by pulling out an ‘intellectual passport’, one who is in the ‘intellectual middle’ need not ideologically conform or compromise to be accepted, but rather can be granted entry into various ‘extreme’ but high-IQ groups (such as most of the alt-right) by virtue of being smart and authentic. That goes against much of social theory convention, which says one must conform to be accepted. By being authentic, even if such views are counter-narrative, you gain more respect. Dissembling one’s motives and perceived pandering, even if one is ideologically close, has a repulsive effect, which explains why the alt-right has so strongly repudiated a handful of individuals who, despite otherwise agreeing with the alt-right on many things (such as being pro-Trump, pro-immigration control, anti-SJW, etc.), were perceived as only being involved ‘for the money’ and appropriating the alt-right ‘label’ for personal gain.

Regarding Tyler Cowen and Bryan Caplan: both unapologetically support open borders but, oddly enough, are highly respected by various alt-right and reactionary groups, their articles frequently cited. This has to do with intellectualism (both Tyler and Bryan are very smart) and authenticity (neither of them compromises or panders, and they don’t need to given how popular their sites are, and such steadfastness is respected), as discussed above–but various shared narratives also play a role, specifically a shared dislike of majoritarian systems, as described in Intellectual Solvent, Part 3:

Both smart left-wing and smart right-wing bloggers can relate to being ill-served by majoritarian school systems that neglect the talented in favor of bringing the slowpokes and troublemakers up to speed. The same also applies to work settings, where the talents of smart employees are underutilized and/or unappreciated, and this frustration crosses political lines. Both sides agree that incompetent people seem to be ‘running the show’, not the best and brightest, albeit in pursuit of opposite desired goals.

There comes a point where you’re so competent that pandering and compromising are unnecessary, and, echoing Heidegger regarding authenticity, I think that’s what everyone should aspire to. George Carlin never had to compromise his angry, nihilist message to be accepted: audiences of all political stripes found him funny because, ultimately, he was a good comedian. When you read Paul Krugman or even Ann Coulter, there is a sense of desperation: “I need to be accepted…I need to be edgy, funny, and partisan”, and that histrionic, excitable style of internet writing doesn’t work anymore [1]…better to be competent, even if that means being slightly ‘boring’, like Tyler Cowen.

[1] It works if you already have a huge audience built in the ’90s and early 2000s, as in the case of Ann Coulter and Paul Krugman, but not in the post-2013 era.

Intellectuals choose correctness over consensus

Related to Identity, IQ, and Incoherence of the Alt-Right

Intellectuals care more about correctness (or what they perceive as correct) than consensus; for collectivist and identity-driven movements, it’s the reverse. For example, Francis Fukuyama, considered one of the intellectual ‘founders’ of neoconservatism, went from co-signing William Kristol’s September 20, 2001 letter urging President George W. Bush to “capture or kill Osama bin Laden” and to embark upon “a determined effort to remove Saddam Hussein from power in Iraq”, to, just two years later, demanding Rumsfeld’s resignation. (Fukuyama had signed a similar letter, directed at Bill Clinton, in 1998.) Although the subsequent deterioration and financial cost of the occupation vindicated Fukuyama, such an abrupt about-face after only two years reeked of perfidy and betrayal. Not surprisingly, Fukuyama and Kristol are no longer close friends.

Wealth, Intellectualism, and Individualism, Part 7

Part 6

Nerd mannerisms and appropriations–especially in pop culture and on Instagram, where pretty women donning faux glasses post memes about social isolation–have become the ‘new normal’, and words like ‘normie’ have become pejorative.

Nowadays everyone wants to be the ‘smartest person in the room’, not the most outgoing or popular. But ironically, in being smart, you become popular, whether you seek the attention or not.

Autistic-like traits such as social awkwardness, dismissiveness, curtness and bluntness (as opposed to sugarcoating, sentimentalism, and extroversion) convey authenticity and credibility, versus being a shallow ‘normie’ or ‘people-pleaser’, leading to a boost in social status both online and offline, whereas decades ago these smart people were ignored or relegated to the lower echelons of the social hierarchy.

Fast-forward to today: from Silicon Valley to Wall St., to having the most subscribers and followers on Instagram, Twitter, Vine, and YouTube; in terms of higher wages (for STEM jobs), surging real estate (in Silicon Valley), stratospheric Web 2.0 valuations, and a perpetually rising stock market; as well as approbation and cultural appropriation–it’s not a stretch to say that nerds, or more specifically introverts, rule the world right now.

Due to STEM, his popular blog, and being really smart, Scott Aaronson has far more status than the vast majority of ‘normies’ (except for, perhaps, some athletes and actors). Same for Tyler Cowen, an economist (close enough to STEM), whose Marginal Revolution blog is extremely popular, read by thousands of people every day. Yeah, Marginal Revolution is not as big as TMZ or ESPN, but 1,000-10,000 dedicated readers/fans is about 1,000-10,000 more than the typical ‘normie’ has, after excluding immediate friends and family. Those are just a handful of examples out of many; more will be given later.

From Virtue Signaling and Status:

We all want to be perceived as smarter because smart people are among the most successful in society today as measured by wealth, wages, and social status. While famous athletes and other entertainers make a lot of money, no one seeks their counsel on anything substantive, whereas if you’re smart you are elevated to the status of an ‘oracle’, and your opinions on a wide-range of issues – be it global warming, economics, sociology, or history – are valued and sought.

Intellectuals, particularly in the most difficult of fields, have become America’s new priesthood or nobility, sought for answers and bestowed with high social status, and whether it’s the latest gizmo from Google, Amazon, or Tesla, or the latest particle discovery in the field of high-energy physics, their contributions are broadcast by the media to the world. From The Daily View [...]

Smart people are among the most important and respected people in the world. They have the most Karma on Reddit, the most points on sites like Stack Exchange, the highest reputation on forums, and most views on YouTube for technical, artsy, or philosophical subjects. They have the credentials – SAT scores and degrees – to lend their expertise in a variety of fields and are showered with accolades …

Smart people are displacing ‘old money’ on the Forbes 400 list, getting their Web 2.0 companies valued or acquired for billions of dollars, watching their stocks and real estate zoom into the stratosphere – even as real wages for most people haven’t budged. A meritocracy epitomized by the Bay Area tech scene or the financial cognoscenti of Manhattan, where erudition, wealth, and the specter of all-knowing omnipotence are valued.

And from the Economist, Be nice to nerds:

“Be nice to nerds. Chances are you may end up working for them,” wrote Charles Sykes, author of the book “50 Rules Kids Won’t Learn in School”, first published in 2007. Today there are more reasons than ever to treat nerds with respect: never mind the fact that every company is clamouring to hire them, geeks are starting to shape markets for new products and services.

Behaviors that may seem repulsive and anti-social, paradoxically, draw people in as ‘nerds’ are sought for their expertise and sober objectivity in contrast to the mainstream media, which is full of hoaxes, sensationalism, inaccuracies, omissions, and biases. From Deconstructing a Viral Article:

As I show in the example of Warren Buffett, intellectualism, competence, and merit are what draw people in, not being extroverted. Every year, thousands of people flock to Omaha for Buffett’s annual shareholder meetings – not because Buffett is a people-pleaser, but because he is very competent and his insights are invaluable. Elon Musk, another example of someone who is extremely competent, had the most popular Reddit AMA ever. Richard Dawkins, who lately seems to have gotten into the habit of offending the easily offended, also had an enormously popular AMA.

They (nerds, quants, wonks, experts) are providing the answers to life’s most intractable mysteries, from theories of the origin of the universe, to theories of biology, economics, and sociology – to try to explain why wealth inequality is so persistent or why some groups always underperform academically and economically despite billions of dollars of entitlement spending over many decades. Sugar-coated, politically correct explanations and ‘nice’ discourse have fallen short at explaining the world, and people demand answers, even if such answers aren’t wrapped in a pretty bow of political correctness.

A lengthy 1994 New Yorker profile of Bill Gates aptly applies to many smart millennials today, who disregard obsolete social conventions and niceties for bluntness and disheveledness, in their ‘pursuit of the truth’:

“Bill just doesn’t think about clothes. And his hygiene is not good. And his glasses—how can he see out of them? But Bill’s attitude is: I’m in this pure mind state, and clothes and hygiene are last on the list.”


Gates is famously confrontational. If he strongly disagrees with what you’re saying, he is in the habit of blurting out, “That’s the stupidest fucking thing I’ve ever heard!” People tell stories of Gates spraying saliva into the face of some hapless employee as he yells, “This stuff isn’t hard! I could do this stuff in a weekend!”

Back in 1994–a less intelligent era, dominated by shows like Friends, Baywatch, and 90210–social conventions were more important than they are now, making Gates’ behavior truly anomalous; now it’s commonplace, almost expected, and (as mentioned earlier) conveys authenticity and honesty. In the ’90s the clubs were bustling, but now everyone wants to stay home, quietly watching Netflix, being introspective, or posting pictures on Instagram. Nightclub attendance has plunged.

Or, as summed up by the brilliant Eric Weisstein, creator of the online mathematics encyclopedia MathWorld:

Right now I think we’re in something of a ‘competence bubble’ of sorts, where competence is valued more than ever as measured by social prestige, wealth, and wages, with ‘social skills’ and ‘people skills’ being less important. This is also related to our post-2008, results-oriented economy, whereby quantifiable results have become more important than agreeability, as part of the push by corporations toward greater productivity and efficiency. Smart people, because they tend to be more competent, are especially suited for America’s competitive economic and social environment, which prizes quantifiable, individual results over ‘collectivist’ traits like social skills.

To be continued…

Freedom vs. Liberty

A common misconception is that ‘freedom’ must arise from ‘liberty’, or that the two are interchangeable. Part of the problem is the false dichotomy that the absence of liberty implies the existence of oppression (liberty follows from liberation), and that the former must actively resist the latter. This leads to an endless struggle of liberty versus oppression that never ceases, because one can keep devising new forms of oppression for liberty to ‘fight’, the result being the inexorable march to the progressive ‘singularity’.

Advocating a compatibilist view, Thomas Hobbes and David Hume postulated that free will is simply freedom from external coercion. Coercion is often justified by do-gooders under the guise of ‘maximizing liberty’ and fighting imagined oppression, ironically leading to less of the very freedom and liberty they seek. In a libertarian sense, to be ‘free’ is to be able to act on your own volition, provided such actions don’t impede another’s autonomy or private property. Classical liberalism is correct about some things (choosing ‘equal opportunity’ over ‘equal outcomes’, and supporting private property) but is wrong to try to impose its values, which are often universalist, on society, business, and government.

Reactionaries argue that democracy, as well as constructs such as ‘liberty’, the virtues of which are often touted by mainstream politicians, are the problem, not the solution. For the aforementioned reasons, reactionaries are more sympathetic to freedom than liberty, which could explain why many on the alt-right were (or maybe a handful still are) libertarians or an-caps. But private property and autonomy alone, without a national ‘cohesiveness’, are insufficient. The pursuit and idealization of such ‘universal’ ideals is an invitation to heterogeneity, because such abstractions are not specific enough and can be applied to anyone. Biology and possibly ethnicity, however, are specific–the opposite of universal.

The Post-2008 ‘Philosophy Boom’

This article is going viral: Why read old philosophy?

Since 2008, we’ve been in what can be described as a ‘philosophy boom’, as articles and stories about philosophy frequently go viral on sites such as Reddit, 4chan, and Hacker News, and there seems to be a lot of interest in the subject on Quora and elsewhere. The resurgence of philosophy can be explained by several factors:

Philosophy, especially in recent years, is finding a home in many theoretical STEM applications (such as computer science, set theory and logic, quantum physics, etc.), and the two are becoming increasingly intertwined. Philosophers seek to emulate physicists, and physicists seek to better understand philosophy. The former is related to the so-called ‘physics envy’ in economics, though the envy seems to run in reverse for STEM subjects, too.

To wit, when the insufferable pedant Neil deGrasse Tyson proclaimed philosophy ‘useless’, he was instantly met with strong rebuke by other physicists–including the brilliant Sean M. Carroll, who is much smarter and more accomplished than Tyson can ever hope to be–in defense of philosophy.

Additionally, both physics and philosophy involve abstractions, are subtle, and tend to be very specific and precise in terminology.

The study of philosophy is analogous to understanding the ‘source code’ behind declarative statements, giving a deeper understanding than is otherwise revealed prima facie, or (in the case of source code), rendered on a computer display. A low-information political pundit may extol the ‘goodness’ of ‘freedom and liberty’ as it applies to common situations such as politics. Philosophy, however, goes deeper by inquiring what it means for something to be ‘good’, what the concept ‘freedom’ means, and whether the two are always mutually inclusive. Whereas punditry is concerned with the present, philosophy seeks to understand antecedents and origins, building on the body of prior philosophical work. This is analogous to source code, which is the antecedent of the output, and newer programming languages are derived from or inspired by older ones. Similarly, in mathematics, applied mathematics manipulates existing concepts to get outputted results (the answer); abstract and pure mathematics take it a step further by trying to determine the conditions under which an answer is or isn’t possible. This is probably why so many people in computer science, physics, and mathematics are enamored with philosophy, and the other way around.

Second, the study of philosophy, although it may not have as many direct real-world applications as engineering, biology, or computer science, is still valuable for signaling intellect. Philosophy majors have SAT scores as high as those of STEM majors. Philosophy is useful to study because it helps us organize our thought processes and reasoning, with a rigor that one wouldn’t otherwise hold oneself to, as the source-code analogy above shows. This is probably why philosophy majors are sought for employment: the degree signals above-average critical-thinking and analytical skills.

For example, the FiveThirtyEight article Philosophers Don’t Get Much Respect, But Their Earnings Don’t Suck includes an infographic showing that philosophy majors not only make good wages (as high as most STEM majors’) but also score highly on the GRE and LSAT, both of which are good proxies for IQ. Because philosophy majors are smart, they can readily grasp non-philosophy concepts, which is valuable for employers, who benefit from employees who learn quickly and can anticipate needs.

And philosophy actually is respected, at least online, based on my own observations. As mentioned above, not only do philosophy articles frequently go viral, getting lots of up-votes and positive comments, but there is a lot of discussion of philosophy online, and people seek out philosophers for their insight and wisdom. Moreover, the monastic pursuit of knowledge, deep ‘truths’, and abstractions–in a culture of instant gratification, reductionist narratives, superficiality, and 24-7 entertainment disguised as information (infotainment)–is commendable and meritorious. The sacrifice of immediate wealth and ‘payoff’ (having a low time preference) pays dividends long into the future, while others who sought ‘immediate employment’ find themselves, many decades later, in a personal rut, unable to advance beyond the 9-5 grind of being an invisible, unimportant person. Peter Thiel, possibly one of the smartest and most successful people alive as measured by net worth and accomplishments, majored in philosophy:

After graduating from San Mateo High School, Thiel went on to study philosophy at Stanford University. During Thiel’s time at Stanford, debates on identity politics and political correctness were ongoing at the university and a “Western Culture” program, which was criticized by The Rainbow Agenda because of a perceived over-representation of the achievements made by European men, was replaced with a “Culture, Ideas and Values” course, which instead pushed diversity and multiculturalism. This replacement provoked controversy on the campus, and led to Thiel founding The Stanford Review, a paper for conservative and libertarian viewpoints, in 1987, through the funding of Irving Kristol.[19]

That was decades ago, and now he’s a billionaire. Of course, his wild success cannot be generalized to everyone, but his story is an example of how delayed gratification can lead to massive payoffs later, as opposed to skipping college to seek immediate employment and gratification.

Regarding how philosophy is respected, from an earlier post SJW Narrative Collapse, Part Infinity:

The fact that the story went so viral, making it to the front page of Reddit, but also the intense, impassioned discussion in the comments, is further evidence of how finance is so important to millennials, who would rather debate regulation and high frequency trading than waste time on mind-numbing, disposable pop culture entertainment. This is more evidence of how intellectualism has become so important, contrary to pronouncements of how America is ‘dumbing down’. There is a huge demand for intellectualism that the internet and communities like Reddit, Hacker News and 4Chan are satisfying.

And from the post Millennials and Misconceptions, in which I give an example from Reddit of how stories and comments that praise education and the attainment of knowledge are up-voted, whereas posts that advocate a more parochial, narrow-minded appeal to ‘instant gratification’ are down-voted:

A STEM degree is preferable, but that doesn’t make the liberal arts useless in the eyes of millennials, provided the degree has some intellectual rigor and is not completely useless or commercialized (like degrees in ‘child development’ or ‘search engine marketing’).

A Google search reveals many more examples on Reddit of philosophy majors being respected, so the belief that philosophy majors are unappreciated or ignored is thoroughly debunked. This may have been the case as recently as a decade ago, but online, especially on Quora and Reddit, there is a huge outpouring of interest in philosophy, as millennials see the value of it, along with other intellectualized subjects such as physics, math, and computer science. This also ties in with the post-2008 ‘explosion‘ of ‘intellectualism culture‘.

But also, why is there so much interest in learning complicated, esoteric math concepts? All things ‘smart’ have gotten more attention as of late, such as theoretical physics, quantum mechanics, philosophy, and math (as well as all these things melded together)…It’s like the AP-math class of high school but expanded to include almost everyone, not just a dozen students.

But does this contradict pragmatism? No, because pragmatism is intellectual in nature. Pragmatism, like utilitarianism, seeks to maximize resources and outcomes based on the preponderance of empirical evidence, weighed against all the alternatives under consideration–which can include delayed gratification and the purist pursuit of intellectual endeavors, if over the long run one derives desirable, quantifiable results, such as wealth or status, from such deferment. Pragmatism is contrasted with deontological ethics, which is rule-based rather than outcome-based, but the two can be easily reconciled, as is often done online, by stipulating that one’s ‘rule’ is to always choose what leads to the most optimal long-run outcome.