Despite indoctrination, a college degree may still be the best path out of poverty

From Washington Post: Poor kids who do everything right don’t do better than rich kids who do everything wrong

I had to double-check because the chart seems to contradict the author’s thesis that the American meritocracy is dead and that upward mobility is impossible.

It doesn’t look so bad when you consider that 67% of poor college grads are at or above the 50th percentile in wealth, compared to 49% of rich high school dropouts. I’m sure it’s even better when you compare poor high school dropouts against poor college graduates, which is why a college degree may still be worth the money and the best pathway out of poverty, especially if you major in STEM.

This is why, despite being on the ‘right’, and despite how colleges have become like West Point except that instead of producing lieutenants they are producing SJWs, I’m not so quick to join the anti-college bandwagon: the evidence still suggests that a college degree is worthwhile, especially in STEM. Yes, there is a lot of student loan debt, but there is also a ton of financial aid for students of all socioeconomic levels. There is almost no excuse for someone of reasonably high IQ not to take advantage of these generous financial aid programs and major in STEM.

Some of the most common arguments against college are as follows:

‘I have a degree and all I can find are crappy jobs. Therefore, college is useless.’ This is a legitimate grievance, and I have empathy for millennials who have degrees and are unable to find decent jobs, but this is not necessarily proof that college is worthless. For every story of indebtedness and bad job prospects, there are other stories, especially on Reddit, of 20- and 30-something graduates in fields like accounting or finance who have solid six-figure jobs, a home, and are paying off their student loan debt, too. In the case of grads who have bad jobs, consider that getting the degree may have been necessary to get the job in the first place, and that despite the low pay, a bad job is better than no job.

‘Look how rich and successful I became (in a field outside of my college degree); the degree is useless, because I became successful in a field that is not applicable to the degree.’ You see this a lot – college grads who major in finance or computer science and become rich and successful in fields outside of computer science or finance, making it seem like the degree was not necessary. But when you look closer, these people often leveraged their degree early in life and, after amassing financial and social capital (thanks to the early job opportunities and connections afforded by the degree), were later able to parlay these resources into an unrelated endeavor.

‘I became really rich and successful without a degree, or after dropping out.’ Examples include Steve Jobs and Bill Gates, although to their credit neither boasted like this; both were more humble about it. There’s a major survivorship bias here: 80-95% of small businesses fail within a decade, and failures never get as much media attention as successes, giving the false impression that most small businesses succeed. Then you have post-2008 economic trends that favor big, successful companies, which can leverage cheap credit, economies of scale, and network effects to keep growing and crowding out smaller businesses. As I explain in Pencil Pushers, success in entrepreneurship requires top-5-percent talent, whereas most day jobs require maybe only top-50-percent talent, albeit for less money. Bill Gates, Michael Dell, Mark Zuckerberg, and Steve Jobs were able to leverage their superior IQs (as well as connections, family wealth, luck and timing, and other factors) to succeed wildly without a college diploma, which is not applicable to the vast majority of college dropouts who try to follow their lead, and fail.

Look at all the failed efforts since 2011 or so to create a viable competitor to Facebook (remember Ello, which I correctly predicted would fail), Instagram, Snapchat, or Twitter, just like tons of money was wasted trying to create a competitor to Google (Bing anyone? There’s a joke that the only reason Bing has market share is because everyone who buys a PC must first use Bing to install Chrome) or the iPod and iPhone (Zune anyone?). The money could have been better spent on Facebook stock (which has surged from $30 to $132 in just four years, and keeps going up with no end in sight), Google stock (up 1100% since 2005), Amazon stock (up 150% since 2014), the S&P 500 (which has nearly doubled since 2011), or on Bay Area real estate (which has also doubled since 2011) than on starting an actual company. That’s how easy wealth is created…by piggybacking on existing successes, not by trying to create one from the ground up. Sometimes the path of least resistance is the best one.
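As a sanity check on the returns cited, here is a quick sketch converting them to annualized growth rates (the prices and timeframes are the post’s own approximate figures, not verified quotes):

```python
# Annualized return (CAGR) implied by the approximate price moves cited above.
def cagr(start_price, end_price, years):
    """Compound annual growth rate between two prices."""
    return (end_price / start_price) ** (1 / years) - 1

print(f"Facebook, $30 -> $132 over 4 years: {cagr(30, 132, 4):.1%}/yr")
print(f"Google, up 1100% (12x) over 11 years: {cagr(1, 12, 11):.1%}/yr")
print(f"S&P 500, roughly doubled over 5 years: {cagr(1, 2, 5):.1%}/yr")
```

Even the index, the most boring of the bunch, annualizes to roughly 15% a year over that stretch, which is the point: piggybacking beat most startup attempts.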

Contrary to the $200,000 figure cited by James Altucher and others, the average debt per graduate is only around $25,000 – or about the cost of a new car. But unlike a new car, which loses 30-50% of its value after the first year, a degree creates wealth in terms of higher lifetime earnings. And because wages for non-graduates have lagged the CPI while college graduates have seen the most wage growth since the 2009 recovery, a degree is also a good hedge against inflation and wage deflation.
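To put the $25,000 figure in perspective, here is a rough payback sketch. The $10,000-per-year earnings premium and the 5% loan rate are illustrative assumptions of mine, not figures from the post or any source:

```python
# Rough payback sketch for the $25,000 average-debt figure above.
# The earnings premium and loan rate are illustrative assumptions.
debt = 25_000      # average student debt at graduation
rate = 0.05        # assumed loan interest rate
premium = 10_000   # assumed annual earnings premium of a degree

years = 0
balance = debt
while balance > 0:
    # accrue a year of interest, then apply the year's premium to the debt
    balance = balance * (1 + rate) - premium
    years += 1
print(f"Debt cleared in ~{years} years")

# A new car of the same price, by contrast, just loses value:
car = 25_000
print(f"Car value after year one: ${car * 0.5:,.0f}-${car * 0.7:,.0f}")
```

Under these (assumed) numbers the debt clears in a few years, while the car is worth $12,500-$17,500 after year one, which is the asymmetry the paragraph is pointing at.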

It goes without saying that the higher education system is broken, that too many students are taking on debt to major in low-ROI subjects, and that there is a lot of indoctrination, but as bad as it is, a college degree may still be the best shot for reasonably intelligent people to enter the middle class.

Why Post-Election Revolt and Crisis is Unlikely, Part 2

In an earlier post, I argued that the outcome of the 2016 US presidential election would likely not lead to national upheaval, but I want to expand on it.

As the Great Depression showed, economic crisis does not necessarily lead to revolt. But historically speaking, revolt generally occurs during periods of extreme economic disruption and diminished well-being, typically as a consequence of a war that indebts the nation. Second, it’s typically the wealthy, well-connected, and educated who foment revolution, not the poorest, which is somewhat counterintuitive. But look at Donald Trump, who himself epitomizes the ‘elite’, and who rose to power by lending an ear to the concerns of millions of Americans when the other candidates seemed deaf. Che Guevara, Engels (who funded Marx), and Bin Laden all had wealthy upbringings.

Revolution may also occur when the financial interests of elites are threatened, or when elites are able to harness the frustrations of the proletariat to force a regime change. For example, there is evidence the American Revolution was the work of plantation elites:

According to von Borch, it was a colonial aristocratic elite espousing republican principles that articulated the revolt against England:

“Here we have what is, perhaps, the most deep-seated paradox in the emergence of America. The ‘Virginia dynasty’ of the first presidents of the independent federal State—Washington, Jefferson, Madison and Monroe—came from precisely this planter aristocracy. Within that aristocracy there developed the powers and the ideas which made the colonies independent of England and gave them a free, if conservative, domestic regime. The revolution against England was planned on the dignified estates on the banks of the Virginia streams….

Consider the Russian Revolution, led by Lenin and the Bolsheviks, which overthrew the Tsarist autocracy (specifically Tsar Nicholas II, who was executed along with his family as a consequence of the revolution). Extreme poverty, worker strikes, and joblessness following the First World War, which had significantly weakened the Russian Empire, were aggravating factors:

The war also developed a weariness in the city, owing to a lack of food in response to the disruption of agriculture. Food scarcity had become a considerable problem in Russia, but the cause of this did not lie in any failure of the harvests, which had not been significantly altered during wartime. The indirect reason was that the government, in order to finance the war, had been printing millions of ruble notes, and by 1917 inflation had made prices increase up to four times what they had been in 1914. The peasantry were consequently faced with the higher cost of purchases, but made no corresponding gain in the sale of their own produce, since this was largely taken by the middlemen on whom they depended.

But Lenin himself was an elite, born to a wealthy family and holding a law degree. Generally, history shows revolutions are led by elites who put their comfortable lifestyles on hold to advance a cause. Disaffected people will not mobilize without someone in charge behind the scenes giving orders. This makes a second revolution in America less likely, as someone has to first rise to the occasion to get the ball rolling and bankroll the actual revolution. Consider, for example, George Soros and the other elites who funded BLM.

Regarding war, debt, poverty, and revolution, other historical examples include the French Revolution, in which a major financial crisis and debt, due to France’s costly involvement in the Seven Years’ War and the American Revolution, played a role. The same goes for the rise of Hitler, precipitated by Germany’s debt and weakened economy after losing WW1.

However, although revolutions require a leader and considerable financial and organizational support, ‘lone wolf’ insurgencies don’t, and it’s possible there will be an upsurge in domestic terrorism should national sentiment decline substantially. As the 2002 Beltway sniper attacks showed, which for nearly a month paralyzed much of the Maryland and D.C. region with fear, the economic and psychological impacts of terrorism are significant relative to the number of people and costs involved, which is why it’s effective and why governments expend so many resources trying to thwart it. The same goes for the 1982 Chicago Tylenol poisonings, which attracted global media coverage and mass hysteria despite only seven deaths.

But, economically speaking, America is a long way from becoming like post-WW1 Russia. Around that time, 60-75% of the Russian population were illiterate peasants, so the conditions for revolution were a lot riper then than now. Although US involvement in the Middle East has been costly, it’s nothing compared to the debt and hyperinflation that faced the Weimar Republic.

Social media is like a portal to ‘Middle America’, and it’s not uncommon for middle-class households to have multiple TVs, luxury-brand automobiles, large SUVs, expensive iPhones with equally expensive phone plans, and designer clothes. Lifestyles were much more minimalist in earlier generations, mainly because people simply didn’t have the disposable income and credit to buy stuff. Since the mid-80s, there has been an explosion in consumption as credit and nominal wages have surged, and technologies and free trade have made electronics and other tangibles cheaper and more accessible.

Inflation-adjusted consumer credit gained only slightly from 1965-1985 but began to surge afterwards, and again in the mid-90s.

The result is a higher standard of living – but perhaps at the cost of ‘community’, as explored by Robert Putnam in his influential 1995 essay Bowling Alone. As recently as a generation ago, entire families would gather to watch TV, because often they could afford only a single TV, tuned to one of the handful of shows available at the time. Nowadays, every family member has a personal computer and TV and can choose from thousands of shows on his or her own, thanks to technologies like Netflix.

From Social Matter Escaping Muddied Experience:

American society has been structured so that natural experience is minimized and in its place are mediated experiences that an expert or team of experts have crafted, edited, framed, and even written for the individual. The mixing of true experience versus mediated experience is discussed in detail in Jerry Mander’s book Four Arguments For The Elimination Of Television from the 1970s. This does not stop with television, but has morphed with the growth of the Internet. Television still reaches so many and has such power due to its physical effects.

That’s not to say there aren’t problems. Many millennials are struggling, the post-2008 recovery is uneven and hasn’t produced enough good-paying jobs, and millions of Americans have anxiety over job loss or going broke if there is a medical emergency.

Although America seems (especially if you listen to the media) more divided than ever, such division has always existed; in the ’60s, it was division over segregation vs. integration. Perhaps the media is also playing a role by focusing on the negatives and overlooking the positives, shaping people’s perceptions of the economy and society.

But the problem is, although Americans are far from starving to death, existential anxiety is a more intractable problem (you can’t just throw money at it and make it go away), and solutions are hard to come by. Material wealth and consumption don’t necessarily bring fulfillment or peace of mind. Maybe the answer is cheap and abundant entertainment. Maybe it’s religion. Maybe it’s promoting financial literacy so people will save for a rainy day and retirement (thus creating peace of mind) instead of frittering money away on positional goods, although this may also hurt America’s consumption-based economy.

Liberal Media Trying to Prematurely Declare Hillary Winner

In the days following the release of Trump’s 2005 comments, something unexpected (at least for the left) happened – people suddenly stopped caring, but more importantly, Trump’s polls did not budge.

This happens all the time: the liberal media tries to fan outrage, and initially people are outraged – ‘omg Trump said the p-word! He must step down’ – but then it fades, much to the disappointment of the left, who hoped it would have staying power.

The comments were quickly subsumed by ‘pop culture’, and people found them funny more than offensive. Ironically, the left, through its own doing, has made the public inured to remarks that generations ago would have been more shocking (back when Bruce and Carlin pushed the envelope), but now it’s like ‘whatever’.

And the same goes for the media’s efforts to equate Trump with fascism, which also didn’t stick despite the left’s best efforts.

A few days ago, on the heels of Trump’s ‘lackluster’ third debate performance, the media created a narrative that Trump had resigned himself to losing the election, and framed his refusal to promise to accept the results should he lose as a sign of ‘instability’ or an ‘affront to democracy’ on his part:

GOP braces for Trump loss, roiled by refusal to accept election results

At charity roast, Donald Trump delivered what might as well be a campaign eulogy

Campaign eulogy? A little presumptuous, don’t you think? The left is so desperate for Trump to lose, why bother with the actual…um…election and counting of the votes? Screw that. Let’s just say Trump lost.

The left has to invent reasons for Trump ‘falling behind’, as if these reasons are revelatory or important when they’re old news.

Now Trump is coming back, just a day later:

Trump gains on Clinton, poll shows ‘rigged’ message resonates

Trump knew what he was doing all along…he knows that many Americans share his suspicions of the integrity of the voting process. Even Gore, a favorite of the left, contested the results of Florida in 2000.

The reality is, the people who are ‘appalled’ by Trump’s demeanor or comments about women were never going to support him. That’s why these ‘horse race’ polls are meaningless. 95% of the country is decided, as is the case in every presidential election at this point. It boils down to the 5-10% of undecided voters – those in swing states – who matter. Despite 24-7 media coverage, the polls have stayed within a 10-point band since July, which is pretty remarkable given all that has happened, and is further evidence that minds tend to be made up long before the voting actually begins:

This is just like Brexit, where for months there was only a 5-10 point difference between ‘exit’ and ‘remain’ all the way until the vote (‘exit’ won by 4 points).

This is why elections and politics are mostly a waste – inordinate amounts of time and resources are spent trying to woo no more than a million or so swing and undecided voters, who hold the ‘fate of the nation’ in their hands. This is somewhat analogous to the 1955 Isaac Asimov short story Franchise, in which a single voter, the “Voter of the Year”, represents the entire electorate.

From Nate Silver Clinton Probably Finished Off Trump Last Night:

That’s not to say that a polling miss is impossible. Our polls-only model still gives Trump a 14 percent chance and our polls-plus forecast a 17 percent chance, although that’s before accounting for any impact of last night’s debate or some of the other circumstances I’ve described.

So a five-point polling lead translates to an 83-86 percent chance of winning. Yes, the electoral map slightly favors Hillary, but assigning an 85 percent chance of Trump losing based on a five-to-seven-point difference in the polls seems absurd.
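For what it’s worth, the leap from a modest polling lead to a lopsided win probability is easy to reproduce with a toy model: assume the final margin equals the polled lead plus a normally distributed polling error. This is a rough sketch with assumed numbers, not FiveThirtyEight’s actual model:

```python
# Toy model: final margin = polled lead + normally distributed polling error.
# The 5-point lead and 5-point error standard deviation are assumptions.
import random

def win_probability(lead, error_sd, trials=100_000):
    """Share of simulated elections in which the polling leader wins."""
    random.seed(0)  # deterministic for reproducibility
    wins = sum(1 for _ in range(trials)
               if lead + random.gauss(0, error_sd) > 0)
    return wins / trials

# A 5-point lead with a 5-point error SD lands in the low-to-mid 80s,
# close to the 82-86% range quoted above.
print(f"{win_probability(5, 5):.0%}")
```

The point is that the ‘absurd’ 85% figure is just what a normal error model spits out for a lead of about one standard deviation; whether a 5-point error SD is the right assumption is the real question.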

Individualism vs. the State

From Social Matter The End Of Atomistic Individualism: A Theory On Who You Are

The purpose of this thought experiment is an attempt to formulate a new, sustainable, non-atomistic understanding of the concept of individualism. Modern individualism, as a product of the Enlightenment, has the function of isolating and alienating individuals from God, society, and eventually even from themselves. From Putnam’s Bowling Alone to the transgender movement, modernity loudly proclaims the inability of people to belong, even to themselves. It instead offers a vision of individualism, in which the person creates themselves in their own image, as if Adam were to form himself in the Garden.

Just as it is vain to think that a lump of clay will form itself into a man, so it is equally vain to think that an alienated, atomized person can create in themselves a personality out of the muck of consumerism and mass media. Modernity tells us that we can form our own personality with tattoos, body modification, consumerist consumption, and status objects like automobiles.

But Putnam is also a strong proponent of democracy. One can argue that atomistic individualism, which is related to libertarianism, is antithetical to democracy and the democratic process. Sometimes, I think we want it both ways: to oppose both individualism and democracy, but this may not be logically consistent. The answer, like many things, seems to lie somewhere in the middle. This could mean a community united by commonalities (such as culture), but without democracy, and with individualism preserved. This is similar to the nation-state concept:

The most obvious impact of the nation state, as compared to its non-national predecessors, is the creation of a uniform national culture, through state policy. The model of the nation state implies that its population constitutes a nation, united by a common descent, a common language and many forms of shared culture. When the implied unity was absent, the nation state often tried to create it. It promoted a uniform national language, through language policy. The creation of national systems of compulsory primary education and a relatively uniform curriculum in secondary schools, was the most effective instrument in the spread of the national languages. The schools also taught the national history, often in a propagandistic and mythologised version, and (especially during conflicts) some nation states still teach this kind of history.[20]

But I have also heard arguments that individualism is inextricably linked with liberalism and that individualism is an ‘Enlightenment’ ideal. But a distinction must be made between Enlightenment ideals, which are the antecedent to the neo and classical varieties of liberalism, and welfare-liberal ideals, which are a more recent development. The former supports individualism, but also the possibility of unequal outcomes that may arise from it. The latter seeks conformity in the form of egalitarianism and equal outcomes (higher taxes, more social spending, wealth spreading, etc.) despite giving the outward appearance of supporting individualism. Marxist and other far-left variants of liberalism also oppose individualism, preferring the state to mandate ‘equal outcomes’ as well as individual subservience to the state.

But both the ‘left’ and the ‘right’ seem to have a love-hate relationship with individualism. The ‘left’ likes individualism as a way to rebel against the status quo, but also opposes it when it leads to too much wealth inequality or what they perceive as oppression (such as the ‘homophobia’ of a baker for not baking a ‘gay cake’); the ‘right’ likes individualism in the context of free markets, personal autonomy, and personal property, but opposes it when it leads to the breakdown of communities, the decline of organized religion, the separation of church and state, and increased ‘moral decay’.

The libertarian or minarchist position, which is somewhere in the middle, may be the most logically consistent in bridging this schism, striking a balance between individualism and cohesion. Minarchism is like a shopping mall, where stores exist as individual entities under the patronage of the mall – a symbiosis of sorts in which both the mall and the businesses benefit. Businesses pay the mall in exchange for the benefits the mall provides (such as security, infrastructure, and customers). Because it’s elective, businesses don’t have to join; America’s tax system, by contrast, isn’t elective, and individuals and businesses have to pay to fund services they don’t want or need.

And from Family and Individualism:

In any society, there is probably an optimal balance between individualism and collectivism. A society that is 100% atomized, by definition, is not a society. But history also shows that total conformity is no better. Those quirky people on the right side of the Bell Curve, with their idiosyncrasies, are needed for society to advance technologically, while everyone else goes about tending to civilization. If you go through Charles Murray’s database of human accomplishments, you’ll find virtually all accomplishments were made by smart people. Liberals value social justice and equality over quantifiable results. The left wants America to be a nation of takers, not creators.

Related: Individualism vs. Thede

Post-election rioting and crisis: It’s not going to happen

The media is entertaining the laughable premise that there may be rioting and revolt en masse should either Trump or Hillary become president.

As I have said again and again, the media, by and large, is useless. It’s a giant time-suck that fools people into believing they are being informed, when it’s really about pushing advertising spliced between hype and sensationalism. Without advertising, the vast majority of the media would have no reason to exist, as it creates no economic value on its own nor has any redeeming value to society. There’s a saying: ‘voting is bad for your soul’. So is the media.

Nowadays, anyone can create a ‘news site’ or write an article for a supposedly reputable site and pass it off as authoritative. Here is one such example: After Trump loses: An ominous American future imagined, in which the author speculates that a Clinton presidency may lead to mass rioting and economic collapse. And there are other articles speculating the opposite – that there will be crisis and revolt should Trump win – which is equally risible. Although I want Trump to win, and America will do better under Trump than under Clinton, the stakes aren’t as high as the media hype would suggest. Every four years, it’s the ‘election of the century’, a ‘new paradigm’, ‘all or nothing’ – premonitions that this election, unlike the 45 or so that preceded it without the world coming to an end, is the one that finally will.

So why won’t there be rioting? Many years ago, some people were concerned that activating the newly built Large Hadron Collider would trigger a chain reaction that would turn the world into a super-dense blob of ‘strange matter’, ending all life in the process. Then some scientists calculated that similar heavy-ion collisions occur naturally, such as on the moon, which after billions of years obviously hasn’t turned into a ball of goo, and thus it was reasonable to assume the collider was safe.

Likewise, much worse conditions have existed in the past, yet nothing has happened beyond sporadic outbursts such as the Watts riots in 1965 and the 1992 Los Angeles riots. If Americans were going to rebel en masse, they would have done so in the late 1970s – a period of high inflation and, overall, a much worse economy – or during the Great Depression, which was worse still. The misery index, generally regarded as a measure of well-being and calculated by adding the seasonally adjusted unemployment rate to the annual inflation rate, is – historically speaking – low:
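The definition above can be computed directly; a minimal sketch, using historical rates that are approximate and for illustration only:

```python
# The misery index as defined above: unemployment rate plus inflation rate.
def misery_index(unemployment_rate, inflation_rate):
    return unemployment_rate + inflation_rate

# Approximate historical rates (%), for illustration only.
periods = {
    "1980 (stagflation era)": (7.1, 13.5),
    "2016": (4.9, 1.3),
}
for label, (u, i) in periods.items():
    print(f"{label}: misery index {misery_index(u, i):.1f}")
```

Roughly 20 in the stagflation era versus single digits now, which is the ‘historically low’ contrast the chart makes.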

But that’s not a complete picture, nor is it a vindication of Obama or Clinton. Rather, thank the ingenuity of free-market capitalism and America’s best and brightest, whose innovations boost living standards and create jobs, as well as America’s reserve-currency status, which keeps borrowing costs and inflation low. Americans may think they have it bad and find many reasons to complain, but other countries have it much worse.

From Brookings Are Americans better off than they were a decade or two ago?

The bottom line: According to this metric, Americans enjoy a high level of economic welfare relative to most other countries, and the level of Americans’ well-being has continued to improve over the past few decades despite the severe disruptions of the last one. However, the rate of improvement has slowed noticeably in recent years, consistent with the growing sense of dissatisfaction evident in polls and politics.

Other factors play a role, and as I discuss in Explaining America’s Economic and Social Stability, America owes its stability to its strong economy, culture of consumption and innovation, large geographic size, and cultural heterogeneity.

Non-aggression principle, and where it fails

From Wikipedia, the non-aggression principle (NAP) is an ethical stance held among right-libertarians that forbids acts of ‘aggression’ against one’s property rights. Somewhat confusingly, the NAP seems to have many definitions and variants:

1961, Ayn Rand: In an essay called “Man’s Rights” in the book The Virtue of Selfishness, she formulated: “The precondition of a civilized society is the barring of physical force from social relationships. … In a civilized society, force may be used only in retaliation and only against those who initiate its use.”[8][9][10]

1963, Murray Rothbard: “No one may threaten or commit violence (‘aggress’) against another man’s person or property. Violence may be employed only against the man who commits such violence; that is, only defensively against the aggressive violence of another. In short, no violence may be employed against a nonaggressor. Here is the fundamental rule from which can be deduced the entire corpus of libertarian theory.” Cited from “War, Peace, and the State” (1963), which appeared in Egalitarianism as a Revolt Against Nature and Other Essays[11]

Natural Rights: Some derive the non-aggression principle deontologically by appealing to rights that are independent of civil or social convention. Such approaches often reference self-ownership, ethical intuitionism, or the right to life. Thinkers in the natural law tradition include Lysander Spooner, Murray Rothbard, and Robert Nozick.

This suggests the NAP is sometimes interchangeable with the ‘golden rule’. As stated by Rothbard, defensive or retaliatory force should be commensurate with the force initiated by the aggressor. Thus it’s entirely within one’s discretion to use lethal force against an intruder or if one’s life is threatened, but not against targets that did not initiate sufficient aggression to justify a potentially lethal response. ‘Aggression’ can also be defined to mean coercion or force against property owners by entities such as the government, an example being the IRS. The latter definition is descriptive, whereas the former is prescriptive.

The NAP as applied to deontological libertarianism seems to have a blind spot: how to deal with individuals who impose a collective or systemic burden on society but are not acting ‘aggressively’. Yes, if someone is threatening your life, lethal retaliatory force is justified, but what about small-time criminals, the mentally ill and unstable, loafers, and other ne’er-do-wells? The NAP, in the prescriptive sense, forbids simply rounding up all the unproductive and executing or exiling them, as that would be disproportionate ‘aggression’ against their right to life (natural rights), even if such individuals, while individually imposing only small externalities on society, collectively exact a significant burden.

For example, consider shoplifting, a very common and usually non-aggressive crime that costs retailers, collectively, billions of dollars a year and imposes a further cost on society as retailers pass the costs of ‘shrink’ on to consumers. Obviously this is a big problem: although most thefts are only around $50-100, repeated many times by thousands of people, it adds up.

So how would a hypothetical libertarian or an-cap society that adheres to the prescriptive NAP handle this problem? I imagine that under the NAP, the shopkeeper would exhort the thief to return the goods, possibly with the threat of physical violence for non-compliance, and the thief would probably comply, knowing that there would be no long-term consequences for his actions and that getting caught is just a ‘cost of doing business’. Upon leaving the store, the thief continues to hit up more stores, occasionally getting caught, but it’s only a small setback, as there are no major repercussions besides the temporary inconvenience of having to relinquish the goods. Stores neither have the infrastructure nor want to bear the cost of warehousing thieves, and the NAP stipulates that violence is not an option for small violations of personal property. Also, some employees would probably have trepidation about using potentially lethal force to deter theft.

At some point, having had enough, shopkeepers will band together, perhaps pooling money to create a facility to deal with these scoundrels in a manner sufficiently humane to comply with the NAP – in other words, a jail. But the problem is that these reprobates, by and large, are unproductive and unwilling to work; prison labor is not an option for most, and even if it were, it would not be enough to cover the costs of housing them. Still, it’s worth the cost if it means fewer thefts and the sequestration of the most undesirable elements from greater society. This effectively acts as a ‘tax’ that individuals and businesses voluntarily enter into, from a utilitarian perspective, for the ‘greater good’ but also for the sake of their own profit margins and safety, with the prison becoming a ‘public good’, not a private one. Even if these prisons were privately run, businesses and individuals would still pay into them, effectively a tax, the only difference being that it would be voluntary.

But then this is not ‘true’ libertarianism or an-cap. There are ways around this, such as keeping ‘public goods’ as small and efficient as possible, creating them only when necessary, and possibly making them elective (meaning one can ‘opt out’ of a public good). But the thing is, most self-professed libertarians and an-caps are only ‘partial libertarians’ (similar to minarchism, as expounded by Nozick, Rand, Rothbard, and Hans Hoppe), and I don’t mean this pejoratively; as the dilemma above suggests, partial libertarianism may be the optimal approach to balancing individual rights with the ‘common good’. Interestingly, while there are ‘true’ neocons, neoliberals, and paleocons, it’s hard to find ‘true libertarians’.

Who Is in Charge?

From Free Northerner Chronic Kinglessness:

This is a perfect example of what Moldbug, referencing Carlyle, referred to as chronic kinglessness.

This is the secret of politics and modern society: nobody is in charge, no one has power, and nobody is running the show: not the people, not the corporations, not the politicians, not the bureaucrats, not the courts, not the military, not the journalists, not the bankers, not the white male patriarchs, not the SJW’s, not the Jews, not Davos, not the Bilderbergs, not the Tri-lateral Commission, not the Illuminati, and not the lizard-people.

Everybody likes to posit that some bogeyman composed of people they dislike is in charge and running, ruining, things behind the scenes because that is comforting. Even if a conspiracy is leading to disaster, at least we’re being led. Even if they are evil incarnate, at least they know what they’re doing and are leading society in a specific direction. It is comforting to know someone is in charge, even if we hate them.

To some extent this is true: no one in western society has divine power (the divine right of Kings); rather, power is often concentrated in a bureaucracy of sorts. At the turn of the millennium we saw the rise of so-called ‘Fabian socialism’, a form of government where power is concentrated not in the proletariat (Trotskyism, Leninism) but rather in bureaucracies (Stalinism), with the likes of Bill Gates (in 2000 the most powerful man in technology, who had recently retired from Microsoft to work on ‘global philanthropy’), Warren Buffett (the most powerful man in business and a close friend of Gates), George Soros (the most powerful man in finance), and Bill Clinton and Tony Blair (at the time the most powerful statesmen alive, and still quite powerful) serving as ‘thought leaders’ and ‘evangelists’ who ‘jet set’ to the World Economic Forum in Davos to discuss how to improve ‘global welfare’ for people they otherwise want nothing to do with – so much so that they have chosen to seclude themselves in one of the most remote but expensive parts of the world to do so. At around the same time, in 1999, the Euro was introduced and the European Union was expanded. It’s a paternalistic state where corporations and bureaucracies have control, with leftist social policies on issues such as immigration, where both commerce and nations have ill-defined borders. It’s similar to classical liberalism, democratic socialism (but really more like social democracy), and neoliberalism, but the opposite of anarcho- and libertarian socialism, which reject concentrated forms of power. This continues to this day, 16 years later, with Obama and Mark Zuckerberg filling the ranks of a ‘global elite’ with socialist tendencies. Even with Brexit, not much has changed. But that’s pretty much who is ‘in charge’, at least as far as much of Europe is concerned.

America is slightly more individualistic, and power is concentrated in the president and so-called ‘executive orders’ that can override Congress.

But on a more abstract level, power may not be exerted by a tangible entity (a king, a bureaucracy) but rather by forces (economic or cultural) beyond anyone’s control – inevitability, fatalism, and predestination. Despite all the cries for change against the ‘status quo’, things tend to remain constant, albeit with small changes here and there. Although Free Northerner says there is a ‘power void’, there is no shortage of people or targets to blame for disenfranchisement, for feeling ‘left out’. There are always ‘elites’ in one form or another. If nothing were imposing force, there would be no resistance. So there is power somewhere…or maybe all of these ‘small oppressors’ are a symptom of a bigger problem. Democracy gives the illusion of individual power, which more and more people are seeing through for the ruse that it is, so perhaps absolute monarchy is the alternative – by holding total power, is it more empowering for individuals? Democracy and ‘freedom’ mean always having to prove yourself, fighting a ‘mental war’ against mediocrity to ‘rise to the top’, whereas a consanguineous nobility and power structure means that people at least know their place in the hierarchy and, more importantly, can come to terms with it instead of fighting it.

Wealth, Intellectualism, and Individualism, Part 5 (intellectualism)

Continuing on the wealth, individualism, and intellectualism series…

Part 1,2,3,4

The final pillar is intellectualism. Thanks to recent economic trends, Web 2.0, ‘nerd culture’, the growing importance of STEM, ‘esoteric celebrities‘, long-form journalism, the elevation and idolization of intellectuals in public life (particularly in STEM fields), and recent groundbreaking discoveries and progress in physics, mathematics, AI, and computer science, America is in something of an ‘intellectual renaissance‘.

This section, the longest of the three, covers:

1. ‘intellect’ as a form of social capital

2. rise of ‘nerd culture’ and cultural appropriation

3. the post-2008 ‘intellectual renaissance’ in America, as evidenced by the increased demand for complicated, esoteric subjects that until recently were neglected, in the process turning many intellectuals into ‘esoteric celebrities’

4. rejection of ‘low information’ in post-2013 internet journalism and in online discourse; ‘fact checking’, correctness more important than political tribal loyalty; proliferation of the ‘contrarian mainstream’ and esoteric ideologies

5. the ‘Social Darwinist’ aspect of how less intelligent people are faring worse in America’s competitive post-2008 economy

6. ‘shared narratives’ (#2,3,4,6 are under the umbrella of ‘intellectualism culture’)

7. the wealth-intellectualism-individualism synthesis

Intellectualism is how you become a part of the process, the national debate, rather than merely a spectator. It’s a common misconception that to be ingratiated you must conform, be ordinary, but it’s actually the opposite: to become a participant, you must be exceptional.

To be continued

SJW Narrative Collapse, Part Infinity

This is pretty funny… going on Reddit (I recommend logging out to see which default threads are on the front page, not subscribed ones), it looks like the left, to quote the title of a Charles Murray book, is losing ground. A story on /r/news about “Leaflets calling for death of those who insult Islam ‘handed out at London mosque’” was up-voted to the front page, much to the anger of the left, which wishes that this story would disappear and not be promoted to the ‘front page of the internet’ for the world to see the truth about the ‘religion of peace’:

Pretty much everything I write on this blog is true or will eventually be true, whether it’s about economics, the stock market, the media, Bay Area real estate, internet journalism, intellectualism, web 2.0 valuations, or the post-2013 demise of the SJW narrative.

The truth always prevails, but sometimes it takes a little while to break free from the web of misinformation and false narratives that are so appealing but also wrong. We’re seeing this with the post-2013 SJW backlash, in addition to the ‘alt right’, Red Pill, MGTOW, NRx and the ‘Dark Enlightenment’, Gamergate, HBD, ‘frog Twitter’, and the election of Trump. And through this blog – which began in 2014 as these politically incorrect ideologies and movements were beginning to burst through like a battering ram against the fortress of leftism – I am proud to be a part of it, too.

By unleashing the frog that lies within us all, we can make America great again.

Who else is feeling deplorable today?

Leftist assumptions about economics and finance are being repudiated by the internet’s army of fact checkers.

For example, through the writings of Robert Shiller (a Nobel Prize economist who shills for the left) and Michael Lewis (another liberal, who wrote The Big Short and Flash Boys), the left conveyed a narrative that high frequency trading was an unalloyed evil – an assumption that for many years went unchallenged by the ‘general public’ until only recently, as millennials on Reddit (as part of the post-2013 SJW backlash) eventually learned that high frequency trading actually helps traders by lowering transaction costs and speeding order executions.

A New York Times column If War Can Have Ethics, Wall Street Can, Too made it to the top of Reddit a couple days ago, but commenters attacked the leftist premise of the article, particularly as it pertains to high frequency trading:

Working at an investment bank conveys authenticity and authority in the eyes of other ‘redditors’, who up-voted the post in agreement. In many ways, finance and economics could be considered ‘STEM’, as they are intellectually rigorous and involve empirical evidence, math, and number-crunching, and that’s why they rank high in the hierarchy of degrees in terms of respect, along with philosophy, physics, and mathematics.

This was from /r/philosophy, not a ‘right wing’ sub, so it’s not like I cherry picked a sub that agrees with my view, and I could easily find more examples beyond the ones in the screenshot. But the reality is, there are a lot of misconceptions promoted by the left about algorithmic trading that are easy to refute, and I have done so here. It’s nice to see so many people coming around to reality, rejecting the ‘blame the rich/banks’ mentality that was so pervasive in 2008-2012.

The same goes for the much maligned 2008 bank bailouts, which many people, in agreement with posts I wrote in 2011-2015, realize were necessary from a utilitarian standpoint, and helped the economy by stemming the bleeding from the ailing banking and housing sectors so that the healthier sectors such as web 2.0, payment processing, information technology, and retail could thrive. The bailouts may have created moral hazard but indirectly created trillions of dollars of wealth in the form of rising asset prices, economic growth, and improved confidence – all at nearly no cost (as the bailout was funded with near-zero yielding debt).

The fact that the story went viral, making it to the front page of Reddit, along with the intense, impassioned discussion in the comments, is further evidence of how important finance is to millennials, who would rather debate regulation and high frequency trading than waste time on mind-numbing, disposable pop culture entertainment. This is more evidence of how important intellectualism has become, contrary to pronouncements of how America is ‘dumbing down’. There is a huge demand for intellectualism that the internet and communities like Reddit, Hacker News, and 4chan are satisfying.

This is just one of many examples of how the truth always prevails. A reality-based worldview based on rationalism and logic always prevails. Leftists, who have to use misinformation and emotion to convert the uninformed to their causes, are losing.

‘Show, don’t tell’

‘Show, don’t tell’ is a literary technique whereby the author ‘shows’ what is happening through vivid language and sensory detail, allowing the reader to make inferences from the clues the author leaves behind, rather than merely ‘telling’ the reader what is happening.

But this also applies, to some extent, to post-2013 internet journalism, with the trend being towards more ‘showing’ and less ‘telling’. ‘Telling’, in the context of punditry and exposition, is to tell the reader what is on your mind, often in a brusque, impassioned, or long-winded manner, and it sometimes reads like a rant. For decades, until its abrupt end some time around 2013 for reasons that still remain largely a mystery, punditry and journalism, particularly online, was dominated by ‘telling’. For much of the 90′s and 2000′s, during the whole Clinton and Bush era, bloggers could make a good living writing emotive, hyperbole-laden ‘cons/libs are good/bad’ screeds, which were shared through email lists, blogs, and aggregators like Drudge, in the process generating significant traffic and advertising revenue for bloggers and aggregators alike. Of course, all that changed around 2013, and it became much harder for up-and-coming bloggers, pundits, and writers to ‘make their monthly nut’ by linking the ‘left/right’ with the incarnation of Satan, as had worked so well in the 90′s and 2000′s.

Between 2007-2009, on a different website, I blogged about the 2008 presidential election, but it was less about ‘coverage’ than me taking potshots at Obama at every opportunity, and it was a lot of fun. But much has changed since then – that was back when Twitter, Facebook, and YouTube were far smaller than they are today, and there were fewer political pundits. The major online media properties (Fox, CNN, WSJ, etc.) tended to have delayed opinion coverage (because they were late to the whole ‘blogging’ thing), allowing bloggers to one-up them, but these mega-sites have long since caught up. There is just so much content – Vox, WSJ, Bloomberg, etc. – but the number of hours in a day, reading speed, and eyeballs haven’t grown to match the rate at which content is being produced, so the result is a constant churn of content piled upon heaps of older content, much of it ignored. This ties into post-2008 themes of winner-take-all, network-driven capitalism and how we’re in an age of abundance (content, ‘stuff’) but at the same time great scarcity (attention, differentiation).

But, again, the post-2013 rise of a data-driven, nuanced style of online journalism (‘showing’) also contributed to the decline of opinionated punditry blogging (‘telling’), and as I will expound in a forthcoming post, most of the novelty that made blogging so successful in the 90′s and 2000′s has now worn off.

Although pundits can still do well with ‘telling’, it’s usually because they already built a large, established audience during the ‘telling’ days that remains loyal. But in our era of ‘showing’, ‘telling’ techniques no longer work (or at least not nearly as well as they did years ago) and are perceived as ‘low information’ and intellectually lazy by a savvier readership and demographic that has grown weary of charged partisan polemics and instead seeks nuance, data, and intellectualism.

What is ‘showing’? Whereas ‘telling’ is to tell the reader what is on your mind, often in blunt terms, ‘showing’ is the use of data and empirical evidence to nudge, not force, the reader toward your desired conclusion. Your opinions and beliefs are secondary to the evidence, not foremost. ‘Showing’ techniques include citing many references and links contextually and in footnotes, referring to historical evidence and case studies, using data visualizations, and comparing and contrasting viewpoints, with the implication that your view is correct (or ‘less wrong’) but without explicitly saying so (showing).

Regarding contrast and brevity, the undisputed master of this technique was Jon Stewart of the Daily Show. For instructional purposes, if you can overlook his obvious political bias, he used this technique masterfully by showing on screen, before the live audience, examples of the ‘right’ being hypocritical, by pasting quotes or juxtaposing images, and the audience ate it up without fail. He didn’t have to recite a 1,000-word rant to convey his point; just the use of imagery and the inflection of his voice was effective. This technique is employed heavily on Twitter, with tweets of screenshots and highlighted passages of hypocrisy, not links to huge rants, going viral. Here is one example of a viral image passed around the alt-right and NRx sphere of Twitter, about the difference between ‘order’ and ‘chaos’:

The preceding paragraph discusses brevity, but what about long-form, which is also hugely popular online? Isn’t that a contradiction? Not quite. In long-form, again, ‘showing’, not ‘telling’, is employed, but even more so. Showing in long-form involves, as mentioned above, data visualizations and lots and lots of links. Some examples of long-form sites that heavily employ ‘showing’ techniques to great success are Priceonomics, Ribbon Farm, Slate Star Codex, and Wait But Why.

A good example of ‘showing’ is a recent Social Matter article, Where Did It All Go Wrong. Note the extensive use of hyperlinking within the post, which instantly conveys authority and expertise in the eyes of readers. And not just links to Wikipedia, but many other sources, too. There are links to Moldbug, published studies, etc. – all within a single paragraph. Even if you don’t agree with the underlying message, there is so much information that it’s impossible not to come away smarter having read the post, and because of these ‘showing’ techniques the post was successful, generated significant discussion in the comments, and was shared extensively. Of course, it’s not going to be as popular as a Wait But Why article (an unfair comparison, as NRx is a very small niche, relatively speaking), but these techniques help immensely for all niches.

One problem with long-form is that it’s kind of a pain: it raises the standards for everyone, which is good because it means better content, but such content is very time-consuming to produce. When Wait But Why wrote about cryonics, they didn’t just write an 800-word article about it – they wrote the most exhaustive article about cryonics ever. After I wrote my article about the simulation hypothesis, I realized that to get it to the standards of Wait But Why it would have to be 5,000-9,000 words and be filled with links and pictures, but I don’t think I would have been able to sustain my interest long enough. After a certain point, you just want to move on to a new subject.

Also, having massive traffic, as Wait But Why obviously has, is in itself a great motivator to create longer content. Supply-side economics says that supply creates demand, but demand also creates supply. If a boxing promoter cannot sell enough tickets (demand) to make it worthwhile, there is no fight (supply).