Lately there has been a plethora of these ‘how-the-world-is-changing’ articles on social news sites, and they seem to do well, as measured by comments, interactions, and shares, so here is another one:
Within a decade … between a third and a half of the current employees in finance will lose their jobs to … automation software. It began with the lower-paid clerks … It has moved on to research and analysis, as software … has become capable of parsing enormous data sets far more quickly and reliably than humans ever could. The next ‘tranche’ … will come from the employees who deal with clients: Soon, sophisticated interfaces will mean that clients no longer feel they need or even want to work through a human being.
The Luddite fallacy is the belief that technological progress destroys jobs; calling it a fallacy implies that there will always be an abundance of jobs for all skill levels, despite advances in technology. Carriage mechanics become auto mechanics, who become rocket mechanics, etc. Whether or not it remains a fallacy is up for debate, but technology has yet to eliminate all jobs – far from it. However, instead of fewer jobs, the labor market distribution may become bimodal, with a ‘hollowing out’ of the middle.
Then came the inevitable crash. Home prices collapsed and well-paying white collar jobs disappeared. The brother-in-law who made $150,000 a year as a mortgage broker in 2007 was unemployed in 2009 and driving a FedEx truck in 2010, or doing nothing at all, as the labor force participation rate collapsed. …
But I think every year there is some ‘disillusionment’. In the 2000s it was 9/11, Iraq, the dot-com bubble bursting, and, in 2008, the financial meltdown. In the ’90s it was Y2K, militia groups and domestic terrorism against abortion clinics and government buildings, Clinton’s ‘culture wars’, the Persian Gulf War, the first biotech revolution, and the beginning, maturation, and eventual euphoria of the world wide web.
Now it’s apps and social media, Iraq and Syria, on-demand services such as Netflix and Uber, the sharing and gig economy, spree shootings and domestic terrorism by Muslims instead of Christians, ‘outrage porn’ as the new culture wars, perpetually low interest rates, the crummy labor market that never seems to get better, a generation of graduates shackled by debt, and the ongoing debates over wealth inequality.
A major theme of post-2008 America is upheaval, or a re-shuffling to a new, more cut-throat ‘status quo’: jobs that were considered safe and good-paying are no longer so, replaced by cheaper but more efficient ones; student loan debt rising, but job opportunities for many graduates falling; economic and technological abundance, but a possible lack of fulfillment; too many applicants and too little hiring for many jobs; IQ being more important than ever (cognitive castes and Social Darwinism); millennials having to clean up the mess the baby boomers left behind.
Another theme is the ‘inevitability of everything’, meaning that everything has become predictable up until the moment it happens. The stock market has continued to defy the predictions of collapse and doom and gloom, year after year, leaving the bears empty-handed, yearning for the crisis that will never come, as many seek a ‘reset’. The stock market will keep going up, reflecting the inevitability of the ‘status quo’ economy, which – while it may suck for some – is persistent, so you may as well get used to it and adapt. Due to economic fundamentals and other factors, the bull market will likely continue.
Anomie and ennui are other themes, possibly related to the un-participatory nature of both the economy and society:
… what will happen instead is that the economic contributions of the most productive will be able to compensate for the least. The future is one where an ever-smaller percentage of individuals and corporations contributes the bulk of economic output and activity – the Pareto Principle again, in which 20% contributes 80%.
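The 80/20 concentration can be sketched numerically. This is a toy illustration with hypothetical contribution figures, chosen only so that the top 20% of contributors happen to account for 80% of the total:

```python
# Toy illustration of the Pareto (80/20) principle.
# The contribution figures are hypothetical, picked so that the
# top 20% of contributors account for 80% of the total output.
contributions = sorted([500, 300, 60, 40, 30, 25, 20, 15, 8, 2], reverse=True)

total = sum(contributions)          # 1000
top_n = len(contributions) // 5     # top 20% -> 2 contributors out of 10
top_share = sum(contributions[:top_n]) / total

print(f"Top 20% contribute {top_share:.0%} of the total")
# Top 20% contribute 80% of the total
```

With real-world data the split is rarely exactly 80/20; the point is the heavy concentration at the top.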
In the future, the curve will become steeper – possibly until a singularity is attained – one company to rule them all (the Matrix?). This could be the ‘other’ singularity: instead of AI and computing power, it’s a single company or economic entity.
Then you have the social component. Whether it’s the latest physics discovery making headlines, web 2.0 start-ups being worth billions seemingly overnight, Twitter wars, or social media and campus outrage, you have all this activity going on, but on the other hand most of us are on the outside ‘looking in’ rather than contributing or participating in any meaningful way to the debate. Social media platforms give us the illusion of influence and power, but for the vast majority of people it’s very limited, like screaming from the rooftop in a neighborhood where each home is spaced two miles apart. Social media and the celebritization of individualism may magnify this isolation.
Pre-2013, political discourse was dumb, and so was most internet content, too, with the hegemony of Buzzfeed-style ‘listicles’. Then, in 2013, a switch suddenly flipped: the ascent of rationalism, long-form internet content, and centrism, all representing a smarter, more evolved form of online discourse and a rejection of ‘low information’, including pandering. This is also related to the post-2013 SJW-backlash and the rise of the ‘alt-right’ as a challenger to ‘low-information’ mainstream conservatism. Rational liberals, to their credit, played a role in this backlash, denouncing certain aspects of ‘social justice’ once free speech began to be impugned by the more radical elements of the left. Preaching tolerance is hypocritical if you’re only tolerant of those who hold similar views.
In the comments of the Slate Star Codex subreddit, someone bemoans how culture is dominated by a ‘priestly class’ that puts education at the top of the ‘status pedestal’. A retort is that college professors don’t dominate our culture any more than, say, Kim Kardashian does.
The common thread here is individualism, a defining characteristic of post-2008 society. What Kim Kardashian and the professor have in common is that their domains are highly individualistic, and both are elevated in society based on individual merits: for the professor, it’s making discoveries, which is related to intellectual-based authenticity and is highly individualistic; for Kim Kardashian, it’s being authentic in her socialite lifestyle, not having to conform to ‘mainstream’ beauty standards and conventional categories of fame (singer, actor, etc.). Our culture of individualism prizes individual accomplishments (like a physics or math discovery), popularity (Instagram & Twitter followers), and merit (related to individual intellectual accomplishments), which tend to be harder, more exclusive, and more celebrated than collectivist ones. Religion is inherently collectivist, generally having low barriers to entry for salvation. Same for political parties, which tend to have low barriers to entry for participation. Neither spotlights the individual. But a degree in physics or math, while much harder to obtain than going to church, brings much more prestige to the individual than being a random churchgoer. Perhaps some are tired of the celebration of ‘self’ and wish to return to simpler, more collectivist times. As I discussed earlier, some individualism and intellectualism is needed for society to advance, and there is probably an optimal balance between the two.
Maybe you can call this New Era the Renaissance of The Mind, one that deals with data and social currency, not mortar and easels.