Why the job-apocalypse probably won’t happen

From Scott’s blog, “WILL AUTOMATION LEAD TO ECONOMIC CRISIS?”. This is not a post by Scott himself, but rather a written collaboration by two guest contributors on a mutually agreed-upon topic:

Adversarial collaboration on the question: “Automation/AI will not lead to a general, sustained economic crisis within our lifetimes or for the foreseeable future. Automation/AI’s effects into the future will have effects similar to technology’s effects in the past and, on the whole, follow the general trend.”

In recent years, especially since 2008, it has become intellectually fashionable among the educated and pundit class to predict that automation will cause massive and irreversible job loss. Andrew Yang, considered one of the ‘smarter’ and more economically savvy candidates, has built an entire platform around this concern, promising a UBI to help ameliorate technology-induced unemployment (what economists call structural unemployment). A Google search for “technology job loss” reveals enough entries, published in the past five years alone, to take a lifetime to read. Although technology makes some jobs obsolete, it also helps create new ones, such as jobs in information technology or social media, which decades ago didn’t exist. The concern, however, is that a threshold will be crossed at which the job-replacement mechanism that economists take for granted fails: at some point, technology will cease creating new jobs, the result being permanent job loss and a much smaller labor force.

My opinion is that such fears are unfounded. But there is a new, important mitigating factor, one I have not yet covered, that I think can help resolve this debate. It finally dawned on me as possibly the best, most succinct, and most convincing answer to this question.

Consider a recent article about how Americans are creating ‘phone farms’ to profit from advertisers by emulating human click and video-watching behavior.

Why buy a bunch of phones when this can be emulated with software? The answer is that these websites are complicated and presumably have advanced anti-spam systems in place, so writing a C or PHP program to emulate human user behavior would be too costly and difficult. Such a program could cost $30k or more, which is beyond the budget of the vast majority of individuals, and it would need to be modified constantly to account for changes to the underlying social network or service it is exploiting. Humans can adapt instantly at no cost. This is an example of how low-tech can beat high-tech and why the robot/AI ‘job apocalypse’ may never come.

The same goes for fake Amazon reviews. In theory, any human browser behavior can be emulated by a sufficiently advanced program, such as by using proxies, changing user agents, faking a human writing style, and mimicking mouse clicks, and such a program may pay for itself over the long run, but the initial costs are high enough that few can afford it, making humans necessary.
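The components just listed (rotating user agents, randomized delays, and so on) can be sketched in a few lines of code, and the sketch itself shows how little of the problem those few lines actually solve. Here is a purely illustrative Python sketch; every name in it is hypothetical, and no real service's API or defenses are modeled:

```python
import random

# Illustrative only: a tiny piece of the "human emulation" machinery
# described above. A real evasion bot would also need residential
# proxies, browser fingerprinting, CAPTCHA handling, and constant
# rewrites as the target site changes -- the ongoing cost that, per
# the argument here, makes hiring humans cheaper.

USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
    "Mozilla/5.0 (X11; Linux x86_64)",
]

def fake_session(rng: random.Random) -> dict:
    """Assemble per-session parameters meant to look organic."""
    return {
        # Rotate the reported browser so sessions don't look identical.
        "user_agent": rng.choice(USER_AGENTS),
        # Humans don't click instantly; jitter the delay between actions.
        "click_delay_s": round(rng.uniform(1.5, 8.0), 2),
        # Vary how long a 'video view' lasts so durations aren't uniform.
        "watch_time_s": round(rng.uniform(20, 180), 1),
    }

sessions = [fake_session(random.Random(i)) for i in range(3)]
```

The real expense is not in writing these few lines but in keeping them working against detection systems that evolve constantly, which is precisely where a human with a handful of phones adapts for free.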

As society becomes increasingly complicated and competitive, humans are needed more than ever to adapt to sudden and unpredictable changes, where software and automation fall short.

Or consider the industry of fake Facebook likes. Why not code a Facebook script to do this rather than hiring people to click ‘like’ buttons? Such scripts exist, but they are not sophisticated enough to reliably fool Facebook’s increasingly advanced and constantly evolving anti-spam algorithms, and creating a script that could would be so expensive up front that using humans is cheaper and lower risk financially. Facebook requires authentic accounts tied to real users; actions (such as likes and shares) by new and fake accounts are more likely to be filtered by Facebook’s algorithms or to get the accounts suspended. Just as there exists the ‘dark web,’ the fake-web is a multi-billion-dollar industry spanning the globe that seeks to reproduce and emulate real human behavior, whether by manipulating Yelp or Amazon reviews or by astroturfing a political or corporate agenda.

A pizzeria of today, save for LCD-display menus instead of letter boards or chalkboards, is indistinguishable from a pizzeria of 50 years ago. There are still people cleaning tables and floors and operating the cash registers and ovens. This phenomenon of low-tech, low-skill jobs being harder to automate than high-tech jobs is not a new observation and is covered in the original link, but if a Pizza Hut or McDonald’s franchise owner were, in theory, to try to automate those jobs with robots, it would be so prohibitively expensive in the short run that even if it paid for itself after 30 years, it would not be feasible. The first computers were so expensive that only the largest institutions and companies could afford them and realize the productivity gains: the US government used IBM tabulating machines to speed up compiling Census results, but it took almost two generations for this capability to become mainstream and affordable enough for average businesses and individuals.