I saw this article going viral about the rise and fall of Bell Labs: What would it take to recreate Bell Labs?
The Cold War era saw the rise of ‘genius collectives’: large institutions that employed a lot of smart people for research and development. These included the RAND Corporation, Bell Labs, DARPA, and Xerox PARC. But their influence peaked by the late ’70s, declined markedly by the end of the Cold War, and they were a shadow of their former glory by the early 2000s. Hardly anyone talks about them anymore, or at least not without a twinge of nostalgia for a long-lost era. So what happened?
My guess is that such institutions became bloated and obsolete, supplanted by Silicon Valley and other ‘big tech’ companies, which are more efficient, more profitable, and better-run, but equally innovative, whether it’s space launches with SpaceX, autonomous cars with Tesla, logistics and cloud infrastructure with Amazon, or AI and GPUs with Nvidia. The dozen or so largest tech companies are effectively laying the groundwork for the modern economy in much the same way those earlier institutions did, but in a more streamlined and profitable fashion. Thus, there is no need to recreate Bell Labs when big tech fills this role. Second, the U.S. government, rather than creating federally-funded labs, can piggyback off these tech companies, which are more than happy to help, such as for surveillance purposes. The FBI has Google on speed-dial when it needs location data, for example. And rather than competing with SpaceX, NASA has partnered with it.
Moreover, in explaining the decline, these institutions employed too many people of mediocre talent, and their commercial or intellectual output was insufficient to justify the costs, so they were gradually unwound as either federal budgets shrank, as in the case of RAND, or their parent companies lost relevance, as was the case with IBM, AT&T (the parent of Bell Labs until the 1996 Lucent spinoff; Bell Labs is now owned by Nokia), Xerox (the parent of PARC), and others.
Bell Labs, for example, at its peak in the late ’70s employed 25,000 people in the U.S. By comparison, Meta employs about 67,000 people worldwide, yet is a vastly bigger and more profitable company than AT&T was. Microsoft employs 100,000 people in the U.S., but again, we’re talking about a vastly bigger company, and adjusted for population growth, this would be closer to 70,000 in the late ’70s. IBM, one of the most innovative companies of its era, employed a staggering 269,291 people in 1970 on gross revenue of $7.5 billion, or about $28,000 per employee ($227,000 in today’s dollars). By comparison, with annual revenue of $135 billion for 2023, Meta generated roughly $2,000,000 per employee, about 9x IBM’s inflation-adjusted figure.
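For anyone who wants to check the math, here’s a quick sketch of the per-employee calculation. The revenue and headcount figures are the ones cited above; the ~8.1x multiplier for converting 1970 dollars to today’s is my assumption for the CPI adjustment.

```python
# Back-of-the-envelope check of the revenue-per-employee comparison.
IBM_REVENUE_1970 = 7.5e9        # gross revenue, 1970 (as cited above)
IBM_EMPLOYEES_1970 = 269_291
CPI_MULTIPLIER = 8.1            # assumed 1970 -> today inflation factor

META_REVENUE_2023 = 135e9       # annual revenue, 2023 (as cited above)
META_EMPLOYEES = 67_000

ibm_per_employee = IBM_REVENUE_1970 / IBM_EMPLOYEES_1970    # ~$28,000
ibm_adjusted = ibm_per_employee * CPI_MULTIPLIER            # ~$226,000
meta_per_employee = META_REVENUE_2023 / META_EMPLOYEES      # ~$2,015,000

print(f"IBM 1970:  ${ibm_per_employee:,.0f} per employee "
      f"(${ibm_adjusted:,.0f} in today's dollars)")
print(f"Meta 2023: ${meta_per_employee:,.0f} per employee")
print(f"Ratio: {meta_per_employee / ibm_adjusted:.1f}x")    # ~8.9x, i.e. ~9x
```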
Moreover, the hiring process and the demands placed on today’s tech employees and prospective employees are likely far greater and more rigorous than during the Cold War: whiteboard tests, multiple interview rounds, and aggressive resume filtering and screening. Zuckerberg and other tech CEOs have admonished workers for not putting in 100%. By comparison, working at an R&D lab during the ’60s was leisurely, with long smoking breaks in conference rooms and chit-chat at lunch. And the intellectual bar for being hired was lower.
I don’t think there were any ‘boomer’ jobs that demanded the intellectual chops required of young people in ‘FAANMG’ today, except maybe working for NASA. Just the mention of ‘leetcode’ is enough to evoke shudders. Yes, there were some obvious, notable genius-level intellects back in the day, such as Einstein, Feynman, and Dyson, but there were likely fewer ‘math geniuses’ or ‘physics geniuses’ on a relative basis than today, when there are entire active communities (such as r/math, r/physics, StackExchange, and MathOverflow) full of highly-competent people in those fields, not just a handful of famous old names plucked from a book.
In other words, human capital as measured by collective intellect has surged, but our memories are limited to idolizing just a few people. If you ask strangers who the most famous or important composers are, they will likely name Mozart, Bach, Beethoven…maybe Stravinsky or Bernstein. It’s the same cluster of names. But it’s not as if the production of classical music and compositions has slowed. Thanks to movies and TV, more musical scores are being produced than ever before. I think this is why it’s easy to underestimate the intellectual, cultural, and technological progress being made today: we’re still latched on to the same people from hundreds of years ago.
As much as pundits may lament that young people today cannot write well, are being dumbed-down at school, or score poorly on ‘general knowledge exams’, boomers probably fared no better growing up. Moreover, one must take into account demographic change and selection effects, in which large populations of low-scoring groups pull down the average. Matched by ethnicity, America’s first- and second-generation immigrants are as competitive as their counterparts in their native lands. During the early-to-mid 20th century, high school was less common or not mandatory, which meant a lot of dull or otherwise academically-disinclined kids dropped out; today those kids stay enrolled and pull down the averages.
Or maybe there is a dichotomy: America’s education system is like a big factory in which most kids are mediocre, but there are some diamonds in the rough, too, who excel later, whether in tech, finance, or graduate school. As evidence of this sort of barbell, in spite of the alleged dumbing-down of the curriculum, high-stakes math competitions have become commonplace and important for high schoolers to distinguish themselves when applying for college, whereas in the ’60s it was only the Putnam exam for select college students. In the ’70s, though, there were fewer opportunities for exceptional kids compared to today, both academically and in terms of careers.