In part one, I discussed how selection effects can bias our perception of reality. For example, studies of weight gain/loss and metabolism conducted on non-obese subjects may not extrapolate to obese people.
Now I want to bring attention to the second claim, “drug dealers earn minimum wage,” popularized by the 2005 bestseller Freakonomics, by University of Chicago economist Steven Levitt and co-author Stephen Dubner. The book became something of a cultural phenomenon, spawning three sequels, a podcast, and even a critically panned movie. For example, on social media we still see people repeat this claim as if it were fact:
it’s been analyzed many times that common drug dealers make less than minimum wage
— claude shannon (@catpoopburglar) February 27, 2026
In the 20 years since its publication, much of Freakonomics has been discredited, amounting to little more than a “smarter” version of a Malcolm Gladwell book, where narrative takes precedence over accuracy. Even if the authors didn’t intend to deceive, the claims in the book rest on studies that are now very old, few of which have ever been replicated and which do not necessarily hold today, even if they were true when published.
Regarding drug dealers earning minimum wage, this claim is based on the study “An Economic Analysis of a Drug-Selling Gang’s Finances,” by Steven D. Levitt and Sudhir Alladi Venkatesh, which since its publication in 2000 has been cited 1,124 times. According to the study, the gang kept meticulous books at every level of the operation, and the recorded pay for low-level dealers (“foot soldiers”) worked out to roughly minimum wage in 1995 dollars.
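To get a feel for what “1995 dollars” means today, here is a minimal sketch of a CPI-based inflation adjustment. The CPI values are approximate annual-average CPI-U figures, and the $4.25/hour input is the 1995 federal minimum wage, not a number taken from the study:

```python
# Sketch: scale a 1995 hourly wage into approximate 2024 dollars
# using annual-average CPI-U values (approximate figures; treat as assumptions).
CPI_1995 = 152.4
CPI_2024 = 313.7

def adjust_for_inflation(amount_1995: float) -> float:
    """Convert a 1995 dollar amount to its rough 2024 equivalent."""
    return amount_1995 * CPI_2024 / CPI_1995

# The 1995 federal minimum wage was $4.25/hour.
print(round(adjust_for_inflation(4.25), 2))  # roughly 8.75 (dollars/hour)
```

So “minimum wage in 1995 dollars” is under $9/hour in today’s money, which is below the current minimum wage in many states.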
The study does not directly investigate these gangs. Instead, it relies on a meta-analysis of research conducted in the early 1990s. Those earlier studies relied on data collected in the early to mid-1980s by researchers who were able to obtain firsthand information from drug gangs. This context matters, as the data largely reflects the crack cocaine epidemic, when crack was the primary drug sold by many of these gangs.
Academic publishing is one of the slowest processes imaginable. The entire pipeline (data collection, writing, peer review, revisions, and eventual publication of the paper or book) can easily take five years or longer. As a result, data gathered in the early to mid-1980s appeared in studies published in the early 1990s, which were then cited in a 2000 paper and later popularized in a 2005 book. In other words, the data was already decades old by the time it entered the mainstream conversation.
Yet online, people continue to cite claims from a two-decade-old book based on even older studies as if they still apply to drug dealers today. But a lot has changed in the 40 years since the crack epidemic began. Drug prices have changed, the type of drugs sold has changed, laws have changed, and sentencing has changed. Today, drug dealers have smartphones and can list their product on social media or the “dark web.” They no longer depend as much on a hierarchical structure and middlemen taking a cut of the earnings. There is also cryptocurrency and other ways to obscure earnings, or to avoid potentially dangerous in-person transactions.
Overall, drug dealers today have many more options. If I had to guess, given these factors, today’s drug dealers are much more successful than their counterparts three or more decades ago. It’s possible that the move to social media and the “dark web” has lessened the occupational hazard as well. Not to mention, the profession of “drug dealer” is extremely broad, varying by location and type of client, which goes well beyond the scope of a single study. Dealers with wealthy clients in high-income areas would presumably earn more and face less risk. It’s not at all an apples-to-apples comparison.
Finally, there is the common claim that the link between IQ and income tapers off around 125–130, meaning that additional IQ points do not translate into higher income, or may even be detrimental. There are many flaws with this reasoning, covered in the posts here and here; to recap:
1. As with the drug-dealer claim, this relationship is based on old studies from the 1970s–1990s that people assume still apply in the 2020s. A lot has changed economically since then.
The highest-paying careers in the 1970s–1990s did not really differentiate between an IQ of, say, 120 vs. 140, unlike today, where geniuses are landing extremely competitive and lucrative jobs at firms like Jane Street or leading AI companies.
Conversely, someone with “only” a 120–125 IQ is unlikely to place well in high-stakes math competitions, gain admission to elite colleges, or pass difficult phone or whiteboard interviews, and may have to settle for more middling opportunities. The same dynamic is seen in academia, such as with tenure.
2. This does not control for individual preferences. Smarter people tend to be overrepresented in lower-paying creative endeavors, such as the arts, academia, or writing. However, smarter people whose individual preferences are aligned toward wealth creation have many more options that pay much better. Post-COVID, at the very tail end of IQ (>140), opportunities have exploded, such as in AI or quantitative finance.
3. Such studies lack granularity. Categories or professions such as “SWE,” “quant,” or “investment banker” are absent. Being a programmer in the 1980s was not at all like getting a SWE job at a tech company today; the latter is much more cognitively demanding (both the job and the screening) and competitive, which raises the IQ bar.
4. Some studies purporting low or diminishing returns to IQ vs. income are based on foreign data, which again does not map well to the much more competitive economic landscape and the fatter “right tail” of the income distribution seen in the US. It should be obvious already, but Denmark, the UK, Sweden, or Norway are not at all like America.
In both examples, we see how extrapolating old data and past economic conditions to the present fails.