There is some buzz over superforecasting and superforecasters, stemming from a 2015 book by Philip E. Tetlock and Dan Gardner about superforecasters: people with an above-average ability to forecast events and outcomes.
Blurb from Wikipedia:
Superforecasting: The Art and Science of Prediction is a book by Philip E. Tetlock and Dan Gardner released in 2015. It details findings from The Good Judgment Project.
A number of people participated in an IARPA forecasting tournament that encouraged forecasters to update their forecasts in real time. The top performers among the roughly 2,800 tournament participants were categorized as superforecasters based on their Brier scores. The collective Brier score of the superforecasters was 0.25, compared with 0.37 for other forecasters. Superforecasters discussed include Doug Lorch, Bill Flack, and Sanford Sillman (an atmospheric scientist). Superforecasters even performed 30 percent better than the average for intelligence community analysts with access to secret data.
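For readers unfamiliar with the metric: a Brier score is essentially the mean squared error between forecast probabilities and what actually happened, so lower is better. The tournament's exact scoring conventions may differ (Tetlock's work uses a multi-outcome variant), but a minimal sketch of the common single-probability form for yes/no questions looks like this:

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities (0.0-1.0)
    and realized outcomes (0 or 1). 0.0 is perfect; a forecaster who
    always says 50% scores 0.25 under this single-probability form."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# A sharp, well-calibrated forecaster beats a pure hedger:
sharp = brier_score([0.9, 0.1, 0.8], [1, 0, 1])   # 0.02
hedged = brier_score([0.5, 0.5, 0.5], [1, 0, 1])  # 0.25
```

This illustrates why the gap between 0.25 and 0.37 in the tournament is meaningful: the score punishes both overconfidence in wrong answers and timid hedging on questions the forecaster could have called correctly.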
When given intelligence tests, superforecasters scored well above average, but not at the genius level (they scored higher than about 80 percent of people).
Even if you have less than a 115 IQ, you can still be a superforecaster by following this simple heuristic: the media is almost always wrong. 95 percent of the time, what the media frames as a crisis and a big deal amounts to little more than a speed bump. Recent examples include fears over QE ending, fears about Ukraine and Russia, fears over the debt ceiling, Ebola, recession, Y2K (remember that?), the national debt, China, and so on. So if the media says to sell stocks, you should probably be buying more stocks. If the media says consumer spending is going to be weak, it will probably come in better than expected.
The only time I can recall the media being right was in 2005-2006, when pundits were predicting a housing bubble and crash, and the housing and stock markets did indeed crash in 2008.
But the rest of the time, the outcome is never as bad as expected.