- Human beings are, by definition, very bad at predicting the future. Worse, we can be fooled by "prediction-like" systems that appear to give accurate predictions but fail to describe what will actually happen.
- Thanks to survivorship bias, bad predictions will often quickly be forgotten or ignored, while "good" predictions will get undeserved attention for essentially being overconfident guesses that turned out to be true. This disparity encourages prediction over precaution.
- Intelligence isn't a solution, and may make the problem worse by making individual actors overconfident in their still-flawed predictions.
- At the end of the day, making individual predictions is a bad idea, but general precaution and preparing for a wide range of possibilities is always a good plan.
Can humans predict the future? I’d argue that the answer is a clear and solid no, with some minor caveats.
Throughout history, lots of people have tried to predict the future - everyone from roadside prophets telling us that the world will end, to stock market pundits on television telling us that our stocks will go up, or down. Sometimes these people are right, and sometimes they’re wrong.
One problem with predicting the future is that, by definition, it’s subject to a very tricky case of survivorship bias.
Survivorship bias tells us that history’s failures tend to be quickly forgotten (written out of the history books), while its successes (or, sometimes, its most spectacular failures) get made into a big deal. Because of this disparity of attention, we may think that the successes are "more likely" when the reality is the opposite. We only pay attention to things that we think are important - and we’re far more likely to think of a true prediction as important than a false one.
This means that people whose predictions quickly turn out to be false are themselves quickly forgotten. People whose predictions come true, on the other hand, can get a lot of attention for it.
Think of all the rich people in interviews who tell us that it’s all because of hard work - they predicted that they would succeed based on their actions. We ignore all the uninteresting non-rich people who also worked very hard, often in unfair conditions, just trying to make a living. But at the end of the day we still shower the rich person with praise for their self-involved predictions of their own future performance.
Because of survivorship bias, the incentives are lopsided - there are minimal social penalties for a failed prediction, but there can be huge social upsides for a successful one. Famous baseball player Babe Ruth once made a gesture that seemed to call where he would hit a home run - and then he did, instantly becoming a legend. The dude pointed his finger, and now there’s a Wikipedia page for it, and people have spent countless hours trying to analyze and deconstruct that (highly visible, extremely survivorship-biased) moment.
Sometimes, predicting the future isn't actually predicting the future - it's gaming a system.
There’s a very classic stock market scam story. Let’s say that you want to get 1000 people to buy a particular stock pick from you, with healthy commissions so that you get rich off the deal. Of course, in this scenario, it doesn’t matter whether the pick actually benefits these 1000 people - you’ve made your money from the sale. So if you’re particularly uncaring, you can use a simple trick.
Take a mailing list of 8000 people, and mail half of them a letter that introduces you as a genius stock trader and predicts that stock X will go up in a month. Mail the other half a letter that also introduces you as a genius stock trader, but this time predicts that stock X will go down in a month. Use tons of flowery language to make it all seem real, and send the letters off.
Then you wait a month - at this point, 4000 people have received bad advice from you, and 4000 people have received good advice. So you mail the 4000 who received good advice and repeat the process with a new stock, Y. After another month, 2000 people have received two rounds of good advice, 2000 have received one good and one bad, and the remaining 4000 have received one bad. Repeat the process one more time, and you now have 1000 people who have received good advice from you three months in a row. Send those 1000 people a letter asking them to buy a stock of your choice - any stock, it doesn’t matter which - and you’re in the money. To them, you’re a "genius" stock trader, and they trust you.
This can be repeated with a new crop of 8000 people, or you can make the pitch even more convincing by doubling your starting pool for each additional month you want your “streak” of successes to last.
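The arithmetic of the scam is easy to check. Here’s a minimal Python sketch (the function name and numbers are just for illustration) that simulates the halving process:

```python
def scam_survivors(pool_size=8000, months=3):
    """Simulate the two-sided stock-tip mailing scam.

    Each month the remaining list is split in half: one half is told
    the stock will rise, the other that it will fall. Whatever the
    market does, exactly one half has now seen a "correct" prediction,
    and only that half gets next month's letter.
    """
    believers = pool_size
    for month in range(1, months + 1):
        believers //= 2  # one half is always wrong and gets dropped
        print(f"Month {month}: {believers} people have seen only correct calls")
    return believers

scam_survivors()          # 8000 -> 4000 -> 2000 -> 1000
scam_survivors(16000, 4)  # doubling the pool buys one extra month of "streak"
```

Note that the scammer never has to be right about anything: the halving guarantees that some slice of the list sees a perfect record, no matter what the market does.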
The point of this story is to show how easily we can be taken in by false predictions. 7000 people received bad advice, and 1000 people received “good” advice purely by chance - and only to set them up for a financial decision that could go either way.
This isn’t prediction - this is just gaming a system.
Yet this is how a lot of stock market prediction works. We trust people who have a “long record of predicting success”, when we have almost no way of telling whether that success relies heavily on chance.
Stock market analysts, for all their bells and whistles, often fail to outperform pure chance - much like the scammer above. In fact, many analysts can produce above-average returns for a very long span - a decade or more - only to be wiped out in the next market crash, losing more in a single year than they made in the entire decade. Their “success” wasn’t success - it was just failure that hadn’t yet come to completion.
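To see how easily chance alone produces impressive track records, here’s a toy simulation (all names and numbers are hypothetical) in which every “analyst” is just a coin flip: each year they beat the market with probability one half, regardless of skill.

```python
import random

def perfect_streaks(n_analysts, n_years, p_beat=0.5):
    """Count how many purely random 'analysts' beat the market
    every single year of the period, by luck alone."""
    return sum(
        all(random.random() < p_beat for _ in range(n_years))
        for _ in range(n_analysts)
    )

# With 100,000 coin-flip analysts over 10 years, roughly
# 100000 / 2**10 of them - somewhere near a hundred - will boast a
# perfect decade-long record despite having zero skill.
print(perfect_streaks(100_000, 10))
```

Those hundred-odd lucky analysts are exactly the ones who end up giving interviews; the other 99,900 or so are quietly forgotten.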
Human beings are particularly bad at predicting the future because we simply can’t foresee or predict small events which will impact the overall plan.
Here’s an example I can draw from my life as a fitness coach. I can spend hours sitting down and writing out the perfect plan for a client for the next 12 months. I can put in all the bells and whistles, the most perfect program - and I can guarantee that it’ll never actually happen as written.
The reason for this is that life happens - people can’t predict things that will interrupt their schedule, like unexpected bills, work stress, loss of a job, relationship changes, when they might need to take vacation, or when they’re going to just have a crap week due to low energy levels. These little interruptions don’t usually mean much on their own, but they quickly add up - soon, you’re weeks or months off of plan.
Writing a good plan, then, means planning for these little missteps. It means leaving slack in the program for time off, and expecting that things will take longer than you think. This is a difficult skill, because no one wants to admit that they won’t perform at 100%. All the time, we make decisions based on false overconfidence and promises we tell ourselves. We buy gym memberships telling ourselves that we’ll use them - when the reality is that most people, on their own, tucker out after a couple of months. So we often fail to plan properly for the future because we don’t want to admit to ourselves that things will go wrong.
The same thing happens constantly with government projects, construction projects, and anything else with a lot of complexity. Large projects are constantly delayed as parameters change, as expectations are adjusted, and as political events shift the cost of prices, materials, and labor. We always hear about these projects going over budget, and very rarely about them coming in under.
We’re not well-suited to preparing for extremely low-probability, high-impact events - so-called black swan events. A market crash or an earthquake is massively devastating and next to impossible to predict. Yes, we can say it will probably happen “sooner or later”, and we know such events exist - but we can’t predict what day it will happen, and we don’t wake up each morning thinking “maybe today is the day”.
Then there’s the problem of intelligence.
By examining the past and the way things have reacted in the past, we can have a rough idea of what will most likely happen in the future. I know, thanks to a lot of experience, that the sun rises in the east every morning and sets in the west in the evening. Events like this are common and happen regularly, so it’s easy to analyze their behavior and have a rough idea of what should happen in the future, minus a few small variations. I can easily predict the rough course of these things, if not the exact moment-to-moment behavior.
But then there’s the matter of messier, more complex systems - what the stock market will do, what food is most ecologically sustainable per calorie at my local market, what the weather will be like in Seattle, or what Vladimir Putin is thinking right now. In systems like these, no one person can realistically grasp all the variables involved. Much of the knowledge simply isn’t available to us: it’s virtually impossible to know every single thing going on in the market, and only a very small number of people are close enough to Putin to even theoretically know what he’s thinking. There are simply too many variables to account for.
Smart people are often more overconfident precisely because of their intelligence. A smart person who knows more than the average person, and spends more time than the average person studying current events, may still not know enough to realistically predict the behavior of a complex system - but may be convinced that they can, simply because they know “more” than the next person. The stock market is a prime example.
Our reliance on data in the stock market can tell us a great deal about what has happened in the past - but no matter how much data we have, we still can’t predict how the market will behave in the future. Confidence in this data only makes the problem worse: there will always be other data we don’t have, which could throw off our analysis. The more we rely on the data we do have, the more blind we are to the data we don’t.
At the end of the day, can anyone predict the future? In specifics, I’d say no.
Intelligence and mastery can give us some idea of what is likely to happen in the future, but they still can’t predict individual events, and they are still subject to the problem of black swan events.
The general takeaway I would make is this: at the end of the day, making specific predictions is a bad bet. Instead, it’s better to use the precautionary principle and treat everything that is new or not yet fully understood with rigorous caution. Be careful, don’t be overconfident, and prepare for a wide range of possibilities. We can’t predict the future - but we can plan for possibilities, whether that means leaving slack in a diet or cutting down on spending so we have a few dollars saved for a sudden expense.
Failing to plan is planning to fail.
Enjoy this post? Share the gains!
Ready to be your best self? Check out the Better book series, or download the sample chapters by signing up for our mailing list. Signing up for the mailing list also gets you two free exercise programs: GAINS, a well-rounded program for beginners, and Deadlift Every Day, an elite program for maximizing your strength with high frequency deadlifting.
Interested in coaching to maximize your results? Inquire here. If you don’t have the money for books or long term coaching, but still want to support the site, sign up for the mailing list or consider donating a small monthly amount to my Patreon.
Some of the links in this post may be affiliate links. For more info, check out my affiliate disclosure.