December 2014

If the world were static, we could have monotonically increasing confidence in our beliefs. The more (and more varied) experience a belief survived, the less likely it would be false. Most people implicitly believe something like this about their opinions. And they're justified in doing so with opinions about things that don't change much, like human nature. But you can't trust your opinions in the same way about things that change, which could include practically everything else.

When experts are wrong, it's often because they're experts on an earlier version of the world.

Is it possible to avoid that? Can you protect yourself against obsolete beliefs? To some extent, yes. I spent almost a decade investing in early stage startups, and curiously enough protecting yourself against obsolete beliefs is exactly what you have to do to succeed as a startup investor. Most really good startup ideas look like bad ideas at first, and many of those look bad specifically because some change in the world just switched them from bad to good. I spent a lot of time learning to recognize such ideas, and the techniques I used may be applicable to ideas in general.

The first step is to have an explicit belief in change. People who fall victim to a monotonically increasing confidence in their opinions are implicitly concluding the world is static. If you consciously remind yourself it isn't, you start to look for change.

Where should one look for it? Beyond the moderately useful generalization that human nature doesn't change much, the unfortunate fact is that change is hard to predict. This is largely a tautology but worth remembering all the same: change that matters usually comes from an unforeseen quarter.

So I don't even try to predict it. When I get asked in interviews to predict the future, I always have to struggle to come up with something plausible-sounding on the fly, like a student who hasn't prepared for an exam. [1] But it's not out of laziness that I haven't prepared.
It seems to me that beliefs about the future are so rarely correct that they usually aren't worth the extra rigidity they impose, and that the best strategy is simply to be aggressively open-minded. Instead of trying to point yourself in the right direction, admit you have no idea what the right direction is, and try instead to be super sensitive to the winds of change.

It's ok to have working hypotheses, even though they may constrain you a bit, because they also motivate you. It's exciting to chase things and exciting to try to guess answers. But you have to be disciplined about not letting your hypotheses harden into anything more. [2]

I believe this passive m.o. works not just for evaluating new ideas but also for having them. The way to come up with new ideas is not to try explicitly to, but to try to solve problems and simply not discount weird hunches you have in the process.

The winds of change originate in the unconscious minds of domain experts. If you're sufficiently expert in a field, any weird idea or apparently irrelevant question that occurs to you is ipso facto worth exploring. [3] Within Y Combinator, when an idea is described as crazy, it's a compliment—in fact, on average probably a higher compliment than when an idea is described as good.

Startup investors have extraordinary incentives for correcting obsolete beliefs. If they can realize before other investors that some apparently unpromising startup isn't, they can make a huge amount of money. But the incentives are more than just financial. Investors' opinions are explicitly tested: startups come to them and they have to say yes or no, and then, fairly quickly, they learn whether they guessed right. The investors who say no to a Google (and there were several) will remember it for the rest of their lives.

Anyone who must in some sense bet on ideas rather than merely commenting on them has similar incentives.
Which means anyone who wants such incentives can have them, by turning their comments into bets: if you write about a topic in some fairly durable and public form, you'll find you worry much more about getting things right than most people would in a casual conversation. [4]

Another trick I've found to protect myself against obsolete beliefs is to focus initially on people rather than ideas. Though the nature of future discoveries is hard to predict, I've found I can predict quite well what sort of people will make them. Good new ideas come from earnest, energetic, independent-minded people.

Betting on people over ideas saved me countless times as an investor. We thought Airbnb was a bad idea, for example. But we could tell the founders were earnest, energetic, and independent-minded. (Indeed, almost pathologically so.) So we suspended disbelief and funded them.

This too seems a technique that should be generally applicable. Surround yourself with the sort of people new ideas come from. If you want to notice quickly when your beliefs become obsolete, you can't do better than to be friends with the people whose discoveries will make them so.

It's hard enough already not to become the prisoner of your own expertise, but it will only get harder, because change is accelerating. That's not a recent trend; change has been accelerating since the paleolithic era. Ideas beget ideas. I don't expect that to change. But I could be wrong.

Notes

[1] My usual trick is to talk about aspects of the present that most people haven't noticed yet.

[2] Especially if they become well enough known that people start to identify them with you. You have to be extra skeptical about things you want to believe, and once a hypothesis starts to be identified with you, it will almost certainly start to be in that category.

[3] In practice "sufficiently expert" doesn't require one to be recognized as an expert—which is a trailing indicator in any case.
In many fields a year of focused work plus caring a lot would be enough.

[4] Though they are public and persist indefinitely, comments on e.g. forums and places like Twitter seem empirically to work like casual conversation. The threshold may be whether what you write has a title.

Thanks to Sam Altman, Patrick Collison, and Robert Morris for reading drafts of this.