should reach the right conclusion more often than we, the nonexperts. Once Niederhoffer went bust, surely his expert credentials were revoked by the masses and he was relegated to nonexpert status, right?
Mustafa Zaida, a professional investor who ran a European hedge fund, apparently didn't think so. In 2002, Zaida seeded a new offshore fund called the Matador Fund, with Niederhoffer directing the trading activities. Zaida reportedly commented, “He's definitely learned his lesson.” It's hard to know exactly what Zaida's thinking was here, but he clearly believed Niederhoffer still maintained at least some degree of expertise.
The Matador Fund performed well initially, compounding at high rates for several years and growing to $350 million. Then in 2007, during the credit crisis, Matador reportedly lost more than 75 percent of its value. As had happened in 1997, Niederhoffer's account was liquidated. He had “blown up” for the second time in about a decade.4 And while these episodes were highly public, less-public rumors suggest Niederhoffer blew up a third time, though we don't know how much credence to give them.
Regardless, for fairly extended periods of time, Niederhoffer definitely appeared to be an expert; he generated high returns, seemingly without excessive downside risk. But did he eliminate the possibility of extreme downside outcomes? No. This was emphatically not the case, as he empirically demonstrated his ability to be steamrolled, not once, but twice.
Some might argue that if Niederhoffer told investors, “You may lose all your money pursuing this strategy, but it will give you high returns,” then they were not really relying on his expertise to protect them from bankruptcy. But perhaps this is beside the point. If you are aware of a strategy that compounds at 30 percent, but you know that every few years there will be a year when you lose all of your money, then that is not a strategy worth pursuing. Any expert who recommends such a strategy should not be considered an expert in financial matters.
Of course, there is an alternative explanation here. Maybe Niederhoffer wasn't an expert at all. Maybe Niederhoffer just chose risky strategies that made him look like a genius while they were working, but when he blew up, he demonstrated that he wasn't doing anything special at all. The emperor was revealed to have no clothes. All the fancy academic pedigrees, the studies and papers, the published book, the high returns – in short, all the things that made Niederhoffer an “expert,” were perhaps really just an illusion. Perhaps there really was no “expertise” involved, whatsoever. Certainly, after several bankruptcies, that conclusion seems reasonable.
Of course, this story is not meant to pick on Niederhoffer. Like all experts, Niederhoffer is only human. But as we will highlight over the next few chapters, humans are systematically flawed. And so if humans are systematically flawed, why do we still rely on experts for all of our most important decisions?
“If you do fundamental trading, one morning you feel like a genius, the next day you feel like an idiot…by 1998 I decided we would go 100% models…we slavishly follow the model. You do whatever it [the model] says no matter how smart or dumb you think it is. And that turned out to be a wonderful business.”
Let's start off by examining our coauthor, Wes Gray, a person many would consider an “expert.” In fact, in many respects, Wes is eerily similar to Vic Niederhoffer. Wes graduated from an uber-prestigious undergraduate business program at the Wharton School of the University of Pennsylvania and earned an MBA and a PhD in finance from the University of Chicago – sound familiar? Well, it should: This is essentially the same academic training as Vic Niederhoffer.
Upon completion of his PhD, Wes entered academia and spent four years as a full-time tenure-track professor. Wes resigned his post as a full-time academic because he raised almost $200 million in assets from a multibillion-dollar family office and a handful of other ultra-high-net-worth families. This is all uncannily similar to how Niederhoffer started his career. Vic also did his time as a professor, and then left academia after a billionaire (i.e., Soros) gave him a large slug of capital. Let's hope the similarity in the stories between Vic and Wes ends at this stage. The last thing Wes wants to do is blow up multiple asset management firms and lose investor capital. He is also deathly afraid of steamrollers.
Clearly, some people believe Wes is an “expert” and are willing to let him manage a large amount of capital without a multi-decade track record. But why might investors' future experiences differ between Vic and Wes? On paper, the two Chicago finance PhDs are virtually the same. It has been said that the definition of insanity is doing the same thing over and over again and expecting a different outcome. So should we avoid an expert like Wes because he is essentially a carbon copy of Vic?
We think the key difference between Wes and Vic is not related to their financial expertise. The difference is related to their skepticism with regard to their own expertise. On most discretionary, day-to-day aspects of investing, such as picking individual stocks or forecasting the direction of interest rates, Wes believes firmly that he is completely wrong almost all of the time, whereas Vic believed he could master the markets. And while an expert with no faith in his or her ability sounds counterintuitive, it is actually invaluable because this approach to being an expert minimizes the chance for overconfidence. In fact, Wes has established internal firm structures to ensure that he is reminded on a frequent basis that he is a terrible expert in this sense. But why would an expert systematically convince himself that he is not an expert? The reason Wes engages in this peculiar behavior is explained in a quote often attributed to Mark Twain: “It ain't what we don't know that causes us problems; it's what we know for sure that simply ain't so.”
An expert, or any market participant, must acknowledge his own fallibility and must constantly remind himself why he is flawed. This is very difficult to do consistently, since our natural inclination is to believe we are better than average. Unfortunately, on average, we are only going to be average. The ability to question one's own convictions, even when they are firmly held, turns out to be a very useful thing in investing.
The next example highlights how our minds can tell us something with 100 percent confidence when, in fact, what they are telling us is 100 percent incorrect.
Figure 1.1 highlights this point.6 Stare at box A and box B in the figure. If you are a human being, you will perceive that box A is darker than box B.
Figure 1.1 Ed Adelson Checkerboard Illusion
Then ask yourself:
“How much would I bet that A is darker than B?” Would you bet $5? $20? $100?
Or perhaps you would borrow money from a bank, leverage your bet up 10 times, and stake $1,000,000. Why not, right? It's a guaranteed win.
We know how a human approaches this question, but how does a computer think about this question? A computer identifies the red-green-blue (RGB) values for a pixel in box A and the RGB values for a pixel in B. Next, the computer tabulates the results: 120-120-120 for box A; 120-120-120 for box B. Finally, the computer compares the RGB values of the pixel in A and the pixel in B, identifies a match, and concludes that box A and box B are the exact same color. The results are clear to the computer.
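The computer's procedure above can be sketched in a few lines of code. This is a minimal illustration, not an image-processing program: the (120, 120, 120) values are the RGB readings reported for the Adelson checkerboard squares, and the `same_color` helper is a hypothetical name introduced here for clarity.

```python
def same_color(pixel_a, pixel_b):
    """Return True when two (R, G, B) pixel values match exactly."""
    return pixel_a == pixel_b

# RGB values sampled from box A and box B of the illusion:
box_a = (120, 120, 120)
box_b = (120, 120, 120)

# The computer compares the raw numbers and finds an exact match,
# unmoved by shadows or neighboring squares.
print(same_color(box_a, box_b))  # → True
```

The point is that the machine has no visual cortex to fool: it compares three numbers to three numbers and reports a match, while our eyes insist the squares differ.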
So which is it? After taking into consideration the results from the computer algorithm, would you still consider A darker than B? We don't know about you, but we still think A looks darker than B – call us crazy. But then that's what makes us human – we aren't perfect.
The sad reality is the computer is correct, and our perception is wrong. Our mind is being fooled by an illusion created by a vision scientist at MIT, Professor Ed Adelson. Dr. Adelson exploits local contrast between neighboring checker squares, and the mind's perception of the pillar as casting a shadow. The combination creates a powerful illusion that tricks every human mind. The human mind is, as succinctly stated by Duke Psychology Professor Dan Ariely, “predictably irrational.”
That may seem to be a strong statement.