The following is a rough transcript which has not been revised by The Jim Rutt Show or by Michael Mauboussin. Please check with us before using any quotations from this transcript. Thank you.
Jim: Howdy. This is Jim Rutt and this is The Jim Rutt Show.
Jim: Listeners have asked us to provide pointers to some of the resources we talk about on the show. We now have links to books and articles, and references to recent podcasts, all available on our website. We also offer full transcripts. Go to jimruttshow.com. That's jimruttshow.com.
Jim: Today’s guest is Michael Mauboussin of BlueMountain Capital.
Michael: Hey Jim, how you doing?
Jim: I’m doing great. How about you?
Michael: I’m great, thank you.
Jim: Great to have you. Prior to joining BlueMountain, Michael was Head of Global Financial Strategies at Credit Suisse and Chief Investment Strategist at Legg Mason Capital Management. I know Michael from the Santa Fe Institute, where he's been a long-time stalwart supporter. For the last seven years he's been chairman of our board of trustees. He is also the author of one of the best books on investing, More Than You Know: Finding Financial Wisdom in Unconventional Places. And he has a new book out, The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing, which we'll get to soon. But first, Michael, could you tell our audience a bit about the Santa Fe Institute? What do you find of value there for your work and for your life?
Michael: Well, first off, Jim, you didn't mention this, but of course I took over … I filled your very big shoes as chairman of the board. It's an institute that was founded about 35 years ago by a number of scientists who felt that academia had become too siloed and that many of the [inaudible 00:01:37] and important issues in the world stood at the intersection of disciplines. So it's a non-degree-conferring institute that works on basic research questions across disciplines. And the unifying theme, if there is one, is complex systems. So interactions of lots of little agents, what emerges from that, and how those things evolve over time.
Michael: My introduction to SFI was through Bill Miller, and it was one of these very specific moments I recall. We were at a baseball game and I had a beer in one hand and a hot dog in the other, and Bill said, "There's this place out in New Mexico you ought to check out." I was a financial analyst and I was reading a lot of stuff in science and biology, evolution, all this kind of stuff. He encouraged me to go out there, probably close to 25 years ago, and I was immediately enthralled, just some of the presentations, the way of thinking. And I think for me, one of the most extraordinary values of the Institute is just the fact that it draws actively open-minded, intellectually curious people.
Michael: So yeah, it's a neat place. And I think often people in the business world or finance world feel like some of the ideas or themes are not that applicable to business or investing, but if you hang around a little bit and have an open mind, there is just an enormous number of connections between the world of business, the world of investing, and the study of complex systems. So it's not only been personally very gratifying and a source of great growth, but also professionally really interesting and valuable.
Jim: Yeah, indeed. I have to say it's been an unfair weapon for me ever since I got involved, to your point about applicability. One of the points I like to make to people is that yes, sometimes the actual hardcore science may be a little difficult to apply, but the metaphors that come out of it are unbelievable. Things like fitness landscapes, increasing returns to scale, et cetera, that are somewhat different than the conventional way of looking at the world and are extraordinarily powerful lenses for thinking about both business and investing.
Michael: I agree with that 100%, and the way I think about it, not just in business or investing but in life, is that it's really good to try to build a toolbox so that when you face a problem, you have the right tool, in this case, perhaps picking up on what you said, a metaphor, to try to solve it or think about it effectively. So I think one of the joys of life, although it's a bit of work, is constantly working on your toolbox and thinking about different ways of approaching problems. I would just say that associating with SFI and the folks there is a great way to help build out that toolbox, and again, it's very intellectually fulfilling, but actually very practical as well.
Jim: Indeed. Now, before we hop into talking about your book, I'm sure our audience would also appreciate a little insight into your day job as a value investor. In prepping for the podcast, I happened upon a really interesting document you wrote titled Who Is On The Other Side. It's 51 pages, we'll put a link to it up on the website, and it's well worth reading. I thought it was just an amazing perspective on investing. Could you condense your perspective down to a few minutes?
Michael: Yeah, sure. And by the way, I also teach at Columbia Business School as an adjunct, so I'm a pretend professor, but it's been great. I teach the course in a number of different modules, and one of the modules is thinking about why markets misprice securities. This is something, over the years, Jim, that I typically pontificated about, but I had really done very little to codify my thinking. So I'm like, all right, after all these years, maybe I should sit down, actually get serious about this, and try to codify it.
Michael: The title, Who Is On The Other Side, came from another presentation I saw. If you're a poker player, which I know that you are, or at least have been in the past, when you're up against somebody, you want to think about: what is that other person thinking? What do they have in their hand? How are they approaching the world? It's very much true for investing. When you put on a trade, you buy or sell something. The question you should be asking yourself is, what do I know, or what do I think I know, that this person doesn't, and why are they motivated to do something on the opposite side of my trade?
Michael: So when I try to break this down, there are four potential areas of mispricing. The first is what we call behavioral: basically, we're social beings and investing is inherently a social exercise, and from time to time we correlate our behaviors, which leads to excesses in markets, or to the theme of overextrapolation. So behavioral, I mean, there are a lot of other subsets, but that's a big one.
Michael: The second one's analytical. So you and I have the exact same information, but we analyze it in different ways or come to different conclusions. You can imagine how that would work as well. The third one's informational, which is the most obvious: you know something that I don't know, or vice versa. Of course, governments around the world have tried to encourage companies to disclose information uniformly and at as low a cost as possible to get rid of that advantage, but there certainly can be informational advantages as well. And by the way, part of that is there's a huge literature on what we pay attention to. So it's not so much that the information is not out there; we just don't all weigh it the same way.
Michael: And then the final one is technical, which is that sometimes people have to buy or sell for reasons that have nothing to do with fundamentals. You have to pay your kid's college tuition, so you're going to sell just because you need to raise the cash to pay the tuition, and it makes no statement about value at all.
Michael: So I just tried to create this comprehensive framework and say, every time you put on a trade and you think you're going to generate an excess return, just give some thought as to who's on the other side of that and why you think you specifically have some sort of an edge. I just think it's a really rich way to think about things versus just thinking you're the most clever person walking around. And if you've been in markets for any period of time and you're not humble about it, then something's wrong. I mean, markets are the most humbling mechanism I can imagine.
Jim: Yeah, I can tell you I've had the same experience over my 35 years or so. And to your point, nothing drives me crazier than hearing two morons at a restaurant discussing the tip they got from their broker as if it were God's own truth.
Michael: Exactly. We're human, so it's all entertaining. We love stories and we love to defer to experts, to people who are "in the know." But yeah, we know in real life that doesn't work out so well.
Jim: Yeah. In fact, we’ll finish this section by a quote that you had in the paper from, I guess he’s your hero, Benjamin Graham. He said, “Have the courage of your knowledge and experience. If you have formed a conclusion from the facts and if you know your judgment is sound, act on it, even though others may hesitate or differ. You are neither right nor wrong because the crowd disagrees with you. You are right because your data and reasoning are right.”
Michael: Beautiful, right? And it's interesting that the course I teach at Columbia Business School is actually sort of the same course that Ben Graham started teaching in the 1920s. So it's been around at Columbia for many, many years. And I think if you read Graham's work, Security Analysis from 1934 and The Intelligent Investor, it's replete with examples that were contemporary at the time but feel very dated now. And a lot of his methods, I think, come across today as very dated.
Michael: But Graham's overarching lesson to all of us is really about temperament and the philosophy of how you approach investing in general. And that quote, I think, really embodies something that will ultimately be timeless, which is that people may go bonkers, but when you've done the work and the facts and your judgment align to suggest you should do something, you should just go ahead with some conviction. So the lasting lesson of Ben Graham is really about temperament and how you approach this philosophically. That's a powerful thing that people will still be talking about 100 years from now, I think.
Jim: And we'll talk about things related to it as we talk about your book. So now let's turn to The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing. I knew this was going to be interesting, and right up front I saw that you had said, "If you are the underdog, you want to inject luck by making the game more complex." This is a rule I've long found useful, and yet it does not seem to be something that most people intuitively know.
Jim: For example, chess. I'm a mediocre real-world chess player, but a pretty decent barroom chess player, mostly because when I discover that my opponent is better than me, which is fairly often, I just relentlessly raise the complexity of the middle game, and most people don't pick that strategy. I've got some other examples from business, but why don't you tell us your perspective on that? Make the game more complex when you're the underdog.
Michael: Yeah, I mean, that's awesome, Jim, and let me make a specific Santa Fe Institute connection here as well. Part of the motivation or the insight on that came from a conversation one day with Scott Page, now at the University of Michigan. Scott taught me about this thing called the Colonel Blotto game, which I'd never heard of. For those who are not initiated, Colonel Blotto was developed by a mathematician back in the 1920s. It's a piece of game theory, and it was actually quite popular at the [Rand 00:10:47] Institute in the 1950s.
Michael: Now, if you talk about game theory, everyone's heard of the prisoner's dilemma, but almost no one's heard of Colonel Blotto. Part of it is because the math was really difficult. And then, I don't know, 15 or 20 years ago, a young academic named Brian Roberson came along and mathematized the Colonel Blotto game, and that's sort of where this insight came from. So let me just describe the Colonel Blotto game very simply. The setup is you allocate to two parties a number of soldiers.
Michael: So Jim, you and I'll play. I'll give you 100 soldiers and I'll have 100 soldiers, and then we'll specify a number of battlefields. Let's call it three battlefields to start, as the initial condition. What we do, independently and blindly, is allocate soldiers to each battlefield. Then we lift the veil, and whoever has the most soldiers on a battlefield wins that battle, and whoever wins the most battles wins the war. So that's the basic setup. And the way I just described it, 100 soldiers versus 100 soldiers across three battlefields, that's basically rock, paper, scissors. There are some dumb strategies.
Michael: But for most strategies, it's just going to be a cycle of wins and losses. So that's the basic setup. Where Colonel Blotto gets really interesting is when you change it so there's asymmetry in the number of soldiers. Now I give you 150 soldiers and I have 100. There are still ways for me to win, but obviously many fewer. And then the second thing is to expand the number of battlefields. That's where Roberson's work got really, really interesting, because it turns out that in an asymmetric battle, the way you dilute the strength of the stronger player is to increase the number of battlefields, i.e., to add complexity, or, as you said, to add luck, as you mentioned with the middle game in chess.
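To make the setup concrete, here is a minimal Python sketch of the Colonel Blotto scoring rule. The allocations are made-up examples, and the code encodes only the rules described above, not Roberson's equilibrium analysis.

```python
def blotto_winner(alloc_a, alloc_b):
    """Score one Colonel Blotto game: the side with more soldiers takes a
    battlefield, and the side that takes more battlefields wins the war."""
    fields_a = sum(a > b for a, b in zip(alloc_a, alloc_b))
    fields_b = sum(b > a for a, b in zip(alloc_a, alloc_b))
    if fields_a > fields_b:
        return "A"
    if fields_b > fields_a:
        return "B"
    return "tie"

# Symmetric version: 100 soldiers each across three battlefields.
print(blotto_winner([34, 33, 33], [60, 20, 20]))  # "A": wins two of three fields

# Asymmetric version: B has 150 soldiers to A's 100.
print(blotto_winner([34, 33, 33], [50, 50, 50]))  # "B": more soldiers everywhere
```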
Michael: And so this is this really fascinating idea that when you're the stronger player, what you want to do is simplify the game, because then your skill or your strength will overwhelm your competitor. And if you're the underdog, you want to make the game more complex. Now, in the business world, we would talk about things like the work by Clay Christensen on disruptive innovation. The classic approach is to take a weird approach to trying to compete, not going toe to toe with the incumbent, but rather hitting at the edges. Again, making the incumbent do something they're not comfortable doing, or not interested in doing, or not incentivized to do. Okay, Jim, can I also mention something weird that I think you're going to really like, by the way?
Jim: Have at it.
Michael: I don’t think we’ve talked about this directly, but there’s a really interesting book by Roy Baumeister called Is There Anything Good About Men? Do you know this book?
Jim: Do not.
Michael: Okay, this is going to be up your alley. By the way, with a title like Is There Anything Good About Men?, people often joke it's going to be a pretty short book, but his point is that there actually is; men and women are different for a lot of interesting reasons. There's a chapter in there that's particularly interesting. One of the observations we know from life is that for most things, the averages for men and women are about the same. This is personality stuff, cognitive stuff, and so forth. But we also know that in many, many things, the variance of men is higher than the variance of women. So there are more men in the right tail and more men in the left tail, even if the average is the same.
Jim: Yeah, that’s a comprehensive social science result pretty much.
Michael: Correct. And so Baumeister asks the question: why is that? That's sort of a weird thing. In one of his chapters, he opens by saying, "Let me ask you a really simple question. If you think back to all of our ancestors, what percent were women and what percent were men?" And your intuitive answer is, "Well, everyone has a mother and a father, so it should be 50/50." And as you can guess where I'm going with this, the answer is that's not at all the case.
Michael: And this is based on genetics. No one really knows the exact number, but the ratio is something like two to one, women to men. So in other words, let's just call it two thirds of all ancestors were women and one third were men. Saying this differently, over the sweep of human history, and by the way, don't look at just humans, look at other primates or other mammals in general, it's the case that women have had roughly twice the probability of reproducing that men have.
Michael: So for the vast part of human history, most men were reproductive dead ends, and a few men reproduced a ton. One of the ways you can think about this is to model it as an option, like a financial option, where above a certain strike price you're in the money and below it you're out of the money. And if you want an option to be more valuable, how do you do that? The answer is you increase the volatility. Isn't that an interesting connection?
Jim: Aha, I love this.
Michael: I knew it was going to be up your alley. So the argument is something along the lines of mother nature saying, "Well, we know that most males are not going to reproduce." Forget about the contemporary world; going back over time, most males were not going to reproduce. So the strategy from a female's point of view is to reduce variance, and you're pretty much going to be in the money. For a male it's like, hey, you're probably not going to be in the money, so increase variance. So again, every time there's a conversation about there being too many men on the right side of that distribution, the first question should be, what about the left side of the distribution? Check that out.
Michael: And there are even more interesting things. This is, I think, a little bit more speculative, but on the math SAT, boys do better than girls on average by a number of points. The conjecture is that the reason is that something like 150,000 or 200,000 fewer boys take the SAT than girls, and the speculation is that the boys who don't take the SAT are the left tail. If they were compelled to take it, they would fill in the left tail, the mean would shift into line, and we'd have the same distribution we see everywhere else.
Michael: So anyway, this is an interesting idea. I think not only is it a cool idea from the point of view of the business world and, as you pointed out, strategy in general, or you playing chess, but mother nature figured this out too. I think one of the reasons we have high male variability is that that's the way you get a higher likelihood of being in the money with an option. How's that? How's that for a great idea?
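A quick way to see the options point is a small Monte Carlo: holding the average outcome fixed, raising the volatility raises both the chance of clearing a fixed "strike" and the expected payoff above it. The numbers below are arbitrary stand-ins for illustration, not anything from the book.

```python
import random

def option_value(sigma, strike=1.0, mean=0.0, trials=100_000):
    """Estimate P(outcome > strike) and the expected payoff above the strike
    for a normally distributed outcome with the given volatility."""
    hits, payoff = 0, 0.0
    for _ in range(trials):
        x = random.gauss(mean, sigma)
        if x > strike:
            hits += 1
            payoff += x - strike
    return hits / trials, payoff / trials

for sigma in (0.5, 1.0, 2.0):
    p, ev = option_value(sigma)
    print(f"volatility {sigma}: P(in the money) = {p:.3f}, expected payoff = {ev:.3f}")
```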
Jim: That's great. It's a perfect setup, because I have a note here that says that beyond raising the complexity, a very closely related corollary is: when you're behind, raise the variance. Similar, but not quite the same thing. And just a couple of factual items. Some mammals are way more asymmetrical in reproduction than even historical humans, for example the white-tailed deer. I'm both a deer hunter and I use them as my model species in my theories of conscious cognition. They're about 6x: only about one in six males reproduces. And of course, we've all heard the story of Genghis Khan. Supposedly 25% of the people in central Asia are descended from him. So even in humans, the numbers can be more asymmetric.
Michael: Right. And I'll just say that a lot of this, I think, makes sense, but it's somewhat speculative. Baumeister speculates that the ratio was much higher for humans as well, and that some of our more contemporary social mechanisms have narrowed the gap between men and women. So I don't know if we were ever in white-tailed deer territory. And again, if you look at other primates, you see very similar things. Even though there are a lot of differences within, for example, primates or other mammals, the basic setup is pretty widely seen in nature.
Jim: Yeah. There's actually a wonderful historical example from US history, which is the Civil War. The South was behind positionally, industrially, manpower-wise, all kinds of ways, but they had better generals, at least for the first part of the war, and those generals, particularly Lee, realized the only chance they had was to raise the variance, and hence the Battles of Antietam and Gettysburg, where Lee took big, huge risks and tried to end the war with a big roll of the dice.
Jim: It turned out that he wasn't quite successful, but at least they had a chance, and when the Union tried to match him smart play for smart play, they weren't successful. But when they got a guy who saw the biggest picture of all, Grant, and realized his job was essentially to not lose, to lower the variance, he gradually just squeezed the South until it collapsed. So there's a perfect example, both sides of-
Michael: I've got to jump in, Jim, because there's actually a whole book about asymmetric warfare, which is fascinating. This guy looked at asymmetric warfare going back 200 years and broke it into roughly 50-year blocks. And what he found was that in the ancient days, and by the way, it's a really interesting thing, when I say ancient days I mean a couple hundred years ago, a lot of warfare was less about actually winning and more about honor. So if you went off to war, you were there to serve your country and honor your family, and so people would go toe to toe with the stronger side and just get blasted.
Michael: And then people started to realize that if we want to win, we need to increase variance and do weird things, all these things like guerrilla tactics, or the different techniques you mentioned from the Southern generals. So this guy documented a 200-year sweep of asymmetric warfare, and it fits beautifully with the Colonel Blotto stuff, and it fits beautifully with what you just described. By the way, it's an interesting question: why have wars, in fact anything risky, predominantly sent men to do those things? Exploration, why do they send men and not women? The answer is because men are expendable. If you don't need that many of them to propagate the species, they're expendable, so send them off to do all the dirty jobs. Anyway, it's very interesting stuff.
Jim: Was that the book How the Weak Win Wars by Arreguín-Toft?
Michael: Very good man. Yes, it is. That’s the book.
Jim: And back to the point about men being expendable. There's a famous quote from Napoleon after a big loss in one of his battles. He was asked whether he was concerned about the long-term future of France because of these big losses, and he puffed himself up and said, "My soldiers will replace the losses tonight." Yeah, that's a good one. So let's get back to the main stream of what you were talking about in the book. At the very core of it is the difference between skill and luck. Maybe you could start by describing what those two things are and what you mean by skill and luck in the rest of the book.
Michael: Exactly. And I'll just say that I think there are a lot of aphorisms about luck: luck is where preparation meets opportunity, the harder I work the luckier I get, and things like that. I actually think a lot of those things have a good sentiment, which is basically to work hard, but they're ill-founded, I think.
Michael: Let's just take a moment to define both of these things. First, I would define skill right out of the dictionary, which is the ability to apply one's knowledge readily in execution or performance. So you know how to do something, and when you're asked to do it, you can do it on cue. You could think about a musician or an athlete or something like that. Luck is much more difficult to define, and by the way, this gets right into philosophy; you almost have to put a stake in the ground to limit the discussion. Nick Rescher, a professor of philosophy at the University of Pittsburgh, wrote a book about luck, and I rather like his definition.
Michael: He said luck exists when three conditions are in place. One, it happens to an individual or an organization, so you can identify the unit to which it happens. Second, it can be good or bad. I don't mean to leave anybody with the impression that it's symmetrical, because it's certainly not, and we can talk more about that in a few moments, but there is such a thing as good luck and bad luck. By the way, even if you look at the etymology of words related to luck, things like fortune and chance, many of them don't have that asymmetry; luck, I think, would. And the final condition, and this is the trickiest one, is that it's reasonable to expect a different outcome could have occurred.
Michael: So if you rewind the tape of time and play it again, it is reasonable to expect a different outcome could occur. Those, I think, are the basic definitions. And I think, Jim, if you try to get to the core of this book, there's one illustration, and it's easy for people to do it mentally as well, that I call the luck-skill continuum. You can imagine a continuum, just a straight line. On the far left, we'll call it all luck, no skill: activities where skill plays no role whatsoever in your success.
Michael: Things like lotteries or a fair roulette wheel would be examples of that. On the far right would be pure skill, no luck, where luck has no effect whatsoever. And again, nothing's perfect over there, but it would be things like running races or swimming races, and chess is probably close to that side of the spectrum as well. The key is that almost everything in life is between those two extremes, and the question is, where does your activity find itself on that luck-skill continuum?
Michael: And then from there spill out the kinds of strategies or approaches you should take to thinking about that particular activity as effectively as possible. So that's the basic setup: what is luck, what is skill? And by the way, by definition, this idea that you can create your own luck, if you accept what I just said, that's just not true. If there's anything you can do to improve your likelihood of success, that by definition is skill. That's something within your control.
Michael: So it goes back to the same thing, I think. The aphorisms about luck are often encouraging people to work hard or prepare, and those are all good sentiments, but they don't really pass the test if you think about defining these things with even a modicum of rigor.
Jim: We'll talk about this a little bit later, but in my own life, I used to have a presentation I gave to second-year MBA students called My Famous Career, where I pointed out that probably 60% of my successes were due to luck, but I also pointed out that I was prepared to take advantage of my luck in certain ways. The example I gave, especially for second-year MBA students, was that my wife and I always stayed debt free, or as close to it as we could, with money in the bank, even when we were young and poor, so that when opportunity came up we could just hop on it, again and again and again.
Jim: That not being nailed down interacted with luck in a very interesting way. So if you had five or six pieces of luck in your life, a lot of people couldn't have taken advantage of them because they were over-committed somewhere else. I think you can manage your interaction with luck.
Michael: Yeah, I agree with that. I would just say there's another, even more fundamental way to put it, which is to describe skill as what's in your control and luck as what's not in your control. So very much to your point, you're saying, "Hey, my skill was making sure that the things that were in my control were good: that I was financially flexible, that I kept building my mind and learning every single day to make myself more valuable in the workplace, and so forth. And then when some opportunity presented itself that was out of my control, I was in a position to take advantage of it."
Michael: I think that all hangs together reasonably well. And like you said, this idea of being prepared is, I would say, in your control, so it would fall, at least by my definition, under skill.
Jim: I like that.
Michael: This could be semantics, it could be semantics, but yeah, I think we’re saying something very similar.
Jim: I like that, I will frame that as the skill of taking advantage of luck.
Michael: Right, exactly.
Jim: And of course, that’s what poker is.
Michael: 100%.
Jim: To a significant degree. At least in the short term; long term, it's just skill. But in the short term, it's taking advantage of luck appropriately when it shows up. So let's turn now to a very interesting part of the book, and probably the one that a lot of people are going to find most useful, which is practical ways to disentangle skill from luck, which people aren't necessarily very good at.
Michael: Yeah. Well, I would just say that it's different for different things. And we have a chapter we dedicate to asking questions and doing this, but for the most formal way to do it, there's actually a really beautiful little theorem in statistics, which says that for independent distributions, the variance of distribution A plus the variance of distribution B equals the variance of A plus B. It's a really nice little property, and that's the main thing we use to try to codify this, for example in sports; I'll just use sports as the example.
Michael: So what's going on here? How do you do this? Let's just pick a sport, say basketball. In the NBA, they play 82 games. So you have this equation and you say, "Well, how can we sort it out?" The variance of luck is something we can work out; it's a modification of a binomial model. You basically say, "Let's model what the distribution would look like if every game were decided by a coin toss." That's not that difficult to do mathematically, and it gives me the first of my variances.
Michael: The second thing we have is the actual empirical results. We know the win-loss records of teams over time, and we usually use multiple seasons to take out some of the noise. So we have two of the components of the equation: the variance of what the world would look like if it were all luck, and the variance of the actual results. The way to plug the gap is to ask, what is the variance of skill that solves for that? So I think that's one of the classic techniques when you have data that are really clean, and I'll leave it at that. It's a really good, clean way to think about disentangling luck and skill, especially in activities that are tidy like that.
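As a rough sketch of the technique in Python: the "true skill" numbers below are invented stand-ins, and in practice you would plug real NBA win-loss records into the observed-variance slot.

```python
import random
import statistics

GAMES, TEAMS = 82, 30

# Luck variance: if every game were a coin toss, a season win percentage is
# Binomial(82, 0.5) / 82, whose variance is p * (1 - p) / n.
var_luck = 0.25 / GAMES

# Stand-in for the observed variance of win percentages across teams.
true_skill = [random.uniform(0.30, 0.70) for _ in range(TEAMS)]
win_pct = [sum(random.random() < p for _ in range(GAMES)) / GAMES
           for p in true_skill]
var_observed = statistics.pvariance(win_pct)

# var(observed) = var(skill) + var(luck), so back out the skill term.
var_skill = var_observed - var_luck
print(f"variance from skill: {var_skill:.4f}")
print(f"luck's share of observed variance: {var_luck / var_observed:.0%}")
```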
Jim: Very good. And then at a higher level, you talked about Nassim Taleb’s two by two matrix. So maybe you could talk about that a little bit.
Michael: Yeah, so I think Nassim's point is that the kind of thing I just described is very well suited to well-behaved distributions: normal distributions, win-loss records of basketball teams, and even to some degree things like the results of businesses or investors and so forth. But where it really breaks down is with distributions that are very fat-tailed, where we have very infrequent but very large effects. So my argument is that a lot of the stuff on luck and skill doesn't have that much to say about those fat-tailed parts of the world.
Michael: That said, there's so much here that is untapped, just in the regular, better-behaved parts of the world, that it's still incredibly relevant. So my whole thing on luck and skill is that I think very few people think enough about it, certainly not in any sort of rigorous way, and that it has applications to large swaths of the world. But just to be super clear, for this sort of Nassim Taleb fat-tail component, these tools are going to be much less relevant.
Michael: And again, there are lots of fat-tailed things. We could talk about this a bit too, like in the physical world with earthquakes and so forth. I'm talking mostly about social things. So for example, and we could talk a little bit about this, how do we think about things like the success of movies or the success of books? Those are almost impossible things to model because of the way the social dynamics unfold.
Jim: Yeah. Also, in the social world, wars tend to be fat-tailed, and revolutions, and probably even financial crises. I do think it's worth considering fat tails as they insert themselves into our real world.
Michael: For sure. I completely agree with what you said; I just want to draw attention to the idea that that may be beyond the purview of what we can really do with this set of tools. This actually would not be a bad time to talk a bit about something, and this also, for me, was a very Santa Fe Institute-influenced set of ideas. When we talk about the nature of luck, there are sort of two different ways to think about it. One is things that are largely independent. These are coin tosses, or even, if you model baseball players, batting averages and so forth. They're not perfectly independent processes, but for the most part that's a pretty good way to think about it and it'll get you pretty close to the truth.
Michael: But many of the interesting things in life are social processes, cumulative processes, and it's basically about how information propagates across a network. Those things are inherently very difficult to predict. As you know well, predicting which books will succeed, or which songs will be popular, or which movies will be blockbusters, is known to be inherently difficult.
Michael: There was one experiment done by one of our former postdocs, Duncan Watts, with Matthew Salganik and Peter Dodds, called Music Lab, where they basically got college students, thousands of them, and asked them to look at songs by unknown bands, just under 50 songs. And they did something super interesting: they had a control condition with 20% of the population, and the other 80%, in groups of 10% each, were assigned to eight social worlds. You could think about these almost literally as alternate universes.
Michael: And they said, "Listen to the songs and rate them, love it or hate it, and if you really like a song, you can download it." So this is just, tell us about your musical taste. In one of the extreme versions, by the way, they showed a leaderboard: what are the most liked songs, what are the most downloaded songs. The control obviously gave a sense of some sort of objective assessment of the songs. But what they found was that the pattern of how people liked and downloaded songs was extraordinarily important.
Michael: So just to be clear, the bad songs rarely did well in the social worlds, but if you were sort of in the top third, kind of anything could happen. There was one particular song I thought captured the whole experiment perfectly: in the independent condition it was smack dab in the middle, so just considered a completely average song. In one of the social worlds it was the number one hit, and in another of the social worlds it was in the very bottom decile in terms of popularity. So it just goes to show that if you rewound the tape of time and played it again, and asked the question, would Madonna be Madonna? Would Star Wars be Star Wars? Would Harry Potter be Harry Potter?
Michael: And the answer is highly, highly unlikely. And that leads to another component that's worth mentioning in all this, which is that we are all storytellers. We are all masters at creating narratives to explain things after the fact. So once something becomes popular, we all create a set of narratives to explain why it became popular, even though we are clueless about the underlying mechanisms. That to me is one of the most fascinating components of this thing on luck.
Michael: And again, there's a very close tie back to SFI here, to information propagating across a network: you just have to be very, very circumspect about your assessment of what has done well in a social realm, and of course about anticipating what will do well in a social realm. That to me was a really big eye-opener. You mentioned things like markets, and I think you could almost use the same kind of model, the propagation of information across a network, to understand how booms and crashes happen, these sorts of fat-tailed events.
Michael: So super, super interesting stuff. As you pointed out at the very outset, even if you don't know the details of how these models work, an appreciation of the very core idea itself is incredibly valuable.
Jim: That Music Lab result was amazing, and it confirmed one of my personal biases, which is that I hate to invest in those kinds of idiosyncratic pop-culture things. For instance, even though I did it once, mostly just to learn how to do it, the idea of betting on movies doesn't appeal to me in the slightest, while betting on studios might be reasonable. Your movie might be great and still not succeed for a whole bunch of reasons that have nothing to do with its innate quality. And that's just not the kind of thing I personally like to do, although obviously there are people who do it. I don't know how successful they are; that's another question.
Jim: Much better to own a studio and have a portfolio, and maybe be better on average at making movies. And then the other point I want to hop on: you mentioned narrative, and anyone who's listened to the show knows that we quite often talk about the power of narrative, and I've had various people bring their perspectives on it. As you started to allude to, and I'll probably get you to go a little further, it's the source of a fair number of human biases. I'm particularly interested in the root neuroscience of it.
Jim: Mike Gazzaniga's split-brain studies, which you referenced in the book, where when you separate the right brain from the left brain, if you provide something to the side of the brain that the verbal left brain can't see, the left brain will make up total bullshit to try to make sense of what it unconsciously knows. So maybe you can talk a little bit about how narrative is a great source of cognitive bias.
Michael: Definitely. So let me mention one thing first, and maybe we can put this in the notes too: on the website for The Success Equation, we have a little Pólya urn model, and this is a really good way for people to play around with this concept and to train their intuitions a bit. The way the model is set up, and you can specify the conditions as you see fit, is that we have marbles of different colors: five black marbles, four green marbles, three red marbles, and so forth. And then it's a pull-and-replace model.
Michael: What the simulation does is blindly pull one of the marbles, with some probability of each color being drawn, then match it with a marble of the same color, put both back in, and do the same thing over and over. The reason I did it that way is that the initial number of marbles is a proxy for "skill," and the process itself, the probabilistic draw, is where the luck comes in.
Michael: And I think one of the big lessons from that, Jim, just to riff on something you mentioned a moment ago, is that at a certain point these processes get locked in. Clearly, if the most popular marble gets picked and replaced, the probability of it succeeding again goes up, and you do get locked in after a certain number of selections. So that's a really interesting thing, that even things with varying skill … And that's why it's fun to play with the simulation, because you can do it a zillion times and see when the low-skill things become the most popular and so forth.
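A bare-bones version of that pull-and-replace process might look like the following sketch; the starting counts are the ones mentioned above, while the number of draws is arbitrary.

```python
import random
from collections import Counter

# Initial counts stand in for "skill"; the draw-and-duplicate step is where
# "luck" and cumulative advantage come in.
urn = ["black"] * 5 + ["green"] * 4 + ["red"] * 3

for _ in range(1_000):
    pick = random.choice(urn)   # draw a marble at random
    urn.append(pick)            # put it back along with one more of that color

print(Counter(urn).most_common())  # rerun it: the eventual "winner" varies
```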
Michael: This stuff on Gazzaniga and the interpreter, I love all of it, and you're exactly right; you described it incredibly well, so just to transition into that a bit. Michael Gazzaniga is a neuroscientist who did work on split-brain patients. These are people with debilitating epilepsy who have failed other treatments, and as a last-ditch effort, surgeons sever the corpus callosum. By the way, people do much better after this; it relieves the symptoms by stopping seizures from spreading from one hemisphere to the other, but it sets up this incredibly interesting experimental condition on modularity.
Michael: You can give a cue to the right hemisphere through the left eye, and the person will react to that cue, and when you ask them why they're doing it, they just make up a complete story, because the left brain is where your language resides. So they call this part of your left brain the interpreter, and the job of the interpreter is to close cause-and-effect loops. I throw some effect at you, some sort of outcome, and you are going to come up with a narrative to explain why that thing happened. And we all do it effortlessly and naturally.
Michael: When you read the split-brain literature, it's actually, as you point out, very humorous; these people come up with crazy stories to explain what's going on. But the key, more sobering point is that the interpreter is working in all of our brains all the time, the vast majority of the time very benignly, but of course we're also making up stories for things that make no sense whatsoever.
Michael: Now, here's the key for me: the interpreter doesn't know anything about luck. So typically when you see a good outcome, your brain is going to come up with a narrative explaining that there is something good behind it, and when you see a bad outcome, you assume that something bad is behind it. And this is where it gets into a very fundamental issue of process versus outcome. So I'll mention one of the biases that's a very powerful one, this idea called outcome bias: we tend to think that bad outcomes are the consequence of bad decisions and good outcomes are the consequence of good decisions, and that's clearly not always the case.
Michael: So outcome bias is a huge one. You mentioned, I know you have an affinity for poker, you mentioned poker a few moments ago. Annie Duke in her book Thinking in Bets calls this resulting; that's sort of the poker version of this outcome bias, and it's a really, really bad habit to get into. And that leads to another point on the luck-skill thing. When you're on the skill side of the luck-skill continuum, you don't need a lot of sample size to know what's behind the outcome.
Michael: So if you and I run a foot race, the faster guy is going to win and we don't have to speculate about whether it was skill or something else. By contrast, if you move over to the luck side of the continuum, it's all about process, and outcomes are very misleading because there's a lot of noise, or luck, distorting the actual signal in those kinds of situations. So that's another thing: when you're on the luck side of the continuum, where skill still plays a role over time but in the short term the signal-to-noise ratio is relatively low, you need to focus on process, and that's where this outcome bias, and the interpreter, become really, really problematic.
Michael: Because you're going to start to make associations that are unfair to make. And by the way, this is pervasive. It happens all the time in sports, where a coach will make a decision that's the right decision but it doesn't work out. It happens in business all the time. It happens in investing all the time. So the whole Gazzaniga stuff, the left-hemisphere interpreter, is incredibly important for people to understand.
Michael: By the way, Bob Shiller has got a brand new book out called Narrative Economics. I think we talked about this at the Santa Fe Institute symposium a couple of weeks ago, and that's a whole book dedicated to this idea of how narratives come to the forefront, demand our attention for some period of time, and lead to certain actions that may or may not make any sense. So I think this is an incredibly important thing, and a really crucial component of this whole discussion about luck and skill and the outcomes we see in the world.
Jim: Yeah, very good. In fact, Shiller's book is on my stack, soon to be read. And you're absolutely right about outcome bias: if you're not subject to it, it is a wonderful tool in the poker world. You see someone hit a hand on a bad decision and he thinks that's wonderful for him. I just sit back and rub my hands: okay, this is a biased mind for the next couple of hours. And of course, to do that you have to be very level-headed. I don't mind losing the right way in poker: I made the right play, I lost.
Michael: Can I tell a story? A Jim Rutt story?
Jim: Sure.
Michael: So you gave a great talk about how to succeed in business, and it had this sort of horrible name. What was it called, Shoot the Puppy or something like that?
Jim: Yeah. Shoot the Puppy.
Michael: Okay. It was great. I'm listening there, just an awesome presentation, so fascinating. And one of the stories, and correct me if I don't have this right, but one of the stories I believe you told was that as a younger man, you got enthusiastic about playing poker and you learned all about tells and the mathematics of the game and so forth. And in the evenings you would play, and you'd win some, lose some, but you got better and better and you played better and better players.
Michael: And then eventually, I think it was your uncle who pulled you aside and said, "Jim, you're a smart young man. Let me give you some advice. Instead of trying to play better players, just look for easy games," where you know your skill is going to be the best in the group. I just always loved that, and so I actually wrote a report called Looking for Easy Games, and I started it with the whole Jim Rutt story. The argument is that in investing, in business, in anything, if you want to win, one of the things to think about is: what game am I playing, and am I likely to be the smartest guy at the table?
Michael: I had this really interesting conversation with Annie Duke, the poker player, and she said that as a pro poker player, you figure out your winnings per hour. So I figure out whether it's worth me playing this particular table based on how much money I'm going to make. And being a professional poker player is not glamorous; you need sample size and you have to have your butt in the seat for a long time. She's obviously a very skilled professional, and she's figuring out her profit per hour.
Michael: And then she looks at another table that has lower stakes, and she figures, "I'm so much better than those players that my profit per hour is higher at the lower-stakes table. I'm making less per hand, but I'm going to win many more hands." So she says, "I'm going to allocate some of my time to this other table." And the other poker players watching are like, "What's going on? Why is she playing that table?" And she's like, "It's just because I'm so much better than those people, I can make more money per hour."
Michael: I just thought that was fascinating. And that's another thing: I talked about this idea of looking for easy games, and I'll relate it back to investing for one moment, because this is really interesting. One of the biggest trends in investing, as you know well, is the move toward indexing. In the last 10 years, literally trillions of dollars have been invested in index funds and ETFs and so forth, and that's largely been at the expense of active managers. So some people go, wow, the fact that everybody's taking their money away means it's going to be easier for us active managers, less competition.
Michael: And my response is it may be the exact opposite, and here's the reason. In investing, as in poker, the amount of money that walks into the room at the beginning of the night and the amount that walks out of the room at the end of the night is exactly the same; it's just in different hands, or in different pockets. The same is true for investing: excess returns by definition net to zero. So for you to win, someone else has got to lose by an equivalent amount. My argument is that if it's the weak players who are leaving the poker table of investing and putting their money in index funds, it's like they're showing up at your house on Friday night and just drinking your beer, but not putting up any stakes.
Michael: So you actually want to have the weak players at your table in order to generate excess returns. Now you're left looking at the other sharp card players around the table, and that actually makes your job harder, not easier. So there's a little provocative thought for you: is all this indexing actually making life easier or harder? If I use the Jim Rutt easy-games framework, it may be the case that my life actually got more difficult, not easier.
Jim: Ah, could be. You told the story correctly. If I can add a little corollary to it: we went out to Vegas for my mother's 80th birthday. Unlike me, she loves casino games. Me, I just shake my head; I'd rather stick my face in a disc sander than put a quarter in a slot machine. But when I did go down to the poker room, I used all my little cutting strategies. I went down at two o'clock in the morning after having had four hours of sleep and no alcohol.
Jim: And I walked around the poker room and just assessed each table using my little rule of thumb. I picked the table I wanted to sit down at, went up to the [inaudible 00:45:43], and said, "I'd like to be on table number seven." And he looked at me, and he obviously knew something, and he knew that I knew something. And he just smirked and said, "There's an hour-and-three-quarters wait for that one."
Michael: Exactly.
Jim: It was clearly the fish pond.
Michael: No easy arbitrage on that one.
Jim: Not at the Bellagio on the same weekend as the end of the World Series of Poker. Sorry.
Michael: Exactly.
Jim: Let's hop back now from these very interesting stories to a great tangible story that you tell in the book, which is the difference between finding, or drafting, I think was the word you used, punters versus receivers in football.
Michael: Yeah. So there's a great book by Boris Groysberg called Chasing Stars, and this is just a fascinating topic in general. The question is, if you want to improve the performance of your organization, how do you do that? One of the logical things is to look for a star at another organization and hire them, and this halo is going to come over and improve the performance of your organization as well.
Michael: So Groysberg studied this in particular in finance; actually, he studied analysts. And what he found was that these stars had a substantial degradation in performance at the new organization. So chasing stars did not work. Now, one of the things he does not point out, which I would say is probably a good part of this, is simple regression toward the mean, which is to say, if someone's a star at an organization, it's probably because they've been pretty skillful and really lucky. And since luck doesn't persist, there's going to be natural regression toward the mean.
Michael: But I think the point he makes, which I think is correct, is this idea that there tend to be organizational externalities that often people don’t take into consideration. So, if you’re a great trader at Goldman Sachs, there is something about what’s going on at Goldman Sachs that makes you very good at what you do. And when you go to bank X, those things are not around anymore, and hence you’re going to be much less effective. So that’s this idea of Chasing Stars.
Michael: And boy, there's a really interesting example. GE in the olden days used to be known as one of the great management training grounds, and Groysberg did a study there where a couple dozen GE managers who were considered to be really good left for other companies. About half of them went to organizations that were very much like GE and they did really well, and half went to organizations that were very different from GE and they floundered. So again, it's this idea of the organization being really important.
Michael: So going back to punters and wide receivers. Again, if you want to make your sports team better, how do you do this? What the research found was that wide receivers, when they went to new teams, regressed a lot; they didn't do as well as they did on their prior team. Now, part of this, again, I'm sure is just classic regression toward the mean, but part of it is that you need to learn new schemes, you have a new quarterback, all the stuff where you have to move up the learning curve, and you don't have the benefit of everything you knew at your prior organization.
Michael: The exception to the rule, interestingly, was punters, and I suspect this is true for field goal kickers as well: they were the same everywhere they went. The point is that they have very few interaction effects. You do need a guy to snap the ball to you, but that's basically it, and there are a lot of guys who can snap the ball with roughly equivalent skill. So punting was the same everywhere they went, because that was the one thing with almost no interaction effects. And that sort of strengthens the overall argument about how important the organization is to my success.
Michael: So that's a really good lesson in general: be skeptical about "hiring stars" as a sure way to lift the performance of your organization, because we tend to be insufficiently mindful of the role and the externalities of organizations and how they influence people's performance.
Jim: Ah, that's really useful. Now let's move on to the next item. You cite somebody on what he calls the most dangerous equation, basically the equation that says the variation of the sample mean is inversely proportional to the square root of the sample size. And again, this is something that drives me crazy in the scientific literature, especially the biomedical literature, even on the roughest calculations. These people are nuts; why are you giving me results for a sample size of seven? I just don't give a shit. It's all noise. Could you talk a little bit about the importance of sample size?
Michael: Yeah. And we'll send the article on it, because there's a nice little paper on this by a statistician at the University of Pennsylvania. This is an incredibly important idea, and by the way, it's a snag that has captured some very smart people, and I'm about to tell that story. So this is exactly to your point: when sample sizes are small, the variance tends to be large. And this is quite intuitive. If I flip a coin 10 times and you get seven heads, that happens, I don't know, something like 12% of the time.
Michael: So that doesn't happen frequently, but it's not infrequent either. If I flip a coin 10,000 times and it comes up 5,100 heads and 4,900 tails, that only happens about 2% of the time; that's very likely to be a biased coin. So we get confused about sample size and the variance implied by it. Let me tell you a great story about how that snagged some really smart people. The Gates Foundation, and these guys really are trying to do great work around the world, wanted to improve education in the United States.
Michael: There’s a pretty straightforward way to do this: let’s go find the schools with the highest SAT scores, the schools where the kids are really doing well. So they find the schools with high SAT scores, and it turns out these are small schools. Now, we’re going to come back to this, because you’ll notice they haven’t asked another question that they should have asked. So they go on a campaign, spending millions of dollars to break up schools to be smaller, because they’ve deemed small schools to be more effective.
Michael: In fact, there was a school in Seattle, near Microsoft, that they broke into six little mini schools, in order to take advantage of this idea. Well, what was the question they didn’t ask, Jim? The question they didn’t ask is which schools have the worst SAT scores?
Jim: And the answer is-
Michael: … it’s small schools, because they’re just capturing the tails of the variance, because of the small sample size, the small number of students. So when you look at the full distribution, you start to see a very different picture, and the idea that small schools in and of themselves are the answer is not correct at all. And when you think about it, larger schools have advantages. If there are a number of kids who can take fancy AP courses, calculus or whatever it is, you’re going to have a much larger population of kids who can avail themselves of that kind of course than you would in a small school.
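To see the effect Michael describes, here is a minimal simulation in Python with made-up numbers, not the Gates Foundation data: every school draws students from the same score distribution, so schools differ only in size, yet when you rank them by average score the smallest schools dominate both the top and the bottom of the list, purely because the standard error of a mean shrinks like 1/sqrt(n).

```python
# Toy illustration of the small-schools effect (hypothetical numbers).
import random

random.seed(0)
schools = []
for _ in range(1000):
    n = random.choice([50, 200, 1000, 3000])              # school sizes
    scores = [random.gauss(1000, 200) for _ in range(n)]  # same true quality everywhere
    schools.append((sum(scores) / n, n))                  # (average score, size)

schools.sort(reverse=True)                                # rank by average score
top, bottom = schools[:20], schools[-20:]
print("avg size of top-20 schools:   ", sum(n for _, n in top) / 20)
print("avg size of bottom-20 schools:", sum(n for _, n in bottom) / 20)
print("avg size of all schools:      ", sum(n for _, n in schools) / 1000)
```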
Michael: It’s a really interesting thing. And I’ll just say that psychologically this is really important when you think about the biases we have. We tend to be overconfident when we see small samples with strong signals. Going back to my coin toss example, when you see seven heads in 10 tosses, you start to get overconfident that it’s a biased coin, and you don’t take into consideration how little evidence you have, which is really a question of sample size.
Michael: By contrast, when we have a very weak signal, like my 51% heads with 10,000 tosses, but a very large sample size, we are underconfident that the coin is biased. When in fact, in case A it is very unlikely to be a biased coin, and in case B it’s very likely to be biased. So that’s a really interesting set of issues for us psychologically. And by the way, Kahneman and Tversky talked about this in their classic work: we take small numbers and extrapolate them in a way that’s inappropriate. So that’s an incredibly important life lesson.
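For readers who want to check the two coin-toss figures quoted above, here is a quick sketch (assuming scipy is available; both the exact-count and at-least-count probabilities are shown, since the rough figures in the conversation could refer to either):

```python
# The two coin examples, computed directly with a fair coin.
from scipy.stats import binom

# Case A: 7 heads in 10 tosses.
p_a_exact = binom.pmf(7, 10, 0.5)      # exactly 7 heads: ~11.7%
p_a_tail = binom.sf(6, 10, 0.5)        # 7 or more heads: ~17.2%

# Case B: 5,100 heads in 10,000 tosses.
p_b_tail = binom.sf(5099, 10000, 0.5)  # 5,100 or more heads: ~2.3%

print(f"P(exactly 7/10 heads, fair coin)  = {p_a_exact:.3f}")
print(f"P(>= 7/10 heads, fair coin)       = {p_a_tail:.3f}")
print(f"P(>= 5100/10000 heads, fair coin) = {p_b_tail:.4f}")
```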
Michael: And again, as you pointed out in your setup, you look around the world, whether it’s medicine or business, and you see it all the time: people take small sample sizes and extrapolate. A related concept is recency bias. You see this in crazy places where people should know better. A baseball player has a great season, or even a great few months before the end of the season, and all of a sudden his stock goes up and he can sign a more lucrative contract than he would have otherwise, and it makes no sense. But people fall for recency bias all the time.
Michael: So you just really want to think about sample size and how reliable your short-term results are in the context of longer-term performance.
Jim: Yeah. This has some huge social and political implications, and perhaps even world-historical effects. In the US we spend $30 to $40 billion at the National Institutes of Health, and for various institutional reasons, basically because there are a million biomedical researchers with their hands out looking for money, we tend to fund barely adequate sample sizes for biomedical research. But when you add in the noise from mistakes of experimental design, the reality is that an awful lot of it is just wrong.
Jim: For instance, a guy we had on this show, Brian Nosek of the Center for Open Science, was also the guy who started the replication project for psychology. He went and tried to replicate a bunch of psychology experiments from top publications, and only about 35% of them replicated. China, for instance, has chosen to take another approach in biomedical research, probably because their number of researchers wasn’t as large when they started: instead of having a sample size of 40 or 50, they’ll have a sample size of a thousand, which means they’ll do fewer studies, but the information from each study is much more valuable.
Jim: And I suspect that if the United States could learn that discipline, the large amount of money we spend on federal research would be better spent.
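As a rough, hypothetical illustration of Jim's point (not the actual NIH or Chinese study designs), the standard power calculation for comparing two group means shows how the smallest effect you can reliably detect shrinks as the sample grows:

```python
# Minimum detectable effect (in standard-deviation units) for a two-arm
# comparison of means, at alpha = 0.05 (two-sided) and 80% power.
from scipy.stats import norm

alpha, power = 0.05, 0.80
z_alpha = norm.ppf(1 - alpha / 2)   # ~1.96
z_power = norm.ppf(power)           # ~0.84

for n in [25, 50, 100, 500, 1000]:  # n per arm
    mde = (z_alpha + z_power) * (2 / n) ** 0.5
    print(f"n per arm = {n:5d}  ->  minimum detectable effect ~ {mde:.2f} SD")
```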
Michael: Yeah, I’m with you 100%. And these are fairly basic ideas, but remarkably they still don’t take hold. Well, it’s a mix of incentives and psychology and so forth, but yeah.
Jim: And institutional biases. You’ve got all the researchers out there with their hands out and nobody wants to say no. Right?
Michael: Exactly.
Jim: Yeah, indeed. Let’s move on to a couple of other interesting findings you talk about in the book. One that I thought was informed by all these things we’ve talked about is what you call the intelligence quotient versus the rationality quotient, why smart people do dumb things. I’m personally a fine example of that.
Michael: By the way, this is another topic I love, Jim. This is so fascinating to me. This is work done by a retired professor at the University of Toronto named Keith Stanovich. He wrote a book about 10 years ago called What Intelligence Tests Miss, and the core, very provocative claim is that IQ measures something, and this is controversial by the way, but it measures something real, something we can measure, and more of it tends to be better than less of it. But that should be distinguished from what he calls RQ, the rationality quotient, which is the ability to make good decisions.
Michael: And again, his core claim is that these are only partially overlapping skills, so you might even think about them as different sliders. And by the way, we don’t test for RQ by and large. So his argument is, and we all know this, there are people who do great in school, who are really book smart, but who can’t make decisions in the real world very effectively, and other people who are not geniuses but make good decisions day in, day out. Those would be indications of the differences on the IQ versus RQ sliders.
Michael: So this is, I think, a really interesting idea. And by the way, I think this idea of RQ is really central to being a great investor, certainly, but probably a great businessperson as well. So he went out and got a $1 million Templeton grant to develop what he calls the Comprehensive Assessment of Rational Thinking, CART, which is akin to an IQ test. And he wrote a book a couple of years ago that laid out essentially the skeleton framework of this Comprehensive Assessment of Rational Thinking. It’s totally cool, and it’s not yet out in the wild; they’re still using it only in academic settings, but I think it’s an incredibly interesting framework.
Michael: And it tests things like logic and numeracy; often people who do well don’t believe in things like fate and so forth. So I just think this is a really interesting area in psychology that will continue to be explored, and one with a lot of practical implications. Now, as a side note, this is just my belief on this, and I guess I should be more clear: the key to RQ is two things, and we’ll use some fancy language.
Michael: One is instrumental rationality, which means you can achieve your objectives given your constraints. The classic way we think about that in economics is the axioms of utility theory. But the second dimension is the much more important one, which is epistemic rationality, a fancy way of saying your beliefs map accurately onto the world. And that’s not easy, because the world changes, especially in the world of business or investing. So one of the keys is to be a good Bayesian and update your views as the world changes in front of you.
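A minimal sketch of the Bayesian-updating idea, with made-up numbers (a 5% prior that a coin is biased toward 60% heads, then seven heads in ten tosses); this is illustrative only, not anything from Stanovich's or Mauboussin's work:

```python
# Bayes' rule: update a prior belief in the light of new evidence.
from scipy.stats import binom

prior_biased = 0.05          # hypothetical prior that the coin favors heads
heads, tosses = 7, 10        # the evidence we observe

likelihood_biased = binom.pmf(heads, tosses, 0.60)  # P(data | biased coin)
likelihood_fair = binom.pmf(heads, tosses, 0.50)    # P(data | fair coin)

posterior_biased = (likelihood_biased * prior_biased) / (
    likelihood_biased * prior_biased + likelihood_fair * (1 - prior_biased)
)
print(f"prior P(biased)     = {prior_biased:.3f}")
print(f"posterior P(biased) = {posterior_biased:.3f}")
```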
Michael: A lot of this also comes down to things like calibration: that you’re well calibrated, so that when you say something’s going to happen with 70% probability, on average it happens 70% of the time. The other link I’ll make, which I love, and I’m a big fan of both of these guys, is a book by Phil Tetlock called Superforecasting, which you’re probably familiar with. They ran a forecasting tournament, it was open, and thousands of people participated.
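And a tiny sketch of the calibration check just described, using made-up forecast records: group predictions by stated probability and compare with how often the events actually happened. A well-calibrated forecaster's 70% calls come true roughly 70% of the time.

```python
# Calibration check on hypothetical (stated probability, outcome) records.
from collections import defaultdict

forecasts = [(0.7, True), (0.7, True), (0.7, False), (0.7, True),
             (0.9, True), (0.9, True), (0.9, True), (0.9, False),
             (0.3, False), (0.3, True), (0.3, False), (0.3, False)]

buckets = defaultdict(list)
for prob, outcome in forecasts:
    buckets[prob].append(outcome)

for prob in sorted(buckets):
    hits = buckets[prob]
    print(f"stated {prob:.0%} -> happened {sum(hits) / len(hits):.0%} "
          f"of the time (n={len(hits)})")
```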
Michael: And what they found was that a small sliver of that population, about 2%, were so-called superforecasters, people making very good forecasts. Their performance was persistent, way beyond what chance would dictate; they obviously tested for pure luck. They gave training to some people while a control group got none, and they did personality profiles of all the forecasters and came up with a sort of model of the characteristics of a superforecaster.
Michael: This circles back to things we talked about at the very beginning. These were people who were actively open-minded, and it turns out, I think, that the skills they had overlap almost perfectly with what the RQ test measures. They’re just people who are willing to change their minds and who are open to new information. In fact, the thing that correlates best with everything is Jonathan Baron’s idea of actively open-minded thinking, which means not only are you willing to entertain points of view that are different from what you believe, you actively seek them out, which by the way takes a lot of cognitive energy to do.
Michael: I mean, most of us just have a point of view on something, and once we’ve decided on it, we want to stick with what we believe and never change our view. By the way, I’ll just say, when I work out in the gym I often watch television, and I’ve taken up the habit of watching CNN for half the time and Fox News the other half of the time to see how the world works. And it’s a fascinating exercise, because people will be talking about the same topic with very different points of view. It’s definitely eye-opening and mind-opening to see different perspectives.
Michael: So I love this IQ versus RQ idea. I would keep it on your radar, because I think this is something that could very well become more and more mainstream. And I think for people, it’s just a great way of thinking about assessing other people’s capabilities beyond, “You went to MIT, so you’re a super smart guy.”
Jim: Absolutely, and that’s something I learned in the business world relatively quickly: yes, the credentials are interesting and a useful filter, but don’t let them over-filter. I didn’t have the terminology, but something like RQ is actually more important in the long run, it seems to me. Interestingly, you mentioned Tetlock. That was going to be next up on my questions. And while you talked about what he found in his superforecasters, you didn’t touch on the part I thought was so interesting, which is that in field after field after field, his research showed that most people forecasting the behavior of complex systems just plain suck at it.
Michael: Yeah. I first met Phil back in probably 2006 or so, and he wrote a book very much to your point called Expert Political Judgment. Phil’s a psychologist by training; he’s now at the University of Pennsylvania. He [inaudible 01:02:05] political science. As he tells the story, he was hanging out with a bunch of political scientists back in the 1980s, and they were all pontificating about different things, and he wondered to himself, does anybody actually keep track of what these guys say? And the answer was pretty much no.
Michael: So he embarked on what ended up being a 20-plus-year study of expert forecasts, and very much to your point, it covered political, economic, and social outcomes, which are sort of canonical complex systems. And these are experts, these are masters- and PhD-level people. He asked them to make very specific probabilistic predictions, and he did something very unusual, which was he actually kept track of them. And the results are very much what you described: they’re very bad at this. In fact, simple extrapolation algorithms tend to do as well or better, and very many of them don’t do much better than chance.
Michael: The other thing I’ll just remark about these experts is that they’re just like you and me: when they get something wrong, they have a whole litany of excuses as to why they got it wrong. “Just wait,” or, “I almost got that right.” My favorite excuse, by the way, was, “My forecast was so important it changed the course of world events.”
Jim: I love that one. I’m going to use that one. I like it.
Michael: You can’t play that card too often. But this is actually an important point. We had that symposium on complexity economics a couple of weeks ago at SFI, and I think many of these kinds of questions about expertise are incredibly domain specific. There are areas where experts actually are really good and where they tend to agree. In fact, in my book Think Twice, I have a chapter called “The Expert Squeeze,” where I discuss where experts are likely to continue to be important and where they’re not.
Michael: So look, if you have a tricky math problem or you need your plumbing fixed or something, there are experts you can call who can do that, the results are going to be consistent with one another, and they’re going to agree for the most part. But when you get into complex systems, as you point out, predicting markets or oil prices or geopolitical outcomes, it’s the complex systems Wild West. There you will find experts, equally credentialed, who will come up with diametrically opposite conclusions: oil prices are going to skyrocket, oil prices are going to plummet.
Michael: There we know their track records are really poor. And the other thing I’d just say is that we know psychologically we love to defer to experts. Why do we have the talking heads on CNBC every day? Because we love to listen to them, and we love to think that they know what they’re talking about. There’s something endearing about our desire to listen to the fortune teller, to some degree.
Michael: So my whole thing is: be super sensitive to the domain you’re thinking about, and recognize that in some domains experts are absolutely going to be great and useful and you’re going to want to defer to them, but you can’t extrapolate those abilities into the complex systems world, because it just doesn’t work. And again, your mind is going to want to believe, but you should suspend that belief and recognize that no one knows what the heck is going on.
Michael: So that was the first thing in Tetlock’s work. And by the way, I think he’s a very thoughtful guy, so there were even some silver linings in the original work on expert political judgment, but the headline is that there are very few experts who can predict well. Now I will say one thing, Jim, and this goes back to our conversation about the Santa Fe Institute: there were two remarkable findings in Tetlock’s work that were significant. One was, and I love this, the more media mentions a pundit had, the worse his or her predictions were.
Jim: I love it.
Michael: The people you hear and see most frequently are the worst when you actually objectively keep track of their records. Now, part of that is because if you’re in the media, if you’re a producer for a TV show or an editor of a newspaper, you kind of want people with somewhat crazy views of the world. You want boomsters and doomsters on your talk show in order to generate interest.
Michael: But the second thing he found was that what predicted forecasting success had less to do with your gender or age or political persuasion and much more to do with your way of thinking. This harkens back to the famous Isaiah Berlin essay on foxes versus hedgehogs. Hedgehogs are the people who know one big thing, and pretty much anything you chuck at them they’re going to pound into their worldview one way or another. And we all know the political hedgehogs.
Michael: The contrast is the foxes. These are people who know a little bit about a lot of different things. They tend to hold their views lightly, they’re constant learners, and they tend not to be too wedded to any particular point of view. And I think, for me at least, SFI tends to be the kind of place that draws foxes, which is always super fun. What Tetlock found was that hedgehogs will have their 15 minutes of fame.
Michael: So if you’re a boomster or doomster and the market goes up a lot or down a lot, you’re going to be in the spotlight for a moment, but over time it’s the foxes who tend to be better forecasters, because they’re more cognitively malleable, so as new information comes in they will actually update their views in both the direction and the magnitude that makes sense. So I love all that stuff. I think it’s really important.
Michael: Again, I think temperamentally some people are better at it than others. And I should have said this about RQ: I think it’s actually quite difficult to change people’s IQs that much, but there is some, I think, reasonably convincing evidence that RQ is something you can cultivate, both as an individual and even as an organization. And this idea of being foxy is also something you can probably cultivate as an individual. So that’s an encouraging note right at the end: this is really difficult, but we can probably do better than we’re doing.
Jim: Indeed. Let’s go on to another topic. This is something that, once I read it and thought about it, I said, “Sure, of course, that’s intuitively obvious.” But it’s an idea I had never come across before, and I think it would be very useful for people to add to their intellectual toolkits. And that is, you point out and give some examples that when skill starts to converge in a domain, luck actually becomes more important. Could you unpack that a little bit?
Michael: Jim, these are all the most fun topics. I want to say that I learned this originally from Stephen Jay Gould, who wrote a book called Full House in the mid-1990s, and Full House is really about this idea of distributions. I gave it the name the paradox of skill, but I want to be very clear that I got the idea from Gould, and I’m sure Gould may have gotten it from someone else. And just to rephrase what you said, the core idea is that in activities where both skill and luck contribute to outcomes, which is most of them of course, it can be the case that as skill increases, luck becomes more important. And that seems like a really weird finding.
Michael: And the key insight to this, as you said, becomes extremely intuitive once you explain it: you have to think about skill in two dimensions. The first dimension is absolute skill. And I think we would all agree that if you look around the world, whether it’s business or sports or investing or anything, the level of absolute skill has never been higher than it is today, which, by the way, you would expect because of cumulative human knowledge and technology and so on and so forth.
Michael: To be concrete, if I put you as an investor back in the 1960s with the tools at your fingertips today, you could run circles around your competition. You’d simply have better information, faster trading, and so on and so forth. And we see this, by the way, in sports, especially in things measured against a clock. People are constantly setting records for the fastest running time or swimming time, or what have you.
Michael: The second dimension, though, where it gets super interesting, is relative skill, which is the difference between the very best and the average. Now, Gould introduced this idea with the story of Ted Williams. Ted Williams was the last baseball player to hit over .400 for a season. He did that in 1941, when he hit .406. And Gould asks the question: why has no one hit over .400 since 1941? Certainly the players today are better than they were before; they train better, they have better nutrition, they have better coaching and so forth.
Michael: And his argument was that, at the end of the day, players are more uniformly excellent, and as a consequence the standard deviation of skill has gone down. To say this slightly differently and a little more technically, you can think about batting averages as a normal distribution. The mean is usually around .260, something like that, and by the way, the powers that be in Major League Baseball will actually change the rules, and have changed them over the decades, to make sure no persistent advantage goes to the pitchers or the hitters. So the mean is about .260, and then you can calculate the standard deviation around that mean.
Michael: And it turns out that Ted Williams was almost exactly a four-standard-deviation event in 1941, which was extremely rare; you would just say a lot of luck and a lot of skill, because there’s no way to get there without an abundance of both of those things. Now, in the most recent season, well, I didn’t do 2019, but in the 2018 season, if you were a four-standard-deviation hitter, which would be spectacular, you would hit something like .385, which is awesome. That wins the batting title hands down, but you don’t come near breaching the .400 level. And the reason is that the standard deviation of batting averages has gone down over time.
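The arithmetic behind those figures is easy to reproduce; this back-of-the-envelope sketch uses only the approximate numbers quoted above (a .260 mean and four-sigma seasons), not exact historical data:

```python
# Implied standard deviation of batting average, then and roughly now,
# treating .406 (1941) and ~.385 (recently) as ~4-sigma seasons.
mean_avg = 0.260

sigma_1941 = (0.406 - mean_avg) / 4    # ~0.0365
sigma_today = (0.385 - mean_avg) / 4   # ~0.031

print(f"implied sigma, 1941:    {sigma_1941:.4f}")
print(f"implied sigma, today:   {sigma_today:.4f}")
print(f"a 4-sigma season today: {mean_avg + 4 * sigma_today:.3f}")
```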
Michael: So the difference between the best player and the average player is smaller today than it was in prior generations. And you can test this in a bunch of other ways as well. Go look at the gold, silver, and bronze winners in the marathon, or the 100-meter dash, at the Olympics over time, and what you’ll find is that those athletes are a lot closer today than they were in prior generations. So we see basically parity in athletics in general. If you look at the sports leagues, the NBA maybe being an exception, hockey, the NFL, baseball, they’re all grinding toward parity. Part of it is because there are things like salary caps and so forth, but the bigger phenomenon is just that we’re drawing from larger populations of potential players, we train them better, we coach them better, they eat better.
Michael: As a consequence, excellence has become more uniform, which is really interesting. Perhaps where this is most powerful is in the world of investing. Investing is this arena where the cumulative knowledge of everybody goes into asset prices, and one thing we track is the standard deviation of excess returns, which has been marching steadily lower for the last 50 years. By the way, there was one marked departure from that trend, and that was around the dot-com period.
Michael: And interestingly, the dot-com period is when mom-and-pop came back into the stock market in droves. That set up, and this goes back to your discussion about easy games, a situation of skill versus non-skill. It turns out that the institutions ended up fleecing mom-and-pop, who then retreated after the dot-com bust, and then the standard deviation of skill went right back to its trend of shrinking over time. Isn’t that a fascinating thing?
Michael: So that’s the story of the paradox of skill. Once you hear about this idea, internalize the basic framework, and look around, you see it everywhere: absolute skill has never been higher, relative skill is shrinking, and luck is becoming more and more important in determining outcomes.
Jim: Yeah. Then the takeaway is even more reason to look for the easy game.
Michael: That’s really an overarching thesis in all of this: if you have, or believe you have, differential skill, you want to figure out where to apply it, and that’s a really big deal. I think people don’t think enough about that, and that’s why that Annie Duke story stuck with me so much. People thought she had sort of lost it when she went to play at the lower-stakes table, but she was doing a cold, hard calculation of profit per hour, and that ended up being the best place to be. So super interesting.
Jim: Indeed. Let’s hit a couple of other things here before we’re out of time. You had a section on how people lose skill with age, but so do organizations. What can you tell us about that?
Michael: So this is a little bit depressing, but what I tried to do in two chapters, one on luck and one on skill, is sort of characterize what they look like. Characterizing skill is actually not that difficult to do, and we can even talk it through: skill tends to follow an arc. Athletics is a good way to illustrate this, but we can talk about lots of other things as well. So let’s say you’re an athlete. When you’re young, you’re obviously not very skilled, you’re not very strong, and you improve and improve through practice and strengthening and maturation and so forth. Then you peak at some level, and then you come down the other side.
Michael: And by the way, across different pro sports the peak age of performance varies a little bit, but it’s usually somewhere in the mid-20s to 30s, something like that. So that’s usually the age of peak athletic performance, and then there’s a degradation, so older athletes tend not to perform as well. A lot of it is just basic plumbing: as you get older, your reaction time slows just a tiny bit, and of course, when you’re a professional athlete, those things make a huge difference in how you perform.
Michael: By the way, the same is true of cognitive performance. One of the theories on this is that your general intelligence is a function of your crystallized and fluid intelligence. Crystallized intelligence is exactly what it sounds like: your store of facts and knowledge. Absent any kind of cognitive impairment, that tends to grow until you’re well into your 70s. But fluid intelligence, which is your ability to deal with novelty, tends to peak when you’re young, actually in your early 20s, and declines over time. So you have these countervailing forces of crystallized going up and fluid going down, and that also creates an arc, which is interesting.
Michael: And then you mentioned organizations, and we see the same type of thing: organizational performance tends to also follow an arc. That’s harder to quantify, but as you know, when an organization is young it tends to be filled with a sense of purpose and to be more nimble, and as it gets larger it becomes more and more bureaucratic, partly out of necessity. One of the ways we can track that is with things like return-on-capital patterns in organizations and companies. Return on capital tends to start low when you’re getting going and in an investment phase, and then it tends to grow.
Michael: Then you get to a peak and then you come down, reflecting both the maturation of the markets in which you’re participating and the bureaucracy that almost inevitably grows within your own organization. So that tends to be the pattern, and that’s always a thing to think about: where is a company, where is an individual, on that arc of skill? Are they on the ascent or the descent, and what does that mean for the future?
Jim: Well, that might be a wonderful little tool for doing investment analysis. Say, “All right, here are the tells that let me know this company is past its prime.”
Michael: Yeah, and we actually do some work on that. What we typically do is look at the degradation, call it the fade, of return on capital over time. So if you have a high-return-on-capital business, how quickly will that regress back toward some sort of opportunity cost? And one of the things the empirical work tells you, which is going to be very intuitive when I say it, is that this differs by industry and sector, to state the obvious. Things like energy and parts of financial services tend to have very rapid regression, because it’s easy for people to mimic what you’re doing and it’s often a commodity type of business.
Michael: So replication is easy, and those returns get competed away quite quickly. By contrast, for other businesses, the classic example is consumer staples, you probably used Procter & Gamble products 30 years ago and you’ll probably use them 30 years from now, so those returns tend to be much more stable and the fades tend to be a little bit slower. So you’re right: part of it is just understanding what those curves look like and recognizing that they will vary by industry, justifiably so.
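One simple way to sketch the fade Michael describes (a toy model with hypothetical parameters, not his firm's actual methodology) is to let return on capital regress toward the cost of capital at a sector-dependent rate:

```python
# Exponential fade of return on invested capital toward opportunity cost.
def roic_path(roic_0, cost_of_capital, fade_rate, years):
    """Each year, the spread over the cost of capital shrinks by fade_rate."""
    path, roic = [], roic_0
    for _ in range(years):
        path.append(roic)
        roic = cost_of_capital + (roic - cost_of_capital) * (1 - fade_rate)
    return path

# Hypothetical parameters: a commodity-like business fades fast,
# a consumer-staples-like business fades slowly.
fast = roic_path(roic_0=0.20, cost_of_capital=0.08, fade_rate=0.30, years=10)
slow = roic_path(roic_0=0.20, cost_of_capital=0.08, fade_rate=0.05, years=10)
print("fast fade:", [round(x, 3) for x in fast])
print("slow fade:", [round(x, 3) for x in slow])
```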
Jim: Yeah, you’re right, you wrote about that, and in fact I even wrote about it. The only thing I ever wrote under SFI’s name was about how to think about exploration versus exploitation. Some people say, “Oh, there should be some set mix.” Wrong. It very much depends on the environment you’re in. If you’re operating a piton factory, you’re probably all about exploitation. If you’re working in the internet industry in 1998, you’re probably all about exploration, and it’s very important to get those things straight.
Michael: I do want to reread your paper, Jim, because I also want to come back to that topic. First of all, what you said is spot on, and it’s an absolutely fascinating concept. Then the question is, can you sit in a board meeting or talk to a CEO and actually articulate this in a way that really gets them thinking about it? It goes back to what you were talking about: institutional bias and inertia and all those kinds of things. It becomes really difficult, and given the perceived expectations of the financial community, it becomes very difficult for these guys. They sort of paint themselves into a corner and overdo it on one side or the other.
Jim: Interesting. Well, we’ve got time for one last item before we have to run out, and I’ll give you your choice. I’ve got lots of topics we did not get to, because your book is just such a mine of interesting things, but I’ll give you a choice of which of these you want to exit on. One: earnings per share, why is it such a shitty measure? Or second, and this would probably be my preference: it’s a term I did not know until I read your book, deliberate practice. You mention it many times throughout the book, and you actually spend a fair amount of time late in the book describing it in some detail.
Michael: Let’s start with the latter, deliberate practice. I think this goes back to skill and skill acquisition and even spills into this whole discussion about 10,000 hours. By the way, at SFI we have a really fascinating program on limits, you know this well: limits of all sorts, limits to knowledge, limits to performance and so forth. And we did a conference a couple of years ago on the limits of human performance, where we had everything from people talking about athletic performance to special forces folks to neuroscientists and so forth.
Michael: But one of the guys who kicked off our discussion was Anders Ericsson, who was most famous for the discussion about 10,000 hours. What Ericsson argues, and I’m going to make this a little bit of a cartoon version of it, but I don’t think it’s that far off, is that this idea of talent or skill doesn’t really matter. What matters is that you apply yourself for 10,000 hours of practice, and you will become elite in that particular activity.
Michael: So if you, Jim Rutt, want to get to Carnegie Hall playing the violin, it’s just 10,000 hours of deliberate practice and we’re going to get you there, right? Now, I think anybody in the real world recognizes that the notion that there is no talent or skill doesn’t make any sense. We all know people who have musical abilities or mathematical abilities or language abilities or athletic abilities. So I think that point is a bit of a nonstarter, but this idea of deliberate practice, I think, is really important.
Michael: And by the way, there’s a great book called The Sports Gene by David Epstein. David’s a great guy. His new book, which I also love, is called Range; I would highly recommend it. It’s also a very SFI type of thing: an argument about specialists versus generalists, and he argues, I think persuasively, in favor of generalists. But in The Sports Gene, David basically dismisses this idea and shows empirically that the 10,000-hour figure is not really true. It’s maybe an average, but the idea of deliberate practice is valid.
Michael: So what is deliberate practice? It says that if you want to get better at something, and this comes from Anders Ericsson, you want to operate at the edge of your capabilities, you’re getting very accurate and timely feedback, typically from a coach, and it’s actually not fun. So it’s easy to apply to things like sports: you’re a tennis player, you practice every day at the edge of your performance, you have a coach giving you timely and accurate feedback, and it’s hard work. There are also components like rest and so forth, and how those things fit in.
Michael: The idea is that most of us never actually get to the point of deliberate practice on most things. If you want to learn to drive your car, you get to be competent at it and you leave it there; good enough for most things is good enough. But to truly become outstanding at something, especially in a fairly defined domain, you need this idea of deliberate practice. So one of the things, and we could even muse about this together, Jim, right now, is: if that’s a valid construct for how to get really good at something, what is the equivalent of deliberate practice in business?
Michael: So you’re going to coach an executive: how do you coach that individual so that he or she can actually become more skillful? In investing, what does deliberate practice mean? Investing is an inherently noisy exercise, so it’s very difficult to give people clean feedback. Business is very noisy too, and it’s hard to give people clean feedback. So that’s a really interesting open question: are there methods or techniques we could use to get more effective at prescribing deliberate practice in domains that are not very tidy, where the feedback isn’t clear and so forth?
Michael: So I think it’s an incredibly interesting topic, and it gets into things like how we train. At SFI we did a fun little session on agent-based models, and part of the power of agent-based models is that they allow you to expand your mind and understand how certain causes and effects come out of these simple little models. It just changes the way you think about things. And I think this whole idea of deliberate practice is another really cool area we could probably explore.
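For readers unfamiliar with agent-based models, here is a minimal, self-contained example in the spirit of that remark (a classic Schelling-style segregation model, not the SFI session's actual code): a mild individual preference for similar neighbors produces strong aggregate clustering, exactly the kind of surprising cause-and-effect these simple models reveal.

```python
# Minimal Schelling segregation model: agents of two types move to a random
# empty cell whenever too few of their neighbors are like them.
import random

random.seed(1)
SIZE, EMPTY, THRESHOLD = 20, 0.1, 0.3   # grid size, share empty, min share of like neighbors

grid = [[random.choice(['A', 'B']) if random.random() > EMPTY else None
         for _ in range(SIZE)] for _ in range(SIZE)]

def unhappy(x, y):
    """An agent is unhappy if fewer than THRESHOLD of its occupied neighbors match it."""
    me = grid[x][y]
    neighbors = [grid[(x + dx) % SIZE][(y + dy) % SIZE]
                 for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                 if (dx, dy) != (0, 0)]
    same = sum(1 for n in neighbors if n == me)
    occupied = sum(1 for n in neighbors if n is not None)
    return occupied > 0 and same / occupied < THRESHOLD

for step in range(50):
    movers = [(x, y) for x in range(SIZE) for y in range(SIZE)
              if grid[x][y] and unhappy(x, y)]
    empties = [(x, y) for x in range(SIZE) for y in range(SIZE) if grid[x][y] is None]
    if not movers:
        break
    random.shuffle(movers)
    for (x, y) in movers:                       # each unhappy agent relocates
        ex, ey = empties.pop(random.randrange(len(empties)))
        grid[ex][ey], grid[x][y] = grid[x][y], None
        empties.append((x, y))

print("unhappy agents remaining:",
      sum(1 for x in range(SIZE) for y in range(SIZE) if grid[x][y] and unhappy(x, y)))
```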
Jim: To your question about business: as you know, I do a little CEO coaching from time to time, and what I have decided for my CEO coaching, which is the closest thing to this idea of deliberate practice that I can think of from my own experience, is to focus relentlessly on time management. Interestingly, when you say time management, that sounds boring as shit. But in reality, I’ve concluded that time management is really how we operationalize our priorities, and our priorities are critical. Writing your priorities on the whiteboard is one thing; actually living them is another.
Jim: So whenever I’m doing a CEO coaching engagement, I start off with a priorities exercise and then a time management exercise, seemingly separate. Then I point out the contradictions to the person I’m coaching, and then I provide a prescription for how they should approach their time management. I give them things like, “Take this piece of paper that you just created and tape it to your computer so you see it several times a day. And think of time management as the deliberate practice that you have to force yourself to do, against your own inclinations and likes and dislikes, if you’re going to actually achieve the priorities that you really care about.”
Michael: Yeah, I love that. And I’ll just mention something completely related: there was a really interesting survey of investment committees, these are endowments and pension funds and so forth making investment decisions. They asked each member of the investment committee, “What should we be doing? Rank these things in order.” And then they actually kept track of what the committees did, and it was almost the flipped list. So it’s the same thing as time management: why are we spending time on things we shouldn’t be spending time on, and why are we not spending time on the things we should be?
Michael: But it goes back a little bit to part of the challenge, and I’m sure you see this even in your CEO coaching. Now you’ve got your time priorities correct, but sometimes you make the right decision and it doesn’t work out, and that’s where you have to say to people, “You did the right thing; dust yourself off and go do it again tomorrow.” And sometimes you make a dumb decision and it works out, and you have to say, “Wait a second, recognize that you were just completely lucky on this one. Next time, think about it more effectively.”
Michael: That’s also this idea of getting people to see through the outcomes and saying, “Let’s make sure you’re focused on the proper process.” Anyway, I think that’s another area where it could really be helpful to get people to keep focusing on process as the means to successful long-term outcomes.
Jim: In fact, I do exactly that. And that’s one of the advantages I have, having had a zillion years of business experience with a large number of businesses: I’ve definitely said both of those things. One is, okay, you failed, but you failed honorably and intentionally, so the next time that situation comes up, do the same thing, i.e., follow the same process, and over time you will be a winner. And on the other hand, I’ve definitely told people, “Jesus Christ, did you luck out. Don’t do that again, because next time you’re probably going to fall on your ass.”
Jim: Well, anyway, Michael, we’re at about the hour-and-30-minute mark, and I’ve found that’s about as long as I can do a good job. I’m sure you could go on for hours, but I have to keep my focus up too, and frankly, looking at the statistics, I think that’s about as long as our audience can tolerate. But as I expected, this has been intense, interesting, valuable, and all those things we’re trying to achieve here.
Michael: Well, thank you so much, Jim. It was a real blast, and we did talk about a lot of really fun topics and hopefully useful topics for people out there in the real world.
Jim: Yup. And people, don’t forget to buy Michael’s book. Why don’t you give them the title?
Michael: The Success Equation. Easy to find anywhere. Amazon.com is a good place to go.
Jim: Definitely read it. I found it very worthwhile. Production services and audio editing by Jared Janes Consulting. Music by Tom Muller at modernspacemusic.com.