Why you should not embrace risk

Don’t ask me why, but I was reading this goofy article by John Snobelen this morning. (Snobelen, for those who don’t remember, is the high-school dropout who became Minister of Education in the Ontario government of Mike Harris, signalling the triumph of “common sense” over the petty reign of us pointy-headed intellectuals.)

In it, Snobelen tells us about a meeting he recently attended, “a small gathering of business leaders from across North America. We huddled for a couple of days in New York to work on our futures.” There is a bit more blah-blah, until he gets to this part: “Extraordinary leaders, they share three characteristics: They have had great success, they embrace big risks and they are highly self-aware.”

The bit about “embracing big risks” is what caught my eye. This is a line that I must have heard a thousand times, in stuff on leadership and success. People are constantly being told to take bigger risks. (One example, picked almost at random: Certified Core Belief Engineering Practitioner Esther Bartkiw assures us that “To have an amazing life, embrace risk”.) Why? Because if you look at successful, amazing people, one characteristic they all share is having taken big risks.

So what’s the problem with this? It may sound like common sense, but in fact it’s terrible advice. That all successful people took big risks is a textbook instance of selection bias (specifically, survivorship bias): we only ever observe the risk-takers who happened to win. To see why this matters, you only have to stop and consider how many people took big risks and didn’t succeed.

Suppose you have 100 people, 80 of whom are perfectly rational, 20 of whom are not. The rational ones take whichever option maximizes their expected utility. The irrational ones, by contrast, are drawn to the lure of the “big score.” As a result, they are inclined to take big gambles, where the reward is enormous but the probability of success is so small that they would actually be better off, ex ante, taking a safer option (with a smaller potential reward, but a greater chance of success). Think of someone who prefers a 5 per cent shot at $1 million – worth only $50,000 in expectation – over a guaranteed $100,000.

For concreteness, imagine that these 100 people are all account managers, trying to cultivate new clients, deciding whether to accumulate a number of smaller clients or go for the one “big fish.” (Or suppose they are 100 academics, trying to decide whether to publish a number of articles in mid-range, respectable journals, or to invest all their energy in a single paper, to be published in the very very best journal.) Let us assume that the 80 rational ones all accumulate a modest record of success. Of the 20 irrational ones, let us suppose that 19 of them flame out, ending up with nothing. One of them, however, the lucky one, gets the big score (the big client, the marquee publication, etc.).

And now consider the senior manager, trying to decide which of the account managers to promote. (Or consider the academic department, trying to decide which junior scholar to hire into the tenure-stream position.) Obviously you promote the one who got the big score, and you fire the other 19 who wound up with nothing to show for their efforts – because, typically, you do not have enough information to assess the ex ante rationality of their choices. So the irrational have an advantage over the rational when it comes to promotion. But this hardly makes it good advice, for the aspiring account manager, to be irrational.
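For the numerically inclined, here is a minimal simulation of this toy scenario. The particular payoffs and probabilities are my own illustrative assumptions, not anything from the post: a guaranteed modest payoff for the rational, versus a long-shot gamble with half the expected value for the irrational.

```python
# A sketch of the selection-bias argument above. Assumed numbers:
# the "safe" strategy pays 10 with certainty; the "risky" strategy
# pays 100 with probability 0.05 (expected value 5), so the gamble
# is irrational ex ante, with half the expected payoff of playing safe.
import random

random.seed(1)

N_RATIONAL = 80      # take the safe option
N_IRRATIONAL = 20    # chase the big score

SAFE_PAYOFF = 10     # guaranteed modest success
BIG_PAYOFF = 100     # the "big fish"
P_BIG = 0.05         # chance the gamble pays off

def run_cohort():
    """One cohort of 100 account managers and their outcomes."""
    outcomes = [("rational", SAFE_PAYOFF) for _ in range(N_RATIONAL)]
    for _ in range(N_IRRATIONAL):
        payoff = BIG_PAYOFF if random.random() < P_BIG else 0
        outcomes.append(("irrational", payoff))
    return outcomes

# The senior manager promotes whoever has the single best record.
trials = 10_000
promoted_irrational = 0
for _ in range(trials):
    best = max(run_cohort(), key=lambda o: o[1])
    if best[0] == "irrational":
        promoted_irrational += 1

print(f"Gambler promoted in {promoted_irrational / trials:.0%} of cohorts")
# At least one of the 20 gamblers wins with probability
# 1 - 0.95**20, about 64%, so the promotion usually goes to
# someone whose strategy was, ex ante, the worse one.
```

Run it a few times: in roughly two-thirds of cohorts the corner office goes to an irrational risk-taker, even though every rational agent had double the expected payoff.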

Incidentally, I think there are many competitive environments in which being rational can put one at a disadvantage. This can have important aggregate effects. Suppose, just hypothetically, that women are more rational than men when it comes to assessing certain sorts of career risks (for example, suppose, again just hypothetically, that men are more inclined toward irrational overconfidence). This could easily produce a situation in which men dominate the upper echelons of an organization (and are more likely to flame out – but that is less visible). Again, this is just a hypothetical example, one that I’m sure bears no relationship to the real world.

All of this is something that I have a lot of personal experience with, since I took a huge risk earlier in my career, which has paid off quite handsomely. While all my friends were playing it safe, doing degrees in law, medicine, or commerce, I took the ultimate career risk. I did a PhD in philosophy – which, as everyone knows, is a far more competitive field, but one that offers much greater rewards (i.e. being paid to sit around and think great thoughts all day).

And of course, if you look at my department, one characteristic that the professors all share is that they took great risks – after all, everyone has a PhD in philosophy, and that’s about as risky as it gets, in terms of career development strategy. But does that mean that “in order to be a great philosopher, you must embrace risk”? Um, no. Actually, most of us are here because we were either oblivious to the consequences of the choices we were making, or else irrationally self-confident (that would be me). My guess is that a lot of the guys in Snobelen’s little conclave are the same.
