Monday, March 31, 2014

Charlie Munger's #1 Intellectual Sin

Ignoratio Elenchi: The Fallacy of Irrelevance


If you're going to use a latticework of mental models as Charlie Munger suggests, you'd better know which model(s) to apply and when. British philosopher Michael Oakeshott uses the concept of ignoratio elenchi in a way that's helpful in determining how to avoid the biggest mistake you can make in using mental models.


Munger's advice in trying to figure out what to do is to, "Quickly eliminate the big universe of what not to do, follow up with a fluent, multidisciplinary attack on what remains, then act decisively when, and only when, the right circumstances appear." Michael Oakeshott points to one of the chief obstacles to eliminating what not to do (and, as a result, to acting decisively in the right circumstances).


Oakeshott writes, “confusion, ignoratio elenchi, is itself the most fatal of all errors, and that it occurs whenever argument or inference passes from one world of experience to another.” Oakeshott calls this "the most subtle and insidious of all forms of error—irrelevance.” Ignoratio elenchi, literally “ignorance of refutation,” is a type of logical fallacy that doesn’t necessarily entail an invalid argument. Oakeshott often refers to it as the “fallacy of irrelevance”; it is sometimes also known as the “fallacy of missing the point.”
Michael Oakeshott's WWII papers. Famous for arguing that ignoratio elenchi was the most fatal of all errors, Oakeshott served with the British “Phantom” reconnaissance unit, where such confusion might literally have proved fatal.

Few things, I imagine, must grate on Charlie Munger's ears more than people making this kind of mistake. In a previous post on Munger and Harvard Business School Professor Clayton Christensen, I argued that one of the hardest parts of using Munger's mental models system is knowing when and how to apply models from across disciplines. Without going too far into Oakeshott's idea of human experience being understandable in various 'modes' (an idea that predates but strongly resembles Stephen Jay Gould's concept of “non-overlapping magisteria”), Oakeshott's peculiar use of ignoratio elenchi can teach us a bit about how to assemble, understand, and deploy something like Munger's suggested latticework of mental models.
Many who study philosophy take Oakeshott’s ignoratio elenchi to mean the same thing as Gilbert Ryle’s ‘category mistake.’ Munger would be wary of both kinds of errors, but it's Oakeshott's idiosyncratic use that, I think, is more instructive. The guidance that Mason Meyers got from Clayton Christensen's class at Harvard was designed to explicitly instruct people on how to avoid Ryle's category mistake. Though this type of mistake is common, I think it's pretty easy to eliminate. It's the error in judgment that Oakeshott points to that's more difficult.
A humorous depiction of Oakeshott's fallacy of irrelevance
 


Ignoratio elenchi is usually thought of as a logical, not an epistemological, term that indicts any argument that fails to establish its relevant conclusion. Of course, many instances of this kind of error result from ignorance—people just don’t know better. (It’s this kind of ignorance that Clayton Christensen’s Harvard Business School class was designed to eliminate.) Because of this fact, it’s easy to think of this kind of mistake in epistemological terms. But the fallacy itself doesn’t involve epistemology.

One can commit Ryle’s category mistake, for example, without making any argument at all; simple propositions can do this: “I hear the Zebra light warm on my skin,” for example. While single propositions can make category mistakes, those mistakes can’t be instances of ignoratio elenchi.

It is impossible to hear light—of a Zebra variety or otherwise—on one’s skin. Oakeshott, though, thinks of the fallacy of irrelevance in a different way, and the unusual way he uses the term seems to me to resemble the way Munger thinks of mistakes in practice. Oakeshott thinks that fallacies of irrelevance often occur because of conceptual misunderstandings. He regards the ‘philosophy of history’, for example, as such a fallacy. Paul Franco describes the reason: “The philosopher can have nothing to do with the abstractions of history. And the philosophico-historical speculations of the philosopher of history can only be irrelevant to the historian (or possibly the object of legitimate derision). Because history is a world of abstractions, ‘those who have made it their business to press experience to its conclusion can choose only between avoiding it or superseding it.’”

Oakeshott, like Munger, is no fan of too-strong ideology. Oakeshott’s aversion to ideology is evident in his assessment of the fallacy of irrelevance. Political ideology, he thinks, is not the spring of political activity but only (and always) the product of subsequent reflection on such activity. Oakeshott argues against the understanding of political activity that ideological politics implies—defective because it misdescribes what really goes on in political activity. Ideological politics are not just undesirable—they’re impossible. And “to try to do something which is inherently impossible is always a corrupting enterprise”, for Oakeshott just as much as Munger.

Nate Silver’s great new site 538 had MIT professor Kerry Emanuel respond to a recent piece the site had run on disaster costs and climate change. Here’s a passage that points to both Ryle’s and Oakeshott’s fallacies of irrelevance. It demonstrates both how easy it is to make such mistakes…and how potentially, well, disastrous making them can be:


from: http://knowmore.washingtonpost.com/2014/03/31/this-map-shows-how-the-world-has-been-hurt-by-climate-change-so-far/
The Washington Post weighs in on how and where climate change has hurt us.


“Let me illustrate this with a simple example. Suppose observations showed conclusively that the bear population in a particular forest had recently doubled. What would we think of someone who, knowing this, would nevertheless take no extra precautions in walking in the woods unless and until he saw a significant upward trend in the rate at which his neighbors were being mauled by bears?

The point here is that the number of bears in the woods is presumably much greater than the incidence of their contact with humans, so the overall bear statistics should be much more robust than any mauling statistics. The actuarial information here is the rate of mauling, while the doubling of the bear population represents a priori information. Were it possible to buy insurance against mauling, no reasonable firm supplying such insurance would ignore a doubling of the bear population, lack of any significant mauling trend notwithstanding. And even our friendly sylvan pedestrian, sticking to mauling statistics, would never wait for 95 percent confidence before adjusting his bear risk assessment. Being conservative in signal detection (insisting on high confidence that the null hypothesis is void) is the opposite of being conservative in risk assessment.”
I don't believe the old saying that "there are lies, damned lies, and statistics," but where there are fallacies of irrelevance, there are likely to be only lies, regardless of whatever rigor, statistical or otherwise, the analysis possesses.
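Emanuel's bear arithmetic can be made concrete with a quick simulation. The sketch below uses hypothetical numbers of my own choosing (a base rate of 2 maulings per year, a 2-year observation window): the a priori information (a doubled bear population) doubles the expected number of maulings immediately, yet a test that waits for 95 percent confidence in the mauling statistics frequently fails to notice anything at all.

```python
import math
import random

random.seed(0)

# Hypothetical numbers, for illustration only.
base_rate = 2.0   # assumed maulings/year before the bear population doubled
new_rate = 4.0    # population doubled -> a priori the mauling rate doubles too
years = 2         # short window of "mauling statistics"

def poisson_draw(lam):
    """Draw from a Poisson distribution (Knuth's multiplication algorithm)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam)."""
    return sum(lam**i * math.exp(-lam) / math.factorial(i) for i in range(k + 1))

# Under the null hypothesis (rate unchanged), total maulings over the window
# are Poisson(base_rate * years). Reject only above the 95th percentile.
lam0 = base_rate * years
threshold = 0
while poisson_cdf(threshold, lam0) < 0.95:
    threshold += 1

# Monte Carlo: how often does the naive test detect the doubled true risk?
trials = 10_000
detected = sum(poisson_draw(new_rate * years) > threshold for _ in range(trials))
print(f"Expected maulings over the window rose from {lam0:.0f} to {new_rate * years:.0f},")
print(f"yet the 95%-confidence test notices only {detected / trials:.0%} of the time.")
```

Even though the insurer's expected loss has doubled the moment the bear census comes in, the significance test misses the change well over half the time at these (made-up) rates. That is Emanuel's point: being conservative in signal detection is the opposite of being conservative in risk assessment.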
