Monday, March 31, 2014

Charlie Munger's #1 Intellectual Sin

Ignoratio Elenchi: The Fallacy of Irrelevance


If you're going to use a latticework of mental models as Charlie Munger suggests, you'd better know which model(s) to apply and when. British philosopher Michael Oakeshott uses the concept of ignoratio elenchi in a way that's helpful in determining how to avoid the biggest mistake you can make in using mental models.


Munger's advice on figuring out what to do is to "Quickly eliminate the big universe of what not to do, follow up with a fluent, multidisciplinary attack on what remains, then act decisively when, and only when, the right circumstances appear." Michael Oakeshott points to one of the chief obstacles to eliminating what not to do (and, as a result, to acting decisively in the right circumstances).


Oakeshott writes, “confusion, ignoratio elenchi, is itself the most fatal of all errors, and that it occurs whenever argument or inference passes from one world of experience to another.” He calls this “the most subtle and insidious of all forms of error—irrelevance.” Ignoratio elenchi, literally “ignorance of refutation,” is a type of logical fallacy that doesn’t necessarily entail an invalid argument. Oakeshott often refers to it as the “fallacy of irrelevance”; it is sometimes also known as the “fallacy of missing the point.”
Michael Oakeshott's WWII papers. Oakeshott famously argued that ignoratio elenchi was the most fatal of all errors; in his line of work with the British “Phantom” reconnaissance unit, such confusion might literally have proved fatal.

Few things, I imagine, must grate more on Charlie Munger's ears than people making this kind of mistake. In a previous post on Munger and Harvard Business School Professor Clayton Christensen, I argued that one of the hardest parts of using Munger's mental models system is knowing when and how to apply models from across disciplines. Without going too far into Oakeshott's idea of human experience being understandable in various 'modes' (an idea that predates but strongly resembles Stephen Jay Gould's concept of “non-overlapping magisteria”), Oakeshott's peculiar use of ignoratio elenchi can teach us a bit about how to assemble, understand, and deploy something like Munger's suggested latticework of mental models.
Many who study philosophy take Oakeshott’s ignoratio elenchi to mean the same thing as Gilbert Ryle’s ‘category mistake.’ Munger would be wary of both kinds of errors, but it's Oakeshott's idiosyncratic use that, I think, is more instructive. The guidance that Mason Meyers got from Clayton Christensen's class at Harvard was explicitly designed to instruct people on how to avoid Ryle's category mistake. Though this type of mistake is common, I think it's pretty easy to eliminate. But it's the error in judgment that Oakeshott points to that's more difficult.
History Yer Doing It Wrong
A humorous depiction of Oakeshott's fallacy of irrelevance
 


Ignoratio elenchi is usually thought of as a logical, not an epistemological, term that indicts any argument that fails to establish its relevant conclusion. Of course, many instances of this kind of error result from ignorance—people just don’t know better. (It’s this kind of ignorance that Clayton Christensen’s Harvard Business School class was designed to eliminate.) Because of this fact, it’s easy to think of this kind of mistake in epistemological terms. But the fallacy itself doesn’t involve epistemology.

One can commit Ryle’s category mistake, for example, without making any argument at all; a simple proposition can do it: “I hear the Zebra light warm on my skin.” While single propositions can make category mistakes, those mistakes can’t be instances of ignoratio elenchi.

The unusual way Oakeshott uses the term ignoratio elenchi seems to me to resemble the way Munger thinks of mistakes in practice. It is impossible, of course, to hear light—of a Zebra variety or otherwise—on one’s skin; that is Ryle’s sort of error. Oakeshott, though, thinks of the fallacy of irrelevance in a different way: fallacies of irrelevance, he thinks, often occur because of conceptual misunderstandings. Oakeshott regards the ‘philosophy of history’, for example, as resting on such a fallacy. Paul Franco describes the reason: “The philosopher can have nothing to do with the abstractions of history. And the philosophico-historical speculations of the philosopher of history can only be irrelevant to the historian (or possibly the object of legitimate derision). Because history is a world of abstractions, ‘those who have made it their business to press experience to its conclusion can choose only between avoiding it or superseding it.’”

Oakeshott, like Munger, is no fan of too-strong ideology. Oakeshott’s aversion to ideology is evident in his assessment of the fallacy of irrelevance. Political ideology, he thinks, is not the spring of political activity but only (and always) the product of subsequent reflection on such activity. Oakeshott argues against the understanding of political activity that ideological politics implies—defective because it misdescribes what really goes on in political activity. Ideological politics are not just undesirable—they’re impossible. And “to try to do something which is inherently impossible is always a corrupting enterprise”, for Oakeshott just as much as Munger.

Nate Silver’s great new site 538 had MIT professor Kerry Emanuel respond to a recent piece the site had run on disaster costs and climate change. Here’s a passage that points to both Ryle’s and Oakeshott’s fallacies of irrelevance. It demonstrates both how easy it is to make such mistakes…and how potentially, well, disastrous making them can be:


from: http://knowmore.washingtonpost.com/2014/03/31/this-map-shows-how-the-world-has-been-hurt-by-climate-change-so-far/
The Washington Post weighs in on how and where climate change has hurt us.


“Let me illustrate this with a simple example. Suppose observations showed conclusively that the bear population in a particular forest had recently doubled. What would we think of someone who, knowing this, would nevertheless take no extra precautions in walking in the woods unless and until he saw a significant upward trend in the rate at which his neighbors were being mauled by bears?

The point here is that the number of bears in the woods is presumably much greater than the incidence of their contact with humans, so the overall bear statistics should be much more robust than any mauling statistics. The actuarial information here is the rate of mauling, while the doubling of the bear population represents a priori information. Were it possible to buy insurance against mauling, no reasonable firm supplying such insurance would ignore a doubling of the bear population, lack of any significant mauling trend notwithstanding. And even our friendly sylvan pedestrian, sticking to mauling statistics, would never wait for 95 percent confidence before adjusting his bear risk assessment. Being conservative in signal detection (insisting on high confidence that the null hypothesis is void) is the opposite of being conservative in risk assessment.”
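Emanuel's distinction between signal detection and risk assessment can be made concrete with a toy Poisson model. All the numbers below are hypothetical, chosen only to illustrate the logic: a doubled bear population doubles the expected mauling rate immediately, while a significance test on the small observed mauling count still can't reject the old rate.

```python
import math

def poisson_pmf(k, lam):
    """Probability of observing exactly k events when the mean rate is lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def poisson_sf(k, lam):
    """P(X >= k): chance of seeing k or more events at rate lam."""
    return 1.0 - sum(poisson_pmf(i, lam) for i in range(k))

# Historical ('actuarial') information: 1 mauling per year on average.
baseline_rate = 1.0

# A priori information: the bear population has doubled, so the
# risk-relevant expected rate doubles too -- no new maulings needed.
updated_rate = 2.0 * baseline_rate

# Suppose 3 maulings are observed this year. Test the null hypothesis
# that the underlying rate is still the old 1.0 per year.
observed = 3
p_value = poisson_sf(observed, baseline_rate)

# p_value is about 0.08, above the conventional 0.05 cutoff: the
# 'conservative' signal detector sees no significant trend, while a
# risk assessor using the prior has already doubled the expected loss.
print(f"p-value under the old rate: {p_value:.3f}")
print(f"prior-informed expected maulings: {updated_rate}")
```

The point of the sketch is just what Emanuel says: insisting on 95 percent confidence before acting delays the risk update that the population count already justifies.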
I don't believe the old saying that "there are lies, damned lies, and statistics," but where there are fallacies of irrelevance, there are likely to be only lies, regardless of whatever rigor the analysis possesses, statistical or otherwise.

Tuesday, March 25, 2014

Many Now Benchmark Physical Progress...Benchmarking Mental Models?

Benchmarking Physical Progress

Billed as "the easiest way to track your performance on benchmark workouts for CrossFitters, weight lifters, runners, rowers, and other serious athletes. Get some!," the website Bnchmrk.me lets people easily track their physical performance over time.

Capitalizing on the CrossFit Games Open's first workout (and some sparkling photography courtesy of Ali Samieivafa), Bnchmrk created a nice graphic arguing for the power of data collection. When people can look across time at how they perform in various workouts, they have real access to their own data and are in a position to take a critical look at what's working and what isn't. Variables in performance can be isolated, and meaningful changes to work, rest, and nutrition can be implemented.

Knowledge, indeed, is seeing how much you've improved between 11.1 and 14.1. The CrossFit Games Open included a workout it had previously demanded of its competitors. I'll bet most of the people who did both workouts did better this time around. And I'll also wager that CrossFit selected that workout again as a way to make the not-so-subtle point that it works as an exercise regimen.

Setting the bar high.


One philosopher has asserted that, in philosophy at least, to know only the gist is to know nothing. That's a pretty high bar to attain. And, I'll admit it was a little daunting, though not really discouraging, to consider what he meant. The CrossFit Games Open workout 14.1 doesn't set the bar too high in terms of the work it demands: 75 pound power snatches and jump-roping double-unders (when the rope swings under the feet twice each jump) are attainable by a large segment of the general population. Of course, each person set her own bar quite high in trying to reach her maximum potential in completing these exercises.

Recently, a man I admire set the bar even higher regarding what it would take to have a handle on what I've taken to calling Charlie Munger's Mental Models. This very smart man counseled me that I shouldn't focus "so much on the individual items of your checklist as on the general idea behind them." So far so good.

Like the philosopher I cited above, though, knowing 'the gist' of the general idea isn't nearly enough. Instead, my friend cited the Japanese proverb (he also cited Wittgenstein, but this post is already a little heavy on philosophy): “The frog in a well knows nothing of the mighty ocean." He offered the New York Times Crossword puzzle as a good metaphor for the real multi-domain, holistic fluency mastering Munger's Mental Models requires. "Those [who] can complete it must have massive knowledge across multiple domains: the English language, literature, pop culture, etc. Bring anything less than that broad domain base of understanding, and it can’t be done." He thought this a good metaphor because it's easily recognized as a yes or no proposition: one either has sufficient knowledge or one doesn't. "If you don’t," he added, "no dice."  

Across many domains & disciplines.

My friend described the challenge of attaining what Munger would call 'worldly wisdom' this way: "understanding life realities requires actually knowing the mighty ocean, at the NY Times crossword level or 'no dice'."  Learning the mighty ocean isn't easy. Doing so demands that we digest "the big ideas from all domains." 

Yes or No. And BroReps are no.

The idea that you either have the capacity or you don't might have been penned by Greg Glassman. His definition of fitness was radical in that it was a) empirical, and b) aimed at comprehensiveness. 

Contrast Glassman's notion of fitness with this: "true levels of fluency required - across broad domains - in order to genuinely understand the world’s complex systems is well beyond the scope of structured courses within an educational institution. Rather, a full commitment of one’s entire life, within and outside of school years, is required."


One thinker puts the matter this way, "'Judgement', the ability to think, appears first not in merely being aware that information is to be used, that it is a capital and not a stock, but in the ability to use it — the ability to invest it in answering questions."

To have any hope of grasping complex systems as Munger has requires us to "agree to approach [our] learning in a way fundamentally different than the rest of the world, i.e, not by the usual route of being devoted to one main specialty, with some things thrown in on the side, but rather holistically, and committing to that journey for a lifetime." 

The way people benchmark in CrossFit is pretty clear. And pretty amazing. I'd love to hear how people working with Munger's mental models benchmark their intellectual progress. Whatever it looks like, I'm sure the intellectual equivalent of BroReps are 'no dice.'




Tuesday, March 18, 2014

Mobility and Munger

Munger and Mobility.

No, not upward mobility or how to get rich. But Kelly Starrett's recent comments on RadioWest made me think about Charlie Munger. And how we can better understand his insights, especially into human judgement and psychology.

Starrett's insight, "We need to stop focusing on pain to know if we're in a good position and functioning well as a human body," calls to mind Munger's attempt to develop a system to understand human misjudgment. In this talk he said, "I am very interested in the subject of human misjudgement. And Lord knows I've created a good bit of it. I don't think I've created my full statistical share. And one of the reasons was that I tried to do something about the terrible ignorance I left the Harvard Law School with."



How did he do that? Partly by casual reading and partly by personal experience. But Munger makes pretty clear that the personal-experience part caused him to take some lumps. What Munger, I think, was really trying to do was avoid the pain that comes with forgetting the maxim, "Experience keeps a dear school, but fools will learn in no other."

CrossFit, like life, will be full of pain. And mistakes. As Munger has argued, "I don’t want you to think we have any way of learning or behaving so you won’t make mistakes." Munger's practice, despite the mistakes, looks a lot like what Starrett advises: wherever possible, he figured out whether he was in a good position and functioning well without having to be told by the pain that results from stupid decisions, just as it does from poor physical movement.

Helped along by Robert Cialdini's book Influence, Munger filled in a lot of holes in the 'crude' system he had developed.



Applying Starrett's insight to Munger, we can get to the point--both intellectually and physically--where we “learn to make fewer mistakes than other people, and how to fix [our] mistakes faster when [we] do make them.” Munger's thought, especially his admonition that Harvard University should be more like a pilot school and employ a rigorous checklist system, asks that we approach complex analytical situations by reducing variables.

Compare what you do, if you use Munger's 'mental models' to what Kelly Starrett has called the key to getting his athletes to function well--and pain free--in demanding and varied athletic environments: "You want the same set up, the same organization so that you can minimize variability."



Sounds a lot like Munger's hero Ben Franklin's argument that it's easier to prevent bad habits than it is to break them.