Saturday, July 19, 2008

Some Normative Epistemology

If you ask most philosophers, they'll tell you that there are (roughly) four main branches of philosophy: metaphysics, epistemology, ethics, and logic. These (again, roughly) correspond to the questions: "What's out there?" "How do you know?" "What should I do about it?" and "Can you prove it to a math major?" Tongue-in-cheek definitions aside, metaphysics deals with questions relating to the nature of reality, the existence of various entities, the properties of those entities, and the ways in which those entities interact. "Does God exist?" and "How do the mind and brain relate?" are both metaphysical questions. Epistemology deals with knowledge claims and how humans go about knowing things in the first place--"What is an appropriate level of evidence to require before changing a belief?" and "How can we be sure that our senses are reliable?" are both epistemic questions. Ethics deals with questions of right and wrong (metaethics) and how we ought to live our lives (normative ethics). "What moral obligations do we have to our fellow man?" is the canonical ethical question. Logic sort of flits in and out of the other disciplines, popping its head in to be used as a tool (or to confound otherwise plausible theories) in any and all of the above, but it also has some questions of its own. Modal logic, for instance, deals with necessity and contingency, and asks questions like "What does it mean for some truth to be true necessarily rather than contingently--that is, by chance?"

This blog deals mostly with metaphysical questions, but I had a very interesting discussion about epistemology with a colleague the other day, and I want to relate some of the highlights here and (hopefully) get some comments.

The discussion revolved mostly around what counts as a good reason for believing some proposition, but I want to specifically focus on an epistemic maxim of my own invention: "we ought to be more skeptical of propositions we wish to be true." Let me give a brief summary of the thought process that led me to adopt this maxim.

First, I take it as axiomatic (that is, not requiring proof of its own) that holding true beliefs about the world is a good thing, and that holding false beliefs is a bad thing--I don't mean 'good' and 'bad' in any kind of moral sense here, only that, in general, the accumulation of true beliefs and the expunging of false beliefs (i.e. the search for truth) is a goal that can be valued in itself, and not necessarily for any pragmatic results it might deliver (though it certainly might deliver many). If you don't agree on that, feel free to shoot me an argument in the comments, and I'll do my best to address it.

With that axiom in place, then, it seems reasonable that we should do whatever we can to avoid taking on new false beliefs, as well as strive to take on as many true beliefs as we can. That last part is important, as it saves us from going too far down the path of Radical Skepticism. If we were to adopt something like Descartes' method of doubt in his Meditations--that is, adopt the maxim "we should withhold assent from any proposition that is not indubitable, just as we would from any proposition that is clearly false"--we would certainly minimize the number of false beliefs we took on, but likely at the expense of rejecting a large number of true ones. Radical Skepticism results in too many "false epistemic negatives," or "avoids Scylla by steering into Charybdis," as another colleague put it. To continue the metaphor, it also seems too dangerous to stray toward Scylla, lest I simply believe every proposition that seems prima facie plausible--too far in the direction of naive realism, in other words. While I certainly consider myself a naive realist in the context of perception--I think that the way our senses present the world to us is more-or-less accurate, and that when I (say) perceive a chair or a tomato, I really am perceiving a chair or a tomato, and not a "sense datum," an "impression," or any other purely mental construct assembled by my mind--I think we ought to be somewhat more skeptical when it comes to epistemology in general.

My colleague pressed me for the exact formulation of my response to the question at hand ("What counts as a good reason for forming or changing a belief?"), but I demurred, and on further reflection--both then and now--I'm not sure I can give a single answer. Rather, it seems to me that there are (or at least ought to be) a variety of heuristics in our "epistemic toolbox" that either raise or lower the "bar of belief" in various circumstances. "Naive realism" is shorthand for a cluster of these heuristics, it seems to me, including (for instance) "We should be more skeptical of propositions that would have the world operating in a way that is radically different from how it seems."[1] Right now, though, I'm most interested in the general heuristic mentioned above: "we should be more skeptical of propositions we wish to be true." So let's continue with our justification of it.

I'm not a perfect reasoner; unlike, say, Laplace's Demon, I can make a mistake in my reasoning--indeed, I do with alarming frequency. These errors can take many forms, but they include assenting to arguments which, though they might seem sound to me, in reality either lack true premises or are somehow invalid. If I strongly desire some proposition p to be true--if, for example, a close family member is in a coma and I hear about an experimental new treatment that might allow him to awaken with full cognitive faculties--I am more likely to make these errors of judgment, as I will not necessarily apply my critical faculties with the same force as I would to another proposition p1 on which such strong hopes were not resting. My colleague objected that I would, given enough care, certainly be aware when this was happening, and could take more care in my reasoning to ensure that this result did not occur, but I am not so certain: a corollary of the fact that I am a fallible reasoner seems to be that I might not always know when my reasoning is faulty. It is no solution, therefore, to say "we need not universally require a higher standard of proof for propositions we wish to be true; we just need to be sure that our reasoning is not being influenced by our desires," as it is possible--in just the same sense that it is possible for me to make a mistake in my reasoning--that I might make a mistake in that very evaluation, no matter how much care I take to be certain that my desires do not influence my judgment.
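
To put some toy numbers on that worry (every figure here is my own invention, purely to make the structure visible, not part of the argument): a round of careful self-checking only shrinks the risk of a motivated error in proportion to how reliable the checker is--and the checker is the very faculty that desire may be degrading.

```python
# Invented numbers, for illustration only: self-checking shrinks the
# risk of a motivated error, but since desire may degrade the checker
# itself, I cannot know how reliable the check actually was.
p_slip = 0.2                # chance that desire slips a bad inference past me
for p_catch in (0.9, 0.5):  # a reliable checker vs. a desire-degraded one
    residual = p_slip * (1 - p_catch)
    print(f"checker catches {p_catch:.0%} of errors -> "
          f"{residual:.1%} residual risk of a motivated false belief")
```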

One might ask what I mean by "skeptical," then, if not a more careful logical rigor. It seems to me that whenever I am thinking clearly (i.e. I am not drunk, asleep, distracted, etc.) and applying my logical faculties to the best of my ability (i.e. critically questioning my beliefs or trying as hard as I can to puzzle out a problem)--as I should be whenever I am seriously considering adopting a new belief or changing an existing one--I am already being as rigorous as I possibly can be; unless, for some reason, I have deliberately lowered the "bar of belief" in a specific instance (e.g. suspending disbelief while watching an action movie), I should normally be as logically rigorous as I can be. If I'm critically examining some belief that I greatly wish to be true, then, I should not only be as logically rigorous as I can be--that is, set the bar of belief where I normally do--but also factor in the possibility that my desire might be affecting my logical reasoning--might be lowering the bar without my knowledge--and so I ought to require more evidence than I otherwise would: that is, I ought to be more stubborn about changing my position. By "skeptical" here, then, I just mean "requiring more evidence," in the same way that if I'm skeptical of a student's claim that her computer crashed and destroyed her paper, I will require more evidence attesting to its truth (a repair bill, maybe) than I normally would; her claim to that effect counts as at least some evidence, which might be enough if I had no reason to be skeptical.
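
For readers who like toy models, here is one way to picture the "bar of belief" in miniature. Everything in it is invented for illustration--the 0.95 bar, the extra margin for wished-for propositions, the made-up likelihoods for the student's story--none of it is load-bearing for the argument:

```python
# A toy sketch of the "bar of belief" (all numbers are invented):
# model assent as a posterior-probability threshold, and raise the
# threshold when I *want* the proposition to be true.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule: probability of the proposition given the evidence."""
    num = prior * p_evidence_if_true
    return num / (num + (1 - prior) * p_evidence_if_false)

def assent(prob, desired, base_bar=0.95, desire_margin=0.03):
    """Believe only if the evidence clears the (possibly raised) bar."""
    bar = base_bar + (desire_margin if desired else 0.0)
    return prob >= bar

# The student's crashed-computer claim: her say-so is some evidence.
p = posterior(prior=0.5, p_evidence_if_true=0.96, p_evidence_if_false=0.04)
print(assent(p, desired=False))  # True: no stake, her word clears the bar
print(assent(p, desired=True))   # False: a wished-for claim needs more
```

The same evidence clears the ordinary bar but not the raised one; that gap is all the maxim asks for.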

Let me make my point briefly and clearly. When deciding whether to assent to a proposition I wish to be true, the possibility that my desire might negatively affect my reasoning, combined with the fact that I might not be aware of this negative effect, means that I ought both to apply my reasoning faculties to the best of my ability and to require more evidence in favor of the proposition than I normally would.

Even more succinctly: we ought to be more skeptical of propositions we wish to be true.

Thoughts? Does this make sense? What standards do you apply when trying to make up your mind about your beliefs in general?

1. This is not to say that propositions which say that the world operates in radically different ways than it seems to us are always (or even usually) going to be false--the atomistic theory of matter, relativity, and quantum mechanics are all theories which seem to be at least mostly true, and which describe the world as being in fact much different than it seems. My point is that we should hold claims like this to a higher epistemic bar before assenting to them than we would claims like (say) "there is a tree outside my window," which correspond with reality as it seems to us.

Edit: Because this discussion took place in class with my co-instructor, and because the kids all have cameras all the time, we get a picture of me thinking about this point and a picture of me arguing it with him. Enjoy.

4 comments:

RaplhCramden said...

Absolutely you ought to be more skeptical of hypotheses you want to be true! If you are an experimentalist, you can set up an experiment to prove that you will more easily conclude something false is true if you want the thing to be true (and vice versa I'm sure).

One of the smartest men in the world is Charlie Munger, 2nd in command to Warren Buffett at Berkshire Hathaway. His friends have compiled a book, "Poor Charlie's Almanack." In it, especially in an essay near the end, he compiles and explains a list of tendencies in thinking that you have to watch out for if you want the truth. I think it would be fantastic for philosophy if someone started linking that kind of thing in to epistemology and perhaps even metaphysics and ethics (using logic of course, we wouldn't want to leave out any useful categories).

There are other great books about defects in thinking. "The Black Swan" by Nassim Taleb comes to mind. This comes from the world of investing, especially quant investing, but it certainly addresses a lot of beliefs from outside investing. It dives deep into the questions "What can we know? Why do we keep making these mistakes about it over and over? And how can we stop?"

Great blog entry and great blog. I look forward to your eventual books. If you have something written already and want a free proofreader (or even just an early fan) I would love to read your stuff.

Mike

RaplhCramden said...

There are many tendencies that lead towards wrong answers, and I doubt I'll get to even all the ones I know about in these comments.

But there is another one which I think is so powerful that it deserves a comment.

We tend to go with the crowd. Even those of us who think we don't go with the crowd usually just go with a different subcrowd which identifies itself with its "independence" from the main crowd.

Of course going with the crowd is USUALLY right. And this is at least part of what is tricky about the tendency.

I am toying with the policy of NOT firming up my beliefs one way or another, essentially making them only as firm as they need to be and no firmer. Interestingly, I think this is quite consistent with the idea of "presence" that is so appealing in Buddhism, especially Zen. The experience itself is primary; beliefs are a method for making an actionable model. But it is best not to forget that the model is a model: the thing you are modeling, which is ultimately your present experience, is best kept primary to it.

I also think there is great value in avoiding being dismissive of people with whom I disagree. Especially given my "model" view of beliefs, basically everybody who is at all effective in life can be inferred to have a useful model, i.e. a useful set of beliefs. Working to understand how their beliefs are useful (rather than how they are wrong) makes me more effective in life, it seems to me. This, too, seems somehow related to a Zen approach, but I am on weaker ground making that connection.

Cheers,
Mike

Jon said...

It's interesting that you bring up Zen, Mike, as I've had some of the same thoughts. When I was younger--high school, before I got into philosophy seriously--I was very interested in Buddhism in general and Zen in particular. While I think rigorous analytic philosophy has more to offer in terms of likelihood of uncovering truth, there's still something to be said for Zen. Before I dropped Ethics as a specialty in favor of mind, I was working on a "disjunctive" normative theory, arguing something similar to what I'm arguing epistemically here--that is, that it doesn't make sense to have a single ethical theory about every situation, but rather to have a number of more general heuristics that help you gauge various aspects of a given situation and act accordingly. I think there's definitely something to it.

Thanks for all the kind words, and glad you like the blog!

RaplhCramden said...

Commenting on this and the newer blog, I wonder what happens with "descriptive ethics." My own start in philosophy was as a freshman taking "Ethics." I was a very guilty child (ex-Catholic) who sincerely and desperately wanted to know what I should be doing. The Catholics had blown their authority with me. But it still seemed possible I just needed to go to some other smart, rigorous thinkers to get the answers I needed.

I was wonderfully disappointed, although it certainly did not seem wonderful at the time. All theories of ethics struck me as "geometry": if you bought the assumptions, you could derive the rest, but no one was going to tell you why you had to buy the assumptions. I loved Descartes (whom I didn't seriously read until junior year) because he actually cared about the assumptions. "I think..."--how can you deny it? I knew I couldn't, and I read on avidly: here was something beyond geometry!

Descriptive ethics: you may wind up being able to tell us a lot about why people think they should do various things. But will you ever be able to tell us why we should do those things? If you come up with an ethics that seems to fit human minds, all you can do is describe, not prescribe.

Since my freshman year I have been largely morally adrift--very aware that whatever I DECIDED was right and wrong was no more than a decision, and that if I imposed my decision on others (and a moral system which can't be imposed on others is pointless), I really had no more justification than "because I think so."

I liked utilitarian approaches because, even though they are still just axiomatic (we assume "the greatest good for the greatest number" and prove theorems from there), at least the axiom is an attractive one.

Have a good one,
Mike

PS I have visited friends in Lancaster hundreds of times. My bud got his chemistry degree from F&M.