Monday, January 12, 2009

Musings on Embedded Epistemology

I took a course in epistemology last semester, and (surprise) it made me think about epistemology. What follows is an attempt to summarize the random musings and conversations I've had over the last few weeks into something that begins to approach a coherent theory. It is, as I cannot emphasize enough, very preliminary so far, and very much a work in progress. Still, I find these considerations very interesting, and I hope you do as well.

Belief justification is like a progress bar on a download--it can be filled or emptied to various degrees by things that we encounter out in the world. For instance, if I trust some individual a great deal, his words will tend to fill my "truth bar" a great deal; this weighing is based (among other things) on my past interactions with him, my knowledge of his epistemic state, &c.--certain contextual variables about our relationship lead me to weigh his words highly when making (or contemplating making) epistemic actions like belief revision. The degree to which my truth bar is filled is also going to depend on the nature of the proposition this hypothetical interlocutor is informing me about: even from a trusted friend, I'm going to more readily assent to the proposition 'there is a brown dog around the corner' than I am to the proposition 'there is a child-eating clown around the corner.' Again, this reflects the contextually-influenced nature of epistemic action: based on other beliefs I have about how the world works, I'm going to be more or less likely to assent to a new belief (or to change an old one).
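To make this a bit more concrete, here is a toy calculation in Python (the numbers and the little function are entirely my own invention, just a caricature of the kind of weighing I have in mind) showing how the same testimony from the same trusted friend moves the truth bar very differently depending on my prior for the proposition:

    # Toy Bayesian update on a friend's testimony; all numbers are invented.
    def update_on_testimony(prior, p_report_if_true=0.9, p_report_if_false=0.05):
        # How likely was I to hear this report at all?
        p_report = p_report_if_true * prior + p_report_if_false * (1 - prior)
        # Posterior 'fullness' of the truth bar after hearing the report.
        return p_report_if_true * prior / p_report

    print(update_on_testimony(prior=0.30))    # 'brown dog around the corner' -> about 0.89
    print(update_on_testimony(prior=0.0001))  # 'child-eating clown' -> about 0.002

The friend's say-so does a great deal of work when the proposition is mundane, and almost none when it is outlandish, even though my trust in him is identical in both cases.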

It's important to emphasize that the truth-bar is almost never entirely full, except in some very special cases (e.g. conscious states to which you have immediate, incorrigible access). Take the case of a proposition based on basic sensory information--e.g. 'there is an apple on my desk.' In normal circumstances--good lighting, I can feel and see the apple, other people see the apple too, &c.--I have very good reason to suspect that there really is an apple on my desk; the truth-bar for that proposition is (say) 99% full. Still, there are potential defeaters here: it might be the case that I am actually in some kind of Matrix scenario, and therefore it might be the case that there is no desk or apple at all. Still, based on other (fairly strongly justified) beliefs I have about the world, this Matrix scenario seems rather unlikely--that is, the truth-bar for 'I am in the Matrix' is very, very close to empty (though not entirely empty, as the proposition is still a logical possibility). Because this defeating proposition ('I am in the Matrix') has a very weak truth-bar, it doesn't weigh very heavily in my epistemic considerations--it's enough to keep the bar for 'there is an apple on my desk' from being 100% full, but that's about it.
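Again with made-up numbers, the arithmetic of why the nearly-empty Matrix bar barely dents the apple proposition might look something like this:

    # Invented numbers: a nearly-empty defeater caps, but barely lowers, the apple belief.
    p_matrix = 0.000001           # truth bar for 'I am in the Matrix': very close to empty
    p_apple_if_normal = 0.999     # good lighting, I see and feel it, others see it too
    p_apple_if_matrix = 0.0       # if I'm envatted, there is no real apple at all

    p_apple = p_apple_if_normal * (1 - p_matrix) + p_apple_if_matrix * p_matrix
    print(p_apple)  # about 0.999 -- high, but never quite 1.0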

This goes sharply against established epistemic tradition, according to which the primary goal of epistemology is truth. If we define truth as a 100% full bar, there are going to be very few propositions (aside from tautologies like 'all black things are black') that will enjoy an entirely full bar. Instead, the right way to think about epistemology--and about our epistemic responsibilities--is as a quest for justified belief, a quest for a reasonably full bar. What counts as 'reasonably full' is, again, going to vary based on contextual variables: when the stakes are rather low, I might assent to a proposition when (say) the truth bar is over 50% full. This might be the case when, for example, a friend tells me that there is a brown dog outside my house; I believe him, and if someone asks me 'is there a brown dog outside your house?', I will be inclined to answer in the affirmative. My friend might be wrong or lying, but the stakes are low and I have very few strong defeater propositions in play--few good reasons to suppose that my friend speaks falsely, in other words. In more important cases (such as when engaged in technical philosophical deliberation, or when designing a passenger jet), I'm going to be inclined to withhold assent from propositions until the bar is almost entirely full: the consequences of assenting to the wrong belief are so potentially dire that I will demand a higher standard of justification, investigate possible defeaters more thoroughly, &c.
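One crude way to caricature this sliding standard (the cutoffs below are invented for illustration, not argued for) is as a stakes-sensitive threshold for assent:

    # Invented cutoffs: the standard of justification rises with the stakes.
    def willing_to_assent(truth_bar, stakes):
        thresholds = {"low": 0.5, "moderate": 0.9, "high": 0.99}
        return truth_bar >= thresholds[stakes]

    print(willing_to_assent(0.85, "low"))   # brown dog outside the house -> True
    print(willing_to_assent(0.85, "high"))  # designing a passenger jet -> False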

The emphasis here is on the contextually-dependent nature of epistemic action; rather than doing a lot of complex deliberating for every possible belief change entirely in our heads, we "offload" a certain amount of the work into the existing epistemic environment. That is, we use the existing epistemic landscape to simplify our decision-making by heuristically assigning various "values" to propositions that are related to the one under consideration, and performing a kind of Bayesian calculation to get a rough approximation of truth or falsity. We can draw a direct parallel here with other work being done on extended/embedded cognition and extended mind theses--in just the same way that we use external objects (e.g. written notes) as props to support certain cognitive processes (e.g. memory), we use our intuitive grasp of the existing epistemic landscape as a prop to support our own decision making. I call this approach "contextually embedded epistemology."

Statisticians or those with a background in math will recognize that I'm describing something very much like a Bayesian network here--I suspect that our beliefs, were they to be mapped, would look much like one. There are multiple links between many different beliefs, and one belief might depend on many others for support (or might be partially defeated by many others). The picture is constantly in a state of flux as shifts in one node (i.e. a single belief) influence the certainty (i.e. the fullness of the truth bar) of many other nodes. The Bayesian way of looking at things is far from new, but the emphasis on partial completeness and environmental support, as far as I know, is. These are just some random thoughts I've had about this in the last few days, so comments and criticisms are encouraged. This needs a lot of tightening up.
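For the mathematically inclined, here is the sort of toy two-node fragment I have in mind; the structure and numbers are invented, but it shows how a shift in one node (my friend's reliability) changes the fullness of another node's bar (the brown dog outside my house):

    # A two-node toy: 'my friend is reliable' supports 'there is a brown dog outside'.
    def dog_belief(p_reliable):
        p_dog_if_reliable = 0.9    # a reliable friend's report fills the bar a lot
        p_dog_if_unreliable = 0.3  # an unreliable friend's report counts for much less
        return p_dog_if_reliable * p_reliable + p_dog_if_unreliable * (1 - p_reliable)

    print(dog_belief(0.95))  # about 0.87
    print(dog_belief(0.40))  # about 0.54: one node shifts, and the other follows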