Thursday, April 14, 2011

This question may seem innocent...

But it actually packs quite the philosophical punch. The question is: what is the rational thing to believe, given that a person you judge to be an epistemic peer (someone with the same inferential skills and evidence) disagrees with you? Like I said, that sounds simple enough, but attempting to answer the question can instill a good bit of humility in a person. This post will be an exposition of the discussion and will draw heavily from an article by Adam Elga. I'm planning on writing a little series of posts applying the issues discussed here to certain beliefs we have, such as political beliefs (isn't that exciting!).

Alright, first things first - we need to break down the question to understand what it's asking, because it sets out a very specific situation. First of all, the question asks what the rational thing is to believe. Emphasis on 'rational' and 'believe'. The word 'rational' implies that we are going to be dealing with some normative rules of inference ("you ought to..."). What these rules are exactly, I'm not sure, but we'll see. 'Believe', on the other hand, is more obvious: we'll be dealing with forming beliefs, not with acquiring knowledge or the like (as in epistemology). Okay, so far we have "What ought you to believe, given 'X'?" Simple!

The next part of the question is key: "given that a person you judge to be an epistemic peer disagrees with you?" An explanation will be easier to give if I talk about epistemic peers first. Don't worry about the word 'epistemic'; it just tells us we're dealing with beliefs (in this instance).

As Elga lays out in his article, there are people we know as advisors: people who affect our beliefs given their own. There are those who are superior to us: experts and gurus. Experts are superior to us both in their inferential skills (belief-forming skills) and evidence; we adopt their beliefs as our own. A good example is the weatherperson. Whatever she believes the weather will be like, I'll agree wholeheartedly. Why? She is schooled (I hope) in the methods of meteorology and has access to evidence I don't. Thus, I judge her to be an expert and defer to her judgments unconditionally (assuming her head is on straight). A guru, on the other hand, is superior to us, but there is a catch. We still defer to their beliefs entirely, but conditionally. We treat someone as a guru when we believe we have some evidence that they don't. Then, instead of adopting their judgment on a given issue right away, we ask ourselves what they would believe if they had our relevant information, and defer to that opinion instead.

Then we have those who are equal to us: our epistemic peers. Roughly, this means that a person has the same inferential skills and evidence as we do. A good note on evidence: this means more than just whatever physical evidence we can collect. We're also talking about pre-held beliefs and such. That's all we really need to know regarding advisors, except that there are also epistemic inferiors.
[Image: an epistemic inferior]
You've probably realized something queer by now. Experts and gurus are easy to come by, most definitely, but what about peers? There can't possibly be an exact peer; most people will be off from us by a bit - either slightly superior or inferior to us, right? When I was learning about this, and in subsequent discussions with friends about the matter, this turned out to be a key point. Sure, maybe there is no such thing as an actual peer, and that would make this question pointless. Well, not quite! The question isn't asking what to do if someone who is in fact a peer disagrees; it's asking what to do if someone you judge to be a peer disagrees! This is an important and simple difference, so take a moment to understand it. Lastly, note that I emphasized the word "disagrees". I just want to make the point that the disagreement here is genuine, and not traceable to differences in evidence or inferential skills, because you judge the other to have similar enough evidence and skills!

Now we understand the question, and now to see how it's as powerful as I say it is! Elga frames the answer as a choice between three options: suspend judgment (basically say "I don't know") by giving both views equal weight, give your own view more weight, or give your peer's view more weight. After criticizing the latter two and defending and expanding on the equal-weight view, he concludes that the equal-weight view is what leads to rational belief in cases of peer disagreement.

I'll spare you all but the gist of the criticisms. For the most part, the greatest criticism of the other views is that giving extra weight to either person's judgment is quite unjustified. The problem with doing so is what Elga refers to as 'bootstrapping'. The idea is that in an initial disagreement with a peer, you might give your judgment extra weight simply because it's yours. This diminishes your judgment of your peer's standing slightly. In the next disagreement you are now more inclined to give your view extra weight, because part of your evidence is now that your peer is at least slightly worse at making judgments than you. Repeat this process with every disagreement you have, and eventually you'll no longer judge your peer to be a peer - simply because you gave your view extra weight in the first place! That's not good reasoning. Good reasoning would demote your peer on epistemic grounds instead. All of this could be said, but in reverse, for giving your peer's judgment extra weight. This leaves us with the equal-weight view.
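To make the bootstrapping worry concrete, here's a tiny sketch in Python (my own toy model, not anything from Elga's article): suppose that after each disagreement you nudge the weight on your own judgment up by some small, arbitrary amount, simply because the view is yours.

```python
# Toy model of 'bootstrapping': at each disagreement, you give your own
# judgment a little extra weight just because it's yours.

my_weight = 0.5  # initially you treat your peer as a genuine peer

for disagreement in range(1, 11):
    my_weight = min(1.0, my_weight + 0.05)  # extra weight "because it's mine"
    peer_weight = 1.0 - my_weight
    print(f"after disagreement {disagreement}: me={my_weight:.2f}, peer={peer_weight:.2f}")

# By disagreement 10, peer_weight has drifted to 0.00: you've demoted
# your peer entirely, with no epistemic grounds for doing so.
```

The numbers (0.5 starting weight, 0.05 bump) are made up; the point is just that any self-favoring bump, iterated over disagreements, demotes your peer for free.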

This view basically says that if a person you judge to be a peer disagrees with you on a matter, you should suspend judgment on the issue. Why? Because you have no way of telling who has the correct judgment here, if either of you does! Was there bias involved? On whose part? Miscalculations? Who made them? Several questions like these give you plenty of reason to suspend judgment. Here's an important point, though. Surely, you and your peer will talk things over, maybe review the evidence and reasoning, and come to a single conclusion together. Obviously, you don't suspend judgment then. You only do so prior to figuring all of that stuff out! This is a point that people usually miss in discussion.
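If you like thinking in degrees of belief, one common probabilistic gloss of the equal-weight view (my gloss, not the post's wording or necessarily Elga's exact formulation) is to split the difference between the two credences. A minimal sketch:

```python
def equal_weight(my_credence: float, peer_credence: float) -> float:
    """Average two credences, giving each judgment equal weight."""
    return (my_credence + peer_credence) / 2.0

# Symmetric disagreement: I'm 80% confident in some claim, my peer is 20%.
# Splitting the difference lands at 0.50 - effectively suspended judgment.
print(equal_weight(0.8, 0.2))  # 0.5
```

The 0.8/0.2 numbers are invented for illustration; the takeaway is that when two peers disagree symmetrically, equal weighting pushes you toward the middle, which is just the "I don't know" verdict described above.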

An answer I often get to the question is, "The rational thing to do would be to discuss evidence, experiment, etc." Notice how the reply is often "...thing to do." That is the rational action to take, but not the rational thing to believe!


SO! Given that when a peer disagrees with you, and prior to figuring out who reasoned incorrectly, the rational thing to believe is actually a suspension of judgment, I'll leave you with some questions.

  • There is a high probability that a person you would judge to be a peer exists and that this person disagrees with you on abortion (or any other issue). What should you believe in this case?
  • Our future selves are often better informed than our present selves (unless they're drunk). Given that and the equal-weight view, how should that affect our current beliefs?
  • Does the equal-weight view apply to all issues? What if you and your peer disagree on what "2+2" is?
Hopefully these questions will help you realize this stuff has actual implications for our everyday beliefs. I'll be referencing this material in the future - another good reason to know this stuff! Also, I'm attaching Elga's article. Obviously, he explains things much more thoroughly than I do, considering he is an expert on the subject (i.e., an actual philosopher). It's a good read, so enjoy!