In some decisions, the causal and evidential expectations of our actions can come apart. Decision theorists are divided on how to rank actions when this happens.

Both evidential and causal decision theory say that the value of an action A is the probability that outcome O will result if A is performed, multiplied by the value of O, summed over all possible outcomes. However, the two approaches disagree about which sort of probability should be used. Evidential decision theorists use a person's subjective probability that an outcome will occur given that they take an action. Causal decision theorists instead use the subjective probability that the action will cause or produce the outcome. For an example of how these can come apart: a person might have a higher credence that their friend has a broken leg given that the friend is in a hospital, but they would not think that being in a hospital caused the broken leg.
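The shared formula can be sketched in code. This is a hypothetical helper, not from the original text; the point is that both theories compute the same probability-weighted sum and differ only in which probability function they plug in:

```python
def expected_value(prob_of_outcome, values):
    """Probability-weighted sum of outcome values.

    prob_of_outcome: dict mapping each outcome to a probability --
        for EDT this is P(outcome | action); for CDT it is the
        probability that the action causally produces the outcome.
    values: dict mapping each outcome to its utility.
    """
    return sum(prob_of_outcome[o] * values[o] for o in values)

# Example: a gamble with a 0.9 chance of winning $100
expected_value({"win": 0.9, "lose": 0.1}, {"win": 100, "lose": 0})  # 90.0
```

On this framing, the EDT and CDT disagreement is entirely about what goes into the first argument, never about the summation itself.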

The classic example where evidential and causal decision theory diverge is Newcomb's problem. Imagine that a treasure hunter enters a long-deserted cave that contains only two boxes: a transparent box containing $1,000, and an opaque box made of metal. Thousands of years ago a near-perfect predictor predicted whether the treasure hunter would take both the opaque box and the transparent box, or just the opaque box. If the predictor predicted that the treasure hunter would take only the opaque box, then he put $1m into it. If he predicted that the treasure hunter would take both boxes, then he left the opaque box empty. The predictor has long since departed and can no longer influence what is in the boxes. Knowing all of this, should the treasure hunter take both boxes, or just the opaque box?

In this case, causal decision theorists will generally say that the treasure hunter should take both boxes: they cannot influence the predictor's prediction or the contents of the boxes (there is no causal link), so leaving the transparent box behind would simply be a needless loss of $1,000. Evidential decision theorists will generally say that the treasure hunter should take only the opaque box, since doing so is evidence that it contains $1m.
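The divergence can be made concrete with a small calculation. The 0.99 predictor accuracy and the 0.5 prior that the opaque box is already full are illustrative assumptions, not figures from the text; this is a sketch of how each theory scores the two options:

```python
ACCURACY = 0.99    # assumed reliability of the predictor
SMALL = 1_000      # contents of the transparent box
BIG = 1_000_000    # contents of the opaque box, if filled

def edt_value(action):
    """EDT: weight outcomes by P(outcome | action).
    One-boxing is strong evidence the predictor foresaw one-boxing."""
    if action == "one-box":
        return ACCURACY * BIG + (1 - ACCURACY) * 0
    # Two-boxing is strong evidence the opaque box was left empty.
    return ACCURACY * SMALL + (1 - ACCURACY) * (SMALL + BIG)

def cdt_value(action, p_full=0.5):
    """CDT: the boxes' contents are causally fixed, so use an
    unconditional credence p_full that the opaque box holds $1m."""
    base = p_full * BIG
    return base + (SMALL if action == "two-box" else 0)
```

Under these numbers EDT ranks one-boxing far above two-boxing, while CDT ranks two-boxing above one-boxing for any fixed value of `p_full`, since taking the extra box adds $1,000 without changing what is already inside the opaque one.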

Further reading

Arbital. 2016. Causal decision theories.

Joyce, James M. 1999. The foundations of causal decision theory. Cambridge: Cambridge University Press.

Wikipedia. 2016. Causal decision theory.

Wikipedia. 2014. Evidential decision theory.