
People have a natural tendency to rely on simple rules, or heuristics, when reasoning about complex situations. These heuristics are sometimes fairly reliable and save people from having to analyze a situation more thoroughly, but in other cases they may systematically lead to false conclusions.

When a heuristic systematically misleads people, it is known as a cognitive bias. For instance, one bias is the ‘gambler’s fallacy’, which leads people to believe that if some event, such as rolling a six on a fair die, has not happened in a while, then it is especially likely to happen soon. Biases are not limited to empirical questions: they may also occur in ethical decision making.
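The gambler’s fallacy is easy to check empirically. Below is a minimal Python sketch (not part of the original article) that simulates a fair die and compares the overall frequency of sixes with the frequency of a six immediately after ten consecutive rolls without one; the die, the streak length of ten, and the sample size are arbitrary choices for illustration.

```python
import random

# Simulate a fair die and test whether a six becomes more likely after a
# long streak without one. Because rolls are independent, it should not.

random.seed(0)
rolls = [random.randint(1, 6) for _ in range(1_000_000)]

sixes = 0            # sixes overall
drought_trials = 0   # rolls that immediately follow ten six-free rolls
drought_sixes = 0    # sixes among those rolls
streak = 0           # current run of non-six rolls

for r in rolls:
    if streak >= 10:
        drought_trials += 1
        if r == 6:
            drought_sixes += 1
    if r == 6:
        sixes += 1
        streak = 0
    else:
        streak += 1

print(f"P(six), overall:                {sixes / len(rolls):.4f}")
print(f"P(six | ten rolls without one): {drought_sixes / drought_trials:.4f}")
# Both values come out near 1/6 (~0.1667): past rolls do not make a six 'due'.
```

Both estimates agree to within sampling error, which is exactly what independence predicts and what the fallacy denies.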

Members of the effective altruism community often try to become more aware of such biases in order to improve their capacity to form accurate beliefs.

Note that conclusions influenced by cognitive biases are not necessarily wrong, so noticing a bias in another person’s argument is not sufficient grounds for rejecting their position. This is particularly important to keep in mind because there are so many potential biases that even the most reasonable positions can plausibly be accused of arising from one.

Further reading

Effective Altruism. 2014. Comparative bias.
Criticisms of evaluating positions by their biases.

Hurford, Peter. 2013. Why I’m skeptical about unproven causes (and you should be too).
A list of reasons why people might be biased towards existential risk prevention.

Wikipedia. 2016. Heuristics in judgment and decision-making.
An overview of types of heuristics and biases, and theories about them.

Wikipedia. 2016. List of cognitive biases.

Yudkowsky, Eliezer. 2008. Cognitive biases potentially affecting judgment of global risks. In Global Catastrophic Risks, edited by Nick Bostrom and Milan M. Ćirković, 91-119. New York: Oxford University Press.