Sometimes we find that we disagree with smart, well-informed people about how the world is or ought to be, even when those people seem to share our prior beliefs, values, and evidence. If we think that there is a restricted range of credences that are rational given the same prior beliefs and evidence, then disagreement with others is evidence that at least one of us has made a mistake in our reasoning (White 2005). This is not so surprising in cases where the evidence is complex and difficult to assess.
What should we do when we encounter this kind of disagreement? The responses to this problem range from more “conciliatory” responses, which say that we ought to move closer to the views of our peers in cases of disagreement, to less conciliatory responses, which say that we ought to stick to our guns if we haven't made a mistake.
More conciliatory responses, such as the view that we should treat ourselves and others like truth thermometers (White 2009), offer more actionable advice than radically non-conciliatory views. The latter say that you should stick with your own views if you are right and move to your interlocutor's views if they are right, even though you don't know which of you is right. And sometimes we want to treat the testimony of others as evidence, even if we don't have access to their reasoning. Results like Condorcet's jury theorem (Wikipedia 2016) suggest that if many independent agents converge on the same answer to a question, we should treat that as good evidence that the answer they have converged on is correct.
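The jury theorem's core claim can be illustrated with a short calculation. The sketch below (under the theorem's idealizing assumptions: voters answer independently, and each is correct with the same probability p > 0.5) computes the probability that a majority of n voters reaches the right answer, using the binomial distribution; the function name is ours, not from any source:

```python
from math import comb

def majority_correct_prob(n, p):
    """Probability that a strict majority of n independent voters,
    each correct with probability p, reaches the right answer.
    (n is assumed odd so a strict majority always exists.)"""
    k_min = n // 2 + 1  # smallest winning majority
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_min, n + 1))

# With p > 0.5, the majority becomes more reliable as n grows;
# with p < 0.5 the effect reverses, which is the theorem's flip side.
for n in (1, 11, 101):
    print(n, round(majority_correct_prob(n, 0.6), 3))
```

Even with individually unimpressive voters (p = 0.6), the probability that the majority is right climbs steadily with group size, which is why convergence among many independent agents can count as strong evidence.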
Blanchard, Thomas & Alvin Goldman. 2015. Peer disagreement. In Edward Zalta (ed.), Stanford encyclopedia of philosophy.
White, Roger. 2009. On treating oneself and others as thermometers. Episteme 6(3): 233-250.
White, Roger. 2005. Epistemic permissiveness. Philosophical perspectives 19: 445-459.
Discusses the extent to which evidence determines which beliefs are appropriate, and whether a range of beliefs might be epistemically permissible on the same evidence.
Wikipedia. 2016. Condorcet's jury theorem.