The first, or peripheral, cognitive system is characterized by the use of heuristics, or rules of thumb; it is the basis of what is ordinarily considered intuition. This method of reaching judgments of course has some likelihood of going wrong, but it is also frugal, expending few scarce cognitive resources such as time and attention. The second cognitive system, sometimes called central processing, is characterized by a higher degree of elaboration in reasoning and involves what is normally considered critical thinking. Although it is effortful and slow, this method of reaching judgments is also likely to correct some of the errors of the easier peripheral processing.
Much of the current literature on science communication focuses on techniques that affect an audience's peripheral processing. For example, Matthew Nisbet and his collaborators advise communicators to carefully frame their messages in order to invoke desired associations and experiences. Randy Olson's recent Don't Be Such a Scientist can also be read as a useful and amusing compendium of methods for increasing trust by appealing to an audience's peripheral cognitive processes, projecting cues such as likeability, physical attractiveness, and dynamic delivery.
Appeals to peripheral cognitive processes, however, are unlikely to be completely successful in increasing trust in climate scientists. Some cognitive heuristics, such as confirmation bias or myside bias, will tend to further entrench the positions of those who already distrust scientists' messages. Further, in a controversy as heated as that over global climate change, appeals to peripheral processing may be ineffective because, when detected and called out by opponents, the communication techniques may appear manipulative and even fallacious. Not only will such messages be unpersuasive; they will tend to further increase distrust in the communicators.
In this paper, we therefore aim to supplement previous discussions of appeals to peripheral processing with a discussion of how climate scientists can give their audiences good reasons for trust, thus appealing to their audiences' central processing, or critical thinking. What are good reasons for trust? This question has been the subject of significant recent scholarship in philosophy, political theory, and argumentation theory. These fields use humanistic methods such as conceptual analysis and pragmatic reconstruction to build theories of the kinds of reasons for trust that are likely to survive even harsh critical scrutiny. While social scientific methods can show us what heuristics audiences are in fact using--an empirical question--philosophical methods can show us what reasons audiences ought to accept as good--a normative or value question.
In particular, we will apply the line of research one of us (Goodwin) has developed in the communication subfield of argumentation theory, which provides an explanation of how trust can be secured even under conditions of deep disagreement. To summarize, this approach proposes that communicators can earn trust by openly taking responsibility for the possibility of errors and unforeseen consequences. A simple instance of this practical logic is the used car dealer who can reasonably be suspected of peddling lemons, but who succeeds in persuading some customers to buy by offering an extended guarantee. This enforceable undertaking of extra responsibility creates for his audience a new reason to trust him. As another example, this analysis suggests that the viewing public does not trust their local weathercaster because of his record of consistently accurate predictions. Instead, the public has reason to trust the weathercaster because they have repeated opportunities to observe how he bears the consequences of his mistakes.
When applied to the communication of climate science, this analysis suggests the somewhat paradoxical conclusion (also proposed in the work of Brian Wynne) that climate scientists may be more trusted if they present themselves as less certain. Instead of stressing the inerrant consensus backing their statements, they would make a more effective appeal to trust via central processing by openly making themselves vulnerable to criticism for any mistakes they may make. To make themselves vulnerable and thus earn trust, scientist-communicators will need to pursue two interlinked communication strategies. First, they will need strategies for assuring the public that scientists will in fact be held responsible, and bear significant consequences, if what they are saying turns out to be wrong. Second, because global climate change is not directly perceptible by ordinary means, they will need to develop and convey indicators that make future climate change visible to non-scientists, in the same way that a car's soundness or the local weather is visible. In sum, to earn the public's trust in their risk communication, scientists must accept a risk themselves--the risk of being shown to be wrong.
Goodwin, J. (2010). Trust in experts as a principal-agent problem. In C. Tindale & C. Reed (Eds.), Dialectics, Dialogue, and Argumentation (pp. 133-143). London: College Publications.
Leiserowitz, A.A., Maibach, E.W., Roser-Renouf, C., Smith, N., & Dawson, E. (2010). Climategate, public opinion, and the loss of trust. Retrieved from http://climatechangecommunication.org/resources_reports.cfm.
Nisbet, M.C. (2009). Communicating climate change: Why frames matter for public engagement. Environment, 51, 514-518.
Olson, R. (2009). Don't Be Such a Scientist: Talking Substance in an Age of Style. Washington, DC: Island Press.
Wynne, B. (1992). Misunderstood misunderstandings: Social identities and public uptake of science. In A. Irwin & B. Wynne (Eds.), Misunderstanding science? The public reconstruction of science and technology (pp. 1-18). Cambridge: Cambridge University Press.