The Challenge of Rational Open Governance

What’s an honest policy informatician to do?


“Whenever the people are well-informed, they can be trusted with their own government.” - Thomas Jefferson

by the CPI Lab

As members of a Center for Policy Informatics - where we subscribe to the belief that complex public policy challenges can be addressed in part “by leveraging computation and communication technology to meaningfully connect people, harness knowledge, and facilitate informed and empowered deliberation and action” - we couldn’t help but notice some recent Internet chatter highlighting research suggesting that a person’s ability to solve a mathematical problem is affected by their political beliefs. In most instances, 2 + 2 = 4. Except, it turns out, if the answer 4 conflicts with your beliefs or preferences, you are more likely to choose a different answer than if it sits comfortably with them. And telling you that the answer is in fact 4 may only cause you to hold on more strongly to your insistence that it’s actually 30. Or -5. Or that the premise of the question is flawed.


Such a deeply troubling finding causes us to ask: how can we have open governance - indeed, how can we hope to have any kind of rational governance - when we can’t rely on evidence to frame our discussions? Moreover, how can we hope to facilitate informed decisions and actions when ideological beliefs trump scientific facts? Clearly, this is something we need to look into.


Start with "Scientists’ depressing new discovery about the brain" (Marty Kaplan, writing in Salon), which is really a shorter form of "Science confirms: Politics wrecks your ability to do math" (Chris Mooney, Grist). Both report on recent research showing that people with strong political convictions fail a mathematical challenge when the correct answer to the math problem contradicts those convictions - and that this failure is more likely the better the subject is at math and the more entrenched their political beliefs.


Before going much further, we should probably look at the original research, a draft working paper called "Motivated Numeracy and Enlightened Self-Government" (Dan Kahan, Ellen Peters, Erica Cantrell Dawson and Paul Slovic, who hold affiliations with Yale, Harvard, Ohio State, Oregon and Cornell - reasonably reputable institutions). The paper compares two explanations for how people’s beliefs can deviate from the evidence: the “Science Comprehension Thesis” (SCT), in which persistent controversy over policy-relevant facts is attributed to deficits in the public’s capacity to understand scientific evidence; and the “Identity-protective Cognition Thesis” (ICT), in which the public’s capacity to interpret decision-relevant science is disabled by cultural and political conflict. Political "polarization did not abate among subjects highest in Numeracy; instead, it increased. This outcome supported ICT, which predicted that more Numerate subjects would use their quantitative-reasoning capacity selectively to conform their interpretation of the data to the result most consistent with their political outlook.” In short, believing causes seeing.
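
To make the experimental task concrete: subjects were shown a 2x2 table of outcomes and asked whether a treatment worked - a question whose correct answer requires comparing rates across conditions, not raw counts. Here is a minimal sketch of that ratio logic; the figures and function names are our own illustration, not the paper’s actual stimuli.

```python
# A sketch of the covariance-detection task described in Kahan et al.:
# a 2x2 table of outcomes where the tempting shortcut (compare raw counts)
# and the correct inference (compare rates) point in opposite directions.
# The numbers below are hypothetical, chosen only to illustrate the trap.

def improvement_rate(improved: int, worsened: int) -> float:
    """Fraction of cases in one condition whose outcome improved."""
    return improved / (improved + worsened)

treated_rate = improvement_rate(improved=223, worsened=75)    # ~0.75
untreated_rate = improvement_rate(improved=107, worsened=21)  # ~0.84

# Raw counts (223 > 107) suggest the treatment worked; the rates say the
# opposite - outcomes were actually better *without* the treatment.
print(f"treated: {treated_rate:.2f}, untreated: {untreated_rate:.2f}")
print("helps" if treated_rate > untreated_rate else "does not help")
```

In the paper, the same table was framed either neutrally (a skin cream) or politically (crime in cities that did or did not ban concealed handguns); the most numerate partisans were the most likely to get the politically framed version wrong when the correct answer cut against their side.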


(It’s interesting to note that Kahan et al. have themselves become embroiled in political controversy over their findings. People tend to endorse the findings when the results flatter their side, and to pillory them when the results are less than complimentary.)


This latest round of “do evidence and facts even matter?” supports several existing theories that have important implications for environmental communication, policy and collaborative governance. The theory of cultural cognition suggests people form risk perceptions “that reflect and reinforce values that they share with others” (Douglas & Wildavsky, 1982, in Risk and Culture: An Essay on the Selection of Technical and Environmental Dangers). Confirmation bias suggests that strongly held perceptions, whether driven by culture, cognition or political affiliation, will prevent individuals from changing their beliefs when presented with information that conflicts with those beliefs. “Blowin’ in the Wind: Short-Term Weather and Belief in Anthropogenic Climate Change” (Hamilton and Stampone, 2013) measured climate change beliefs, political affiliation and daily temperatures in New Hampshire, and found that temperature fluctuations and anomalies influenced the beliefs of political independents, while Democrats and Republicans remained “far apart and firm in their beliefs about climate change.” In “When corrections fail: The persistence of political misperceptions” (Brendan Nyhan and Jason Reifler, in Political Behavior - also available in draft form here), the idea of a “backfire effect” is proposed: “individuals who receive unwelcome information may not simply resist challenges to their views. Instead, they may come to support their original opinion even more strongly.” This is even more alarming than the finding that evidence fails to convince: additional facts can actually entrench opposing beliefs, causing people to hold on more strongly to what they already believe. The “backfire effect” was popularized in several places, notably in “Researchers discover a surprising threat to democracy: our brains” (Joe Keohane, writing in the Boston Globe) and by David McRaney on his blog “You Are Not So Smart: A Celebration of Self-Delusion.”


It is possible that what’s really being observed is that highly numerate people didn’t so much get the math wrong as view the data skeptically, their numeracy and their perspective working together. But even this is explained in the research literature: in “Motivated Skepticism: Use of Differential Decision Criteria for Preferred and Nonpreferred Conclusions”, Peter Ditto and David Lopez show that “information consistent with a preferred conclusion is examined less critically than information inconsistent with a preferred conclusion” - an effect that appears especially pronounced among individuals with greater scientific numeracy. In a previous study (Kahan et al. 2011), the “cultural polarization” effect was reported to be more distinct among highly science-literate individuals: “respondents predisposed by their values to dismiss climate change evidence became more dismissive, and those predisposed by their values to credit such evidence more concerned, as science literacy and numeracy increased.”


These findings - the tendency to interpret evidence in a way that suits our beliefs, to have our beliefs reinforced by evidence that supports them, and the inability of contrary evidence to persuade us otherwise - would seem to lie at the root of group polarization. As Cass Sunstein wrote recently in his interpretation of the October 2013 government shutdown, when like-minded people speak only to each other - if I only trust Fox News and you only trust MSNBC - they tend to become more extreme, more confident and more unified (Sunstein also wrote about the way the Internet can fuel such “echo chambers” in his 2009 book Republic.com 2.0). When this happens broadly in society, we lament the absence of civil discourse, reasoned debate and social capital. When it happens amongst our legislators - which we lament more, since we had hoped that as professional fact-checkers they might find more things to agree on, not fewer - we get stalemate. There is a persuasive argument put forward by proponents of agonistic pluralism (see Chantal Mouffe, “Deliberative Democracy or Agonistic Pluralism”) that the ideal of deliberative democracy is perhaps a little too idealistic, that democracy involves an “ineradicability of antagonism and the impossibility of achieving a fully inclusive rational consensus,” and that we should stop striving for it. Well … that was easy.


There appear to be many public policy issues where there is fairly strong empirical evidence, yet we still have intense public debate not just about the choices to be made in the face of a looming problem, but about the very nature of the problem - or even whether a problem exists. Anthropogenic global warming is perhaps the most prominent example, but similar issues can be found in policy areas such as public health, public safety, fiscal and monetary policy and environmental risk. On global warming, there are some who look at the Fifth Assessment Report of the Intergovernmental Panel on Climate Change as the point at which the climate “debate” can finally be settled, where every reasonable person will now agree that human activity has played a significant role in recent climate change and that we must act now. Others who see the issue differently immediately pointed to the recent absence of observed warming and to past over-predictions as proof that the science cannot yet be declared definitive.


Public policy analysis a generation ago was characterized by a belief that more evidence leads to better decisions, that education is the foundation of rational behavior, and that in a democracy we should begin with a discussion of the facts. A quote attributed to the late Senator Daniel Patrick Moynihan - “Everyone is entitled to his own opinion, but not to his own facts” - characterizes that period in the 1970s, a golden age of policy debate during which Senator Moynihan was a key player.


That era appears to be well over (if it ever really existed). But in its wake, we are left wondering what hope there is for fields like policy informatics that seek to inform public discourse and enhance collaborative governance. In an environment like this, what’s an honest policy informatician to do?


In this post-positivist era, we suspect that information is not “what changes us” (Stafford Beer, Platform for Change, p. 243); rather, information is what we believe it is. So what can change us and “facilitate informed and empowered deliberation and action,” as our mission statement urges us to strive for?


While information can be transmitted in documents and videos and conversations, it will lack resonance and tangibility for the recipient in the absence of experience or feeling. This is partly addressed by what people like David Roberts argue for (“The Futility of ‘Just the Facts’ Climate Science”): bringing science and politics together, so that scientists communicate their values along with their findings: “if scientists want the information they convey to be understood and absorbed [to have meaning for their audience], they will have to speak as humans speak, from within a cultural identity and a set of values, not hovering above such mortal concerns.”


If we assume that the aforementioned evidence is correct - that individuals form perceptions which “reflect and reinforce values that they share with others” and that conflict over public policy is fostered by fundamentally different cultural worldviews and values - we start to wonder how we can build shared experiences, influence mental models and promote the kind of cultural consensus that will enhance collaborative decision making.


At the Center for Policy Informatics, we explore the potential for technology, computer simulation and participatory modeling to change mental models, perspectives and, ultimately, behavior. We measure the influence of games and visualization on mental models and beliefs. As just one example, LinkIT (“LinkIT: A Ludic Elicitation Game for Eliciting Risk Perceptions,” by Yan Cao and William L. McGill) is a game-like tool designed to elicit a group’s understanding of the relationships between risk factors.


Our work looks toward the generation of experience and empathy through mechanisms more closely tied to policy informatics. CPI is currently investigating whether participatory simulation and collaborative decision making can generate synthetic empathy. Empathy is the act of imagining, understanding and actively responding to the conditions and perspectives of another in a particular situation. This project uses a computer-mediated synthetic environment as a deliberation space in which individual participants can explore the perspectives of others, arrive at consensus and make decisions for sustainable outcomes under conditions of uncertainty.


Drawing on the observations in “When Truth Is Personally Inconvenient, Attitudes Change: The Impact of Extreme Weather on Implicit Support for Green Politicians and Explicit Climate-Change Beliefs,” we see how our experiences can change what we believe. Joint experiences can promote shared mental models (Robert et al. 2008), and direct experience of extreme weather events increases a person’s support for politicians promoting environmental causes (see “After the Storms, A Different Opinion on Climate Change”). We are not suggesting creating hurricanes in order to change beliefs about climate change (the human research ethics challenges would be significant, for starters). But we can draw on computation and communication technology to simulate experience through immersive computer environments. Additional research is currently exploring whether the development of a participatory model can generate relational social capital and enhance reciprocity and knowledge exchange (e.g., Takahashi 2000; Yamagishi and Cook 1993; Robert et al. 2008).


If ideology can edge out opposing opinions and scientific facts, and even influence basic math skills, then information alone will not change us. Evidence-based decision making and public policy will also be shaped by shared norms, experiences and values. Our initial results suggest that computer-based simulations, participatory modeling and collaborative platforms can be used to facilitate informed decisions and build collaboration. We hope that future research will continue to improve our understanding of the complex processes that shape perspective-building and decision-making, ultimately leading to public policy and governance choices that deliver genuine societal improvements.