A few months ago the trade journal ENDS Report solicited a piece from me, based on a pitch from the publicist of my book, Ethics for a Full World, or Can Animal-Lovers Save the World? The pitch concerned the impact of human cognitive errors on messaging. In the end, ENDS Report decided not to use the piece, on the grounds that it was too general for their readership. Maybe it is a message their environmental-consultant readers wouldn't want to hear... Never one to let anything go to waste, I offer the piece here for a general readership anyway:
For decades, activists, scholars, and NGOs have made the mistake of assuming that once the facts were known and people and politicians truly understood, they would take action. But there are many reasons why things still don’t happen even when the facts are available and widely understood. There is a rich body of research identifying cognitive biases and other mechanisms that make people respond in strange ways to information. Coupled with well-funded disinformation campaigns and the political and financial might of opposing forces, the existence of such mechanisms goes a fair way towards explaining why the dissemination of information has not gotten us further towards positive outcomes. Furthermore, if you take it upon yourself to enlighten others, all your opponents have to do is pretend that they still aren’t enlightened.
Everybody suffers from time squeeze. Nobody has time to do their job properly, let alone their civic duties, and there are few degrees of freedom left over to engage with promising new initiatives. Endless reports are produced, only to languish in filing drawers. Most of the people you are trying to reach don't have the background to really understand the subject matter, and thanks to the Dunning-Kruger effect they won't realize it: the incompetent are too incompetent to recognize their own incompetence. And you won’t get enough time to bring them up to speed.
In our complex societies, nobody really has the power to get the important things implemented. We know the kinds of things that need to be done, but nobody knows how to get humanity to act when we need to. Everybody is hamstrung by the lack of critical mass behind any given idea. It is too easy to obfuscate, and thereby confuse large segments of the population and divide humanity against itself. Everybody knows the frustration of not getting their message through, but often the problem is not the messaging, or even the understanding, but the inability to get anything important done. Time again: on the important matters, we don’t have time to wait.
People push away information that has unwanted implications for their worldview, their ideology, or their in-group. Confirmation bias makes us absorb facts and accept hypotheses much more readily when they conform to our ideology, and disregard them when they don’t. Providing information that corrects people’s beliefs can even backfire, making them cling still more strongly to their mistaken notions. Start by finding some common ground, frame your communication in terms of values they already buy into, and avoid threatening their identity.
Popularization of complex material can also backfire: if you make it seem simple, people think they know as much as, or more than, the experts. Unfortunately, the environment and natural resource management are among those fields the man in the street can misguidedly believe he understands, and holds a justified opinion about.
Make sure you are in the right game. Is this the best use of your time? Work on the right scale. Nothing undermines our credibility like “solutions” that are hopelessly inadequate in scope and execution.
Will politicians lead, or do they only follow? Politicians and bureaucrats can’t be very effective if they don’t really understand. When you want something good to happen, it is not just a matter of allocating money: the job also has to be done well, and good people have to be hired to do it properly. And hiring well is a skill in itself.
Politicians and bureaucrats succumb to the pressure to be seen to be acting when the spotlight is on. Unless specifically trained, people tend to disregard entirely the likelihood, or unlikelihood, of particular events, which is one of several mechanisms behind irrational fears. The narrative fallacy: humans can weave a seemingly coherent narrative around almost any string of events, when more likely the sheer overabundance of data makes us understand less rather than more the more diligently we follow the “news”. Normalcy bias: people have a hard time believing that something that has not happened before, or that they have never seen happen, can happen.
People rationalize and compartmentalize. Wishful thinking is another cognitive error: it can lull people into thinking things will be all right even if they don’t get personally involved. These are all mechanisms that weaken our democracies and our civic engagement.
Ultimately, it all comes down to ethics, to our values. What do we really care about?