The death of Stephen Hawking, while deeply emotional for many in the scientific community and broader society, also offered a welcome opportunity to revisit some of the less well-known contributions of this intellectual giant. One of his quotations I hadn’t seen before, and which struck me as particularly resonant in our current time, is as follows:
“The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge.”
For anyone who follows my writing, it will come as no surprise that this notion resonated strongly with me. It comes in the same week that I saw highlighted on Twitter an article from Science magazine in 1974 about the “American Tentative Society”, a group “committed to the idea that knowledge is, indeed, tentative”. The human mind has a tendency to see patterns and connect dots in scattered observations (a concept known as apophenia), but the coincidence of the article’s timing and, sadly, the death of Professor Hawking prompted me to write briefly on the topic.
The American Tentative Society seemed not to be as flippant as the name might suggest. The short write-up in Science detailed the society’s search for meaningful projects, since it had recently been funded through the will of a co-founder. A quick Google search suggests that the society’s profile remained relatively small; some awards for journalism were given, and one is recorded as going to Dr Stephen Jay Gould for his adjustment of evolutionary theory (‘Punctuated Equilibrium’). Beyond this, perhaps it’s hardly a surprise that the society never achieved widespread recognition; not many journalists or scientists make their careers by highlighting how they or others were wrong in the past, nor by leaving open the possibility that they themselves might be wrong.
The idea, at its core, seems important to me. Deluding ourselves that our prior knowledge is fixed and eternal blocks alternative viewpoints and stifles scientific discussion; viewpoints stagnate and change is hindered. It’s rare indeed that a new discovery is made that doesn’t in some way alter the state of knowledge as it previously stood. Nevertheless, there’s a strong counter-argument against living our lives in a state of constantly questioning our own knowledge: knowledge may be uncertain, but the actions we take are irreversible. Making decisions about how to act based upon uncertain information is, I would argue, significantly more difficult than if one believes in the infallibility of the facts at hand. How many governments would be able to justify action based upon information they weren’t convinced of?
As ever, my view is that we need to sit somewhere between the two extremes; we can’t take no action at all, but nor can we be certain of our knowledge. There’s precedent here, of course; the various formulations of the Precautionary Principle serve as legal and social frameworks for this grey area. In general terms, action to limit risk and harm shouldn’t be prevented simply because there is uncertainty around the potential for harm. There are stronger formulations of the principle for different contexts, but I think it’s a fair way to balance decision-making.
Interestingly, the way in which this grey area is understood and dealt with is likely to differ around the world. Social studies over the last 50 years (famously Hofstede’s Cultural Dimensions Theory) have looked at how different societies deal with uncertainty, quantifying it with the so-called ‘Uncertainty Avoidance Index’. There are wide variations across regions, some perhaps surprising (Germany and Belgium have a low tolerance for uncertainty, while nearby Denmark and Sweden seem much more tolerant), but perhaps we can learn something from the cultural approach of those countries that are more capable of incorporating uncertainty into their lives. Such cultural mores and tropes may be beneficial in navigating the grey area between action and uncertainty.