“In this era, it is common for people not to accept basic scientific data. This has now permeated society so much that people feel empowered to discount facts. My goal is to re-orient our group and say we have to look at the facts. We have to look at the science and really be willing to question our own beliefs. A responsible physician does what is best for their patients, and this should be based on evidence, not beliefs,” said James B Spies (Washington, DC, USA) who delivered the Honorary Lecture at the Global Embolization Cancer Symposium and Technologies (GEST US; 17–20 May, Miami Beach, USA).
Speaking on the topic “Preconceived notions, alternative facts and fake news”, Spies, professor of Radiology and chair of the Radiology Department at Georgetown University Medical Center, asked the questions: why do we believe what we believe? What can we do to base our understanding on objective facts? And, how do we evaluate new information about what we believe?
He explained that developing valid comparative trial data with all bias minimised has been an important goal in his career, which started out in public health before moving to the field of interventional radiology. He now believes it is “just as important to analyse how the results of such trials are interpreted and received”.
“We are trying to gather evidence using research to be able to determine what treatments work and what do not. I spent a lot of energy on study design and conducting trials in interventional radiology. But it has dawned on me, over the last several years, that there is the other half of it, which is that you can do the greatest research, but if people do not receive it in the right way, then the message does not get through. So what I tried to focus this talk on was: how is it that we receive these data, and what are our natural, inborn biases that I, and we all, carry, that lead us to question data that we do not agree with?”
Spies pointed to a whole host of biases that influence the acceptance of information. “There is a whole sociology out there regarding how people receive information. There are various types of biases that are at play when we take in information and make decisions. An important one is ‘confirmation bias’ in which we wish for something to be true, but do not know it to be so. Once a view is formed, we only embrace information that confirms that view, while ignoring, or rejecting, information that casts doubt on it. This can lead to the backlash effect, when people strike back when receiving information that challenges their set notions. So my talk also looked at how we use biases that we have in order to support our preconceptions.”
Speaking about how interventionists function, Spies explained: “We believe in what we are doing. We prefer action to inaction, even action with little chance of success is preferred over no action at all. We are also a pragmatic group and see apparent
cause-effect relationships, even in the absence of any theoretical foundation.”
Coupled with these traits is the fact that as a group “we are highly subjective and depend more on ‘gut feelings’ than
on ‘book knowledge’. We emphasise uncertainty in our defence. When things go wrong, it is not our fault. Because we deal with individual patients, rather than groups, we cannot rely on epidemiology concepts, or probabilities derived from population statistics. The trouble is we tend to think in anecdotes. How often do we hear ‘one patient that I treated did really well…’ Of course, we are obligated to treat patients one at a time and that is part of it. However, we cannot, as a set of societies, pay for everything. We tend to be procedure-oriented people, so when technologies do not work, we should stop doing them. In the aftermath of practice-changing trials such as the ATTRACT trial, we have seen this,” Spies reported.
Spies further weighed in to say: “Although these traits usually serve us well, they invite intellectual gerrymandering. So, when the evidence conflicts with our judgment, we tend to resist it, but when the evidence is consistent with our judgment we tend to embrace it. As a result, if there is even a 1% chance that some technologic advance is marginally better than the status quo, we act as if it is a certainty and discount the downside”.
Adopted, touted, doubted and discarded
“We interventionalists share all the above traits,” explained Spies. “We accept positive new trials based on the thinnest of evidence. We expect new interventions to be successful, so we are ‘early adopters’ and we tend to focus on the positive and minimise the negative. This is borne out in our adoption of technologies such as YAG lasers, Simpson atherectomy, bare metal stents for benign biliary strictures, or even angioplasty for chronic cerebrospinal venous insufficiency. These were all adopted, touted, doubted, then discarded and in some cases resurrected. Once we adopt something, we believe in it and we defend it,” he commented.
Spies quoted American writer Upton Sinclair, who said: “It is difficult to get a man to understand something, when his salary depends on his not understanding it”, but qualified this, saying: “I would like to think that money is not the primary reason that
people are resistant to data that they do not like; there are something like 20 different biases that come into play when we make a decision. Still, it is natural that when your livelihood and
financial well-being are threatened, the tendency is to question the source. To some extent maybe that is also healthy, because when we innovate and disrupt the work of other specialties, they then question us and it makes us do our work more rigorously; this is the positive side of the coin. But clearly a part of this [rejection of new information] is economic self-interest, or protection of what we do. But the truth is no one I know does this purely for money; we believe in what we do. So really, we need to learn more about how we can overcome the obstacles to accepting new facts.”
An open mind
Reiterating his key message, Spies stated: “We resist data that contradicts our beliefs. There is a misconception that your opinions are the result of years of rational, objective analysis. The truth is that your opinions are the result of years of paying attention to information which confirmed what you believed, while ignoring information which challenged your preconceived notions. It takes strong and convincing evidence to change our preconceived notions. And our first reaction to such evidence is to doubt it. Progress is measured by our willingness to change our views based on new evidence, so come to the table with an open mind.”