4 Aug 2021

Breeur (2.5) Lies – Imposture – Stupidity, Ch.2.5, “Reduction to Simplicity”, summary

 

by Corry Shores

 


 

[Central Entry Directory]

[Roland Breeur, entry directory]

[Breeur, Lies – Imposture – Stupidity, entry directory]

 

[The following is a paragraph by paragraph summary of Breeur’s text. Boldface, underlining, and bracketed commentary are my own. Proofreading is incomplete, so please forgive my mistakes. The book can be purchased here.]

 

 

 

 

Summary of

 

Roland Breeur

[Breeur’s academia.edu page and researchgate page]

 

Lies – Imposture – Stupidity

 

Part 1

Lies and Stupidity

 

Ch.2

Alternative Facts and Reduction to Stupidity

 

2.5

“Reduction to Simplicity”

 

 

 

 

 

 

Brief summary (collecting those below):

(2.5.1) (Recall from section 2.4 how industries, in the face of a preponderance of scientific findings that threaten their profits, might fund their own counter-research efforts. The purpose is to fabricate a “scientifically legitimized” opposing view, so as to hinder adverse public policy and public opinion.) In response to industry’s clouding of scientific debate, one response is to insist on the better, more factual truth. But this would only work if the industry-funded scientists were simply lying. Instead, their efforts to cloud the debate only create “a context in which truth ceases to be of value” (46). (Perhaps the idea is the following. When both sides are given equal weight in the public discourse, all while only one is based in a genuine effort to conduct unbiased scientific research, then true and false statements, or good and bad scientific research, are regarded as equivalent. As such, the true and good knowledge is lowered to the level of an alternative opinion, view, or interpretation of the data, rather than being something with special merit.) This, then, involves the “reduction to stupidity” (see section 2.2.5, and summary at section 2.3.1) (where discourse is “reduced to hot air” on account of people holding on to inferior views), which prevents the superior ones from propagating and having persuasive effect. (2.5.2) One reaction to this strategy involves efforts to fight the bad, false, and misleading industry-funded research by replacing it with good, true research. Breeur calls this “stupidity as disproportion.” Yet the reason so many people disbelieve good science (for example, the overwhelming scientific evidence for human-caused climate change) has to do with what they are willing to believe. Many people simply do not want to believe the truth. For instance, they may fall victim to confirmation bias: “i.e. people’s tendency to believe and value above all facts and information that confirm their cherished premises and presuppositions – or, negatively put, people’s unwillingness to accept facts and evidence that contradict their cherished premises and presuppositions” (46). (2.5.3) There are a number of cognitive phenomena that might explain why people tend toward such irrational thinking: “In the case of the backfire effect, a person will not only refuse to accept the evidence challenging his or her beliefs, they will even feel strengthened in their beliefs by evidence to the contrary, ‘doubling down’ on their cherished beliefs. In the case of the Dunning-Kruger effect, also known as the ‘too stupid to know they’re stupid’ effect, people referred to as ‘low-ability subjects’ fail to recognize their own intellectual deficiencies” (47). (2.5.4) Some (including Lee McIntyre) use such cognitive phenomena to explain post-truth phenomena. What is puzzling is that the tendency to believe truth rather than falsehoods should increase our survival chances. So (from an evolutionary standpoint) it is unclear why these cognitive biases are so prevalent. It is also found that conservatives tend to exhibit the “negativity bias,” as they “seem more inclined to believe threatening falsehoods than liberals” (47). McIntyre speculates that this may be explained by the fact that conservatives tend to have larger amygdalae. (2.5.5) Breeur notes that even McIntyre’s thinking has its own biases. For instance, it is “politically polarized vis-a-vis the Democratic left versus the Republican right” (48). Also, “McIntyre’s reference to this mysterious amygdala – which is part of the so-called ‘limbic system’ – is as perplexing as that of Descartes’s references to the pineal gland” (48). On the one hand, McIntyre aims to be scientifically rigorous in order to unearth the causes for why we are unwilling to accept inconvenient yet scientifically verified truths; on the other hand, the scientific research he uses, although ingenious and inventive, nonetheless finds just trivial, banal truths. There is a striking discrepancy between these two factors (which demonstrates the futility of McIntyre’s approach). Yet, “It is on the basis of such banalities that a whole discourse has been created, which from the Olympian altitude of its ‘objective facts’ and ‘experimental evidence’ valiantly commits itself to fight against the impostors in power” (48). (2.5.6) Because this scientific response is so “full of clichés or simplifications,” it can be perplexing and in the end prove futile, weak, and stupid in the face of the “proliferation of alternative facts” (49). The problem with this approach is that it does not realize that not all truths have the same value, as some prove to be too simplistic or banal to have any real weight or effect. “This is the reduction to simplicity: Regardless of their truth status, the value of truth claims can be (if they are not often) so simplistic, so banal, so trivial, etc., that it is difficult to understand how anyone could believe in their efficacy. The anthropological and metaphysical models at the core of many theories and programs are so simplistic and vain that one cannot see how they could be invoked to fight against the indifference to truth that permeates contemporary society” (49).

 

 

 

Contents

 

2.5.1

[The Failure to Combat Industry Misinformation Campaigns by Insisting on Better Science]

 

2.5.2

[Confirmation Bias]

 

2.5.3

[Some Cognitive Phenomena Involved in Science Denial]

 

2.5.4

[The Negativity Bias of Conservatives]

 

2.5.5

[The Banality of Science Insistence]

 

2.5.6

[Reduction to Simplicity]

 

Bibliography

 

 

 

 

 

 

 

 

 

Summary

 

2.5.1

[The Failure to Combat Industry Misinformation Campaigns by Insisting on Better Science]

 

[(Recall from section 2.4 how industries, in the face of a preponderance of scientific findings that threaten their profits, might fund their own counter-research efforts. The purpose is to fabricate a “scientifically legitimized” opposing view, so as to hinder adverse public policy and public opinion.) In response to industry’s clouding of scientific debate, one response is to insist on the better, more factual truth. But this would only work if the industry-funded scientists were simply lying. Instead, their efforts to cloud the debate only create “a context in which truth ceases to be of value” (46). (Perhaps the idea is the following. When both sides are given equal weight in the public discourse, all while only one is based in a genuine effort to conduct unbiased scientific research, then true and false statements, or good and bad scientific research, are regarded as equivalent. As such, the true and good knowledge is lowered to the level of an alternative opinion, view, or interpretation of the data, rather than being something with special merit.) This, then, involves the “reduction to stupidity” (see section 2.2.5, and summary at section 2.3.1) (where discourse is “reduced to hot air” on account of people holding on to inferior views), which prevents the superior ones from propagating and having persuasive effect.]

 

[ditto. (Recall the “reduction to stupidity” from section 2.2.5. Breeur was discussing how insisting on one’s opinions is an instance of stupidity, because one persists with faulty views even when confronted by superior ones. It would be better to be flexible and drop bad judgments in favor of more informed and considered ones, and then to espouse those better ones instead. This would facilitate the flow and prosperity of truth, which involves development, refinement, adaptation, and so forth. When instead that flow is blocked because some people insist on keeping their faulty opinions in the face of other people’s better ones, this depletes those superior views of their power to flow, propagate, and influence other people’s minds. In that way, the better views are “reduced to hot air,” so to speak, and the general discourse is reduced to inferior judgments. This reduction of the truth-flow power of evolving judgments is the “reduction to stupidity.”)]

How to react against these forms of strategic and structural castings of doubt and confusion? There is a tendency, shared by McIntyre, to focus and remain fixated on the problem of truth. Indeed, what these fake experts do is create fake information – that is, they lie. But on such a socially extended and structurally implemented level, the reclaiming of truth seems | at best misguided, for these fake experts actually do not lie – they merely create a context in which truth ceases to be of value. Hence the reduction to stupidity.

(45-46)

[contents]

 

 

 

 

 

 

2.5.2

[Confirmation Bias]

 

[One reaction to this strategy involves efforts to fight the bad, false, and misleading industry-funded research by replacing it with good, true research. Breeur calls this “stupidity as disproportion.” Yet the reason so many people disbelieve good science (for example, the overwhelming scientific evidence for human-caused climate change) has to do with what they are willing to believe. Many people simply do not want to believe the truth. For instance, they may fall victim to confirmation bias: “i.e. people’s tendency to believe and value above all facts and information that confirm their cherished premises and presuppositions – or, negatively put, people’s unwillingness to accept facts and evidence that contradict their cherished premises and presuppositions” (46).]

 

[ditto]

As a reaction to this reduction, however, a new form of stupidity emerges in the form of the aforementioned misguided and inappropriate emphasis on truth. This is what happens when media outlets stress the importance of “fact checking” in a domain where, as a matter of fact, facts do not matter. What McIntyre does is a bit different. He endeavors to fight bad research with good research, i.e. to replace the false with the true. In this process is generated a new form of stupidity which could be termed stupidity as disproportion. By way of a beginning, consider the following urgent question: Why are so many people blind to the truth? Why, for example, do so many people go around denying climate change? Why do they refuse to listen to reason? Why do they reject out of hand the wise, well-intentioned, and edifying sermons of the scientists? Here, McIntyre appeals to the study of psychological mechanisms as they have recently been investigated by cognitive and behavioral psychologists. The basic idea is simple: The explanation is that many people do not want the truth. One can circumvent this essentially tautological formulation with reference to the notion of “confirmation bias,” i.e. people’s tendency to believe and value above all facts and information that confirm their cherished premises and presuppositions – or, negatively put, people’s unwillingness to accept facts and evidence that contradict their cherished premises and presuppositions.

(46)

[contents]

 

 

 

 

 

 

2.5.3

[Some Cognitive Phenomena Involved in Science Denial]

 

[There are a number of cognitive phenomena that might explain why people tend toward such irrational thinking: “In the case of the backfire effect, a person will not only refuse to accept the evidence challenging his or her beliefs, they will even feel strengthened in their beliefs by evidence to the contrary, ‘doubling down’ on their cherished beliefs. In the case of the Dunning-Kruger effect, also known as the ‘too stupid to know they’re stupid’ effect, people referred to as ‘low-ability subjects’ fail to recognize their own intellectual deficiencies” (47).]

 

[ditto]

Intellectuals, for instance, claim truth in the name of reason and science in their fight against all forms of “beliefs” nourished by emotions or feelings (i.e. the subjective), and they call upon science (in McIntyre’s case, upon cognitive psychology) to understand the mechanisms of the brain that explain tendencies towards “irrationality.” McIntyre himself, in an effort to understand the current electorate’s lack of rationality, refers to a number of tests carried out by psychologists that | reveal myriad “fascinating cognitive biases” from the “backfire effect” to the “Dunning-Kruger effect.” In the case of the backfire effect, a person will not only refuse to accept the evidence challenging his or her beliefs, they will even feel strengthened in their beliefs by evidence to the contrary, “doubling down” on their cherished beliefs. In the case of the Dunning-Kruger effect, also known as the “too stupid to know they’re stupid” effect, people referred to as “low-ability subjects” fail to recognize their own intellectual deficiencies.46

(46-47)

46. McIntyre, Post-Truth, pp. 48-51.

(47)

[contents]

 

 

 

 

 

 

2.5.4

[The Negativity Bias of Conservatives]

 

[Some (including Lee McIntyre) use such cognitive phenomena to explain post-truth phenomena. What is puzzling is that the tendency to believe truth rather than falsehoods should increase our survival chances. So (from an evolutionary standpoint) it is unclear why these cognitive biases are so prevalent. It is also found that conservatives tend to exhibit the “negativity bias,” as they “seem more inclined to believe threatening falsehoods than liberals” (47). McIntyre speculates that this may be explained by the fact that conservatives tend to have larger amygdalae.]

 

[ditto]

These mechanisms and effects have led McIntyre (among others) to attempt to connect them with, or use them to explain, the post-truth phenomenon. Basically, the hope is to find some answer for the consternating fact that we (rational animals) cannot see that “believing the truth increase[s] our chances for survival.” What are the mental constraints that rob our brain of the ability to think clearly? That is, why are people irrational? McIntyre does not offer any answers; he simply concludes, on a dour note of resignation, that, “for whatever reason, we must recognize that a plethora of cognitive biases are just part of the way our brains are wired.”47 Rather than try to “solve” the “mystery” of irrationality, McIntyre explores its vicissitudes, though always with reference to the science of the brain. Some cognitive biases, McIntyre teaches us, function differently depending on our political beliefs. With reference to the work of anthropologist Daniel Fessler, who investigated what may be referred to as “negativity bias,”48 in order to demonstrate why conservatives seem more inclined to believe threatening falsehoods than liberals, McIntyre contends that this phenomenon is explicable when one considers the “experimental evidence” which indicates | that the “fear-based amygdala tends to be larger in conservatives than in liberals.”49

(47-48)

47. McIntyre, Post-Truth, pp. 48-51.

48. McIntyre, Post-Truth, p. 57.

(47)

49. McIntyre, Post-Truth, p. 58. Research on the so-called “partisan brain” is clearly in vogue at the moment. See, for example, the research conducted by Andrea Pereira and Jay Van Bavel (“The Partisan Brain: An Identity-Based Model of Political Belief,” Trends in Cognitive Sciences 22.3 (2018), 213-224), or Jordan B. Peterson’s (et alii) recent papers (to quote one of them: Shona Tritt, Michael Inzlicht, & Jordan Peterson, “Preliminary Support for a Generalized Arousal Model of Political Conservatism,” PLoS ONE 8, e83333, doi: 10.1371/journal.pone.0083333).

[contents]

 

 

 

 

 

 

2.5.5

[The Banality of Science Insistence]

 

[Breeur notes that even McIntyre’s thinking has its own biases. For instance, it is “politically polarized vis-a-vis the Democratic left versus the Republican right” (48). Also, “McIntyre’s reference to this mysterious amygdala – which is part of the so-called ‘limbic system’ – is as perplexing as that of Descartes’s references to the pineal gland” (48). On the one hand, McIntyre aims to be scientifically rigorous in order to unearth the causes for why we are unwilling to accept inconvenient yet scientifically verified truths; on the other hand, the scientific research he uses, although ingenious and inventive, nonetheless finds just trivial, banal truths. There is a striking discrepancy between these two factors (which demonstrates the futility of McIntyre’s approach). Yet, “It is on the basis of such banalities that a whole discourse has been created, which from the Olympian altitude of its ‘objective facts’ and ‘experimental evidence’ valiantly commits itself to fight against the impostors in power” (48).]

 

[ditto]

One may wonder to what kind of cognitive bias this kind of philosophical thinking, which encourages us to exercise our “critical” minds, has succumbed.50 First and foremost, McIntyre’s entire discourse (symptomatic of virtually all debates in the United States) is politically polarized vis-a-vis the Democratic left versus the Republican right. Second, for a philosopher claiming the almost absolute value of “facts validated by science,” McIntyre’s reference to this mysterious amygdala – which is part of the so-called “limbic system” – is as perplexing as that of Descartes’s references to the pineal gland. To my mind, the most disconcerting aspect of McIntyre’s discourse is the disproportion that appears between the proclaimed aims and ambitions of his reflections (namely, our inability or unwillingness to accept inconvenient truths) and the kind of explanations supposed to explain this anomalous irrationality (namely, scientific explanations, and specifically explanations for which the ingenuity or inventiveness of the “experiments” seems to mask the banality, even the triviality, of the truths investigated). It is on the basis of such banalities that a whole discourse has been created, which from the Olympian altitude of its “objective facts” and “experimental evidence” valiantly commits itself to fight against the impostors in power.

(48)

50. Cf. Ibidem, p. 57.

(48)

[contents]

 

 

 

 

 

 

2.5.6

[Reduction to Simplicity]

 

[Because this scientific response is so “full of clichés or simplifications,” it can be perplexing and in the end prove futile, weak, and stupid in the face of the “proliferation of alternative facts” (49). The problem with this approach is that it does not realize that not all truths have the same value, as some prove to be too simplistic or banal to have any real weight or effect. “This is the reduction to simplicity: Regardless of their truth status, the value of truth claims can be (if they are not often) so simplistic, so banal, so trivial, etc., that it is difficult to understand how anyone could believe in their efficacy. The anthropological and metaphysical models at the core of many theories and programs are so simplistic and vain that one cannot see how they could be invoked to fight against the indifference to truth that permeates contemporary society” (49).]

 

[ditto]

The potential “eugenic” implications notwithstanding vis-a-vis McIntyre’s emphasis on neurology and the amygdalae of conservatives, there is an imbalance between the urgency and the complexity of the problems identified | and the response provided. This response – shared by the well-meaning academic majority of “defenders of the Enlightenment” – is full of clichés or simplifications that would leave any more or less assiduous reader perplexed, so that, whether or not there is scientific validity in the experiments adduced by McIntyre, the posturing in opposition to the proliferation of alternative facts is futile, weak – in a word, stupid. This position is predicated on the belief that truth is a value powerful enough to eradicate the false and the fake, but this position betrays a misconception, for truth is not a value “in itself”; rather, “value” is a criterion to which “truth” must be submitted. This is the reduction to simplicity: Regardless of their truth status, the value of truth claims can be (if they are not often) so simplistic, so banal, so trivial, etc., that it is difficult to understand how anyone could believe in their efficacy. The anthropological and metaphysical models at the core of many theories and programs are so simplistic and vain that one cannot see how they could be invoked to fight against the indifference to truth that permeates contemporary society.

(48-49)

[contents]

 

 

 

 

 

 

 

 

 

 

 

Bibliography:

Breeur, Roland. Lies – Imposture – Stupidity. Vilnius: Jonas ir Jakubas, 2019.

McIntyre, Lee. Post-Truth. Cambridge, MA: The MIT Press, 2018.

The book can be purchased here.

 

Breeur’s academia.edu page and researchgate page.
