That probing of thought processes applies to trusting or distrusting decisions that end lives or cause injury, or even simply to situations where one person or group of persons causes fear in others.
An example here would be where we are told that if we don't agree with the current laws, which we feel inadequately regulate gun possession and especially gun carry, then we are simply failing to trust the inherent goodness of our fellow man that someone else chooses to hypothesize. Speaking here for myself, I don't wish to trust a glittering generality like that, when clearly all of us are a mixture of good and evil as human beings. It is not a matter of trusting human goodness, but of trusting other variables in gun use. To put it in an oversimplification, I don't wish to be SO trusting as to guess which variable, good or evil, might apply to any situation where deadly force is involved. Nor do I wish to trust the wide range of variables in thought processes, good and bad, that I have observed over the years, particularly in crisis judgment situations and in risk assessment.
For purposes of this post, to establish a common foundation for any discussion and comment, and because they do a great job with introductory-level material, I'm going to quote the opening paragraphs from Wikipedia on cognition, and on heuristics, before going further.
In science, cognition refers to mental processes. These processes include attention, remembering, producing and understanding language, solving problems, and making decisions. Cognition is studied in various disciplines such as psychology, philosophy, linguistics, and computer science. Usage of the term varies in different disciplines; for example in psychology and cognitive science, it usually refers to an information processing view of an individual's psychological functions. It is also used in a branch of social psychology called social cognition to explain attitudes, attribution and group dynamics.
The term cognition (Latin: cognoscere, "to know", "to conceptualize" or "to recognize") refers to a faculty for the processing of information, applying knowledge, and changing preferences. Cognition, or cognitive processes, can be natural or artificial, conscious or unconscious. These processes are analyzed from different perspectives within different contexts, notably in the fields of linguistics, anesthesia, neurology and psychiatry, psychology, philosophy, anthropology, systemics, and computer science. Within psychology or philosophy, the concept of cognition is closely related to abstract concepts such as mind and intelligence. Cognition is used to refer to the mental functions, mental processes (thoughts) and states of intelligent entities (humans, human organizations, highly autonomous machines and artificial intelligences).
Heuristic (or heuristics; Greek: "Εὑρίσκω", "find" or "discover") refers to experience-based techniques for problem solving, learning, and discovery. ... In more precise terms, heuristics are strategies using readily accessible, though loosely applicable, information to control problem solving in human beings and machines.

The area of heuristic thinking that I find most applies to the differences in side-choosing over gun issues is cognitive biases. Going, again, to Wikipedia, for a common basis for this discussion to go forward:
A cognitive bias is a pattern of poor judgment, often triggered by a particular situation. Identifying "poor judgment," or more precisely, a "deviation in judgment," requires a standard for comparison, i.e. "good judgment". In scientific investigations of cognitive bias, the source of "good judgment" is that of people outside the situation hypothesized to cause the poor judgment, or, if possible, a set of independently verifiable facts. The existence of most of the particular cognitive biases listed below has been verified empirically in psychology experiments.

Cognitive biases, like many behaviors, are influenced by evolution and natural selection pressure. Some are presumably adaptive and beneficial, for example, because they lead to more effective actions in given contexts or enable faster decisions, when faster decisions are of greater value for reproductive success and survival. Others presumably result from a lack of appropriate mental mechanisms, i.e. a general fault in human brain structure, or from the misapplication of a mechanism that is adaptive (beneficial) under different circumstances.

Cognitive bias is a general term that is used to describe many distortions in the human mind that are difficult to eliminate and that lead to perceptual distortion, inaccurate judgment, or illogical interpretation.

The cognitive biases that most intrigue me as applying to the differences we see here include those categorized as biases in probability and belief, such as:
Anchoring effect – the tendency to rely too heavily, or "anchor," on a past reference or on one trait or piece of information when making decisions (also called "insufficient adjustment").

Attentional bias – the tendency to neglect relevant data when making judgments of a correlation or association.

Availability heuristic – estimating what is more likely by what is more available in memory, which is biased toward vivid, unusual, or emotionally charged examples.

Availability cascade – a self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or "repeat something long enough and it will become true").

Laci recently forwarded to me a radio interview with Daniel Kahneman talking about his book Thinking, Fast and Slow. Unfortunately, Blogspot makes it difficult to include or embed that BBC segment here. Alternatively, I did find this particular review of the book by a colleague of Kahneman to be insightful, and include very small excerpts from that larger review here.
As a card-carrying member of the biases-and-heuristics crowd of the behavioral decision research field, these are the questions I have continually been asked over the years, despite my belief that they were answered conclusively long ago. In accepting an invitation to review Thinking, Fast and Slow (TFS) by Daniel (Danny) Kahneman, I anticipated getting a comprehensive and clear response to these decades-old questions.
The field of behavioral decision research has proven to be remarkably robust, demonstrating effects that have had profound influences on economics, finance, marketing, medicine, law, and negotiation, among other applied fields. Behavioral decision research has diffused to other academic areas faster than any topic in the history of psychology. And Danny has been recognized with the Nobel Prize in Economics, among many other well-deserved awards. But for the past 35 years, one ongoing criticism of the behavioral decision research field, particularly the work focusing on heuristics and biases, is that it doesn't offer enough detail about the psychological mechanisms underlying the fascinating effects it documents. This tension about the nature of the field, and about the nature of evidence needed for journal publication, may be partially responsible for behavioral decision research developing more in professional schools than in psychology departments in recent years. (Of course, there are other explanations as well.) ... Answering the many questions about psychological mechanisms underlying behavioral decision research is at the core of TFS. ...

In that context, I saw a great deal of myself at the same age in the following video, although without glasses:
One of the qualities, besides an interest in how we think and how we make decisions, that I share with Laci is a lifetime of risk-taking behavior. That has made both of us very thoughtful and analytical about our own abilities and processes of risk assessment and response, and it makes us similarly critical of the thought processes of others, which are often sadly superficial.
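As a side note for the more computationally minded, the availability cascade quoted earlier can be sketched as a simple feedback loop. This is my own toy illustration, not anything drawn from Kahneman's work or the Wikipedia article; the reinforcement rule is an arbitrary assumption chosen only to make the self-reinforcing dynamic visible.

```python
# Toy feedback-loop sketch of an "availability cascade": each round of public
# repetition nudges a belief's perceived plausibility upward, and higher
# plausibility in turn drives more repetition, so the belief snowballs.
# The update rule below (logistic-style reinforcement) is a hypothetical
# model, not an established formula.

def availability_cascade(initial_plausibility, rounds, boost=0.5):
    """Track perceived plausibility over successive rounds of repetition."""
    p = initial_plausibility
    history = [p]
    for _ in range(rounds):
        # Repetition volume is proportional to current plausibility;
        # the (1 - p) factor keeps plausibility bounded below 1.0.
        p = p + boost * p * (1.0 - p)
        history.append(round(p, 3))
    return history

print(availability_cascade(0.1, 10))
```

Run as written, the printed history climbs steadily from 0.1 toward, but never past, 1.0, which is the "repeat something long enough and it will become true" pattern in miniature.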