Trauma and an authoritarian upbringing can lead to rigid and narrow thinking and a need for absolute certainty
What if there was a cure for fanaticism? How would you persuade a true believer to submit to treatment?
Neural pathways of dogmatism
Rigid, concrete thinking
Damaged neural networks
Links between trauma and impaired categorizing
Certainty is an interesting sensation that may aid decision-making, but it can also lead us to invest in false assumptions. It is the feeling that allows us to believe or value evidence, and to successfully predict events from what we already know. When we try to recall the past, it is what allows us to distinguish dream or story from events we actually lived. The feeling of certainty, however, is still just a feeling attached to our perception. When we perceive a color, for example, the color itself is independent of the feelings we may have about that color. Similarly, truth is independent of the feelings we may have about it. Most “false” beliefs cannot simply be “reasoned” away. Changing one’s view usually takes more than a cognitive shift; the emotional investments and associations must also shift along with the “cognition.” Moreover, the prefrontal cortex creates justifications for our opinions after we have formed them.
Deeply internalized belief becomes part of one’s identity and intuition. The source of this belief may even be forgotten or unrecognized. Once an erroneous belief is accepted, it becomes more difficult to supplant. An unconscious belief can influence how we think and can subtly change our perception of a situation or issue. In fact, mere exposure to an idea can alter an attitude or opinion. Complex neural networks can become rigidly ingrained when positive or negative experiences occur repeatedly or when highly traumatic experiences happen.
Extreme beliefs are a matter of degree: they involve the same processes by which the human brain normally perceives stimuli, categorizes, prioritizes, and generates a sense of meaning and a narrative about the world to live by.
It is difficult for indoctrinated individuals to see their own impairments.
If you were “brainwashed,” how would you know it? True believers rarely consider themselves fanatical—they believe they are reasonable and clearheaded. Why should such individuals shift their thinking if they are certain that their particular brand of absolutist beliefs is unquestionably true? The condition itself precludes the desire to change.
This is the inherent nature of dogmatic certainty. When one’s sense of self and feeling of certainty are so thoroughly identified with particular beliefs, questioning those beliefs is equated with denying reality. Ingrained belief can make it impossible to step outside one’s logical loop. When individuals are deeply indoctrinated, they develop rationales for how they arrived at their truth, fabricating a history and overinvesting credibility in authority figures.
Note: All humans have similar tendencies to invest in and identify with cherished beliefs, to rationalize maintaining one's beliefs and to be influenced by unconscious biases and assumptions. Indoctrination and dogmatism are often a matter of degree and intensity.
Various brain imbalances appear to correspond to dogmatism and rigid thinking
Brain activity patterns on functional scans, for example, can indicate biological markers for obsessive, delusional or concrete thinking, perceptual impairments such as problems with perceiving social cues, impulse control and rage disorders, or traumatic imprints.
Forensic psychiatrists study how tyrants, cult leaders, and violent fanatics are formed, and the political and psychological conditions in which they thrive. They do not arise in a vacuum.