tl;dr: A 2007 book about cognitive dissonance, written by two psychologists, is interesting to me because it came out a little while after the 2003 invasion of Iraq and in the wake of the public’s WMD cope. I think it’s relevant for political work to understand more scientifically how cognitive dissonance works, e.g. in combatting deeply held misconceptions and beliefs, and in conducting self-criticism and recognizing one’s own errors.

Link to book on archive.org

Interview & excerpts:

Why It’s Hard to Admit to Being Wrong: a 2007 NPR interview with one of the authors, plus some excerpts from the book if you scroll down. Less in-depth than the book itself; mainly interesting to me as a little window back into 2007. They take some listener call-in questions.


Recently I have begun researching how and why people believe lies, particularly propagandistic lies and manufactured consent. I came across this book from 2007 called “Mistakes Were Made (But Not by Me): Why we justify foolish beliefs, bad decisions, and hurtful acts.”

The book itself is non-Marxist in its analytical framework and is written by two psychologists.

I’m finding it interesting so far, partly because it was published during W. Bush’s presidency: throughout the book and in interviews about it, the authors make references to Bush’s policies and the then-ongoing Iraq war. At the time, the public was still coming to terms with the knowledge that the WMDs were a false pretense for invasion; people had a fairly fresh memory of the build-up to the 2003 invasion of Iraq and were in the process of rationalizing how they could have believed these things.

That specific issue, and the authors’ own (liberal) stances on it, is not the main focus of the book; I’m just pointing out the time frame because I think it adds an important layer to the book’s content.

The main focus of the book is explaining the research on cognitive dissonance to a lay audience, through a wide range of examples and studies. These are interesting in their own right and, in my opinion, useful information for anyone engaging in political work or trying to combat lies and propaganda that have taken root in people’s minds on various issues (including by increasing awareness of the workings of one’s own mind).

I am not very far into the book, so I cannot properly review it or make a detailed summary yet.

However, some of the authors’ points and claims that have interested me so far are:

  1. The urge to reduce cognitive dissonance is a natural and intense drive that generally operates subconsciously, making it very hard to notice and difficult to overcome. (In the NPR interview I linked, one of the authors says something to the effect of ‘this has shown up in different cultures around the world, everywhere that it has been studied,’ although he doesn’t get more specific than that.)

  2. MRI brain scans showed that when people were exposed to “dissonant or consonant information” about politicians (George Bush and John Kerry), “reasoning areas of the brain virtually shut down when participants were confronted with dissonant information, and the emotion circuits of the brain lit up happily when consonance was restored”. The authors then go on to provide more examples of people either automatically filtering out dissonant information or distorting it to entrench their original belief even further.


Given the ongoing situation… Everywhere… I thought this might be a worthwhile topic to discuss on a Marxist forum, from a dialectical materialist and scientific viewpoint, to the extent that we are able.

My own research right now revolves (as I said) mainly around lies designed to drum up support for and prolong wars, but “cognitive dissonance” is certainly a big topic that touches many other aspects of life too, and any thoughts anyone wants to share on that are welcome.

For example, for anyone running into dead ends and deep-rooted beliefs when talking to non-Marxist friends about political issues, or for those of us with family or friends getting sucked into cults and scams, this may also be useful information.

I also think a scientific understanding of cognitive dissonance and confirmation bias is relevant to the practice of conducting criticism and self-criticism.

I hope if you check out this book you find something in it that is useful to you.

Edit: I also wanted to include an excerpt not found on the NPR link, in case anyone is interested. It’s about one of the studies they conducted, and it’s rather long, so I put it under a spoiler:

spoiler

From p. 16-17

The beauty of an experiment is the random assignment of people to conditions. Regardless of a person’s degree of interest at the outset in joining the group, each participant would be randomly assigned to either the severe-initiation or the mild-initiation condition. If people who go through a tough time to get into a group later find that group to be more attractive than those who get in with no effort, then we know that it was the effort that caused it, not differences in initial levels of interest.

And so Elliot and his colleague Judson Mills conducted just such an experiment. Stanford students were invited to join a group that would be discussing the psychology of sex, but before they could qualify for admission, they would first have to pass an entrance requirement. Some of the students were randomly assigned to a severely embarrassing initiation procedure: They had to recite, out loud to the experimenter, lurid, sexually explicit passages from Lady Chatterley’s Lover and other racy novels. (For conventional 1950s students, this was a painfully embarrassing thing to do.) Others were randomly assigned to a mildly embarrassing initiation procedure: reading aloud sexual words from the dictionary.

After the initiation, each of the students listened to an identical tape recording of a discussion allegedly being held by the group of people they had just joined. Actually, the audiotape was prepared in advance so that the discussion was as boring and worthless as it could be. The discussants talked haltingly, with long pauses, about the secondary sex characteristics of birds—changes in plumage during courtship, that sort of thing. The taped discussants hemmed and hawed, frequently interrupted one another, and left sentences unfinished.

Finally, the students rated the discussion on a number of dimensions. Those who had undergone only a mild initiation saw the discussion for what it was, worthless and dull, and they correctly rated the group members as being unappealing and boring. One guy on the tape, stammering and muttering, admitted that he hadn’t done the required reading on the courtship practices of some rare bird, and the mild-initiation listeners were annoyed by him. What an irresponsible idiot! He didn’t even do the basic reading! He let the group down! Who’d want to be in a group with him? But those who had gone through a severe initiation rated the discussion as interesting and exciting and the group members as attractive and sharp. They forgave the irresponsible idiot. His candor was refreshing! Who wouldn’t want to be in a group with such an honest guy? It was hard to believe that they were listening to the same tape recording. Such is the power of dissonance.

This experiment has been replicated several times by other scientists who have used a variety of initiation techniques, from electric shock to excessive physical exertion. The results are always the same: Severe initiations increase a member’s liking for the group. These findings do not mean that people enjoy painful experiences, such as filling out their income-tax forms, or that people enjoy things because they are associated with pain. What they do show is that if a person voluntarily goes through a difficult or a painful experience in order to attain some goal or object, that goal or object becomes more attractive. If, on your way to join a discussion group, a flowerpot fell from the open window of an apartment building and hit you on the head, you would not like that discussion group any better. But if you volunteered to get hit on the head by a flowerpot to become a member of the group, you would definitely like the group more. (p. 16-17)

Another reference to it on p. 24

People want to believe that, as smart and rational individuals, they know why they made the choices they did, so they are not always happy when you tell them the actual reason for their actions. Elliot learned this firsthand after that initiation experiment: “After each participant had finished,” he recalls, “I explained the study in detail and went over the theory carefully. Although everyone who went through the severe initiation said that they found the hypothesis intriguing and that they could see how people would be affected in the way I predicted, they all took pains to assure me that their preference for the group had nothing to do with the severity of the initiation. They each claimed that they liked the group because that’s the way they really felt. Yet almost all of them liked the group more than any of the people in the mild-initiation condition did.” (p. 24)


  • KiG V2
    2 years ago
    spoiler

    Wow, this is very interesting. This is very similar to my own theory as to cognitive dissonance, why people refuse to change their minds in the face of new information, etc.

    Basically, in my own words: they’re too invested. Depending on what group we’re talking about, they may have spent thousands of precious hours of their lives getting this far, they might have spent a real chunk of their wages, they might have squandered real opportunities, they may have sacrificed real friends or family. Changing their mind on even one small thing is unaffordable, because the logical conclusion would be the death of their preconceived understanding and a total collapse not only of all their hard work and sacrifice but of their most fundamental worldviews, and a plummet into the unknowable; it’s a snowball effect, where one small change of heart leads to another and then 5 more and then 10, and I believe people understand this subconsciously.

    Depending on your specific scenario, such a collapse might indeed lead to a quite literal death. But more often than not, I wish people knew how much better their lives would be if they burnt down the rotten structure their lives are built upon, and that starting anew seems scarier than it really is. Of course, building up the habit early is important: the longest I have harbored a poor way of thought was probably a decade, whereas some people, namely middle-aged adults or older, might have had their thinking unchallenged for decades and decades, and thus the crash when it all falls down is much more severe; I can imagine it might cause a mental breakdown. I know I myself had to be coaxed into shriekingly high-stakes emotions before I finally could start to take ownership of the truths that were always lurking underneath the surface…although I felt much, much happier and healthier following. I wish I could tell everybody afraid of changing their minds that the death they fear is an illusion, that there is quite likely little to nothing to be afraid of, that it will be a net positive in little to no time.

    I believe this all links back to the fear of death, which in my personal philosophical/psychological theory can be linked to almost all behaviors as the #1 root cause (sex drive being a far off second place)–I’m sure many people probably conceive of the world of humankind in much the same way.

    It’s been what I’ve believed for some time now, but it’s very cool to see someone conceiving of an experiment to measure it effectively; I might never have thought of such an idea. Thank you for sharing! Very interesting.

    • @afellowkidOP
      2 years ago
      spoiler

      Thanks for your response.

      The way you describe it, as people being “too invested” and a “snowball effect”, tracks very well with what I have read in this book so far. I don’t know if you checked out the NPR link, but the text excerpt found there is a good illustration of the “too invested” aspect.

      The tl;dr of it is that there was a person with an end-of-the-world prophecy who had a small following, saying the world was going to end on Dec. 21, and that on Dec. 20 all of her followers would be picked up in a spaceship and taken to safety. In preparation for this event, the followers had quit their jobs and given away their homes and all their money. A social psychologist (a former professor of one of the authors, iirc) went to their meeting on Dec. 20th to watch what would happen to them when (he hoped) their prediction turned out to be false.

      At midnight, with no sign of a spaceship in the yard, the group felt a little nervous. By 2 a.m., they were getting seriously worried. At 4:45 a.m., Mrs. Keech had a new vision: The world had been spared, she said, because of the impressive faith of her little band. […] The group’s mood shifted from despair to exhilaration. Many of the group’s members, who had not felt the need to proselytize before December 21, began calling the press to report the miracle, and soon they were out on the streets, buttonholing passersby, trying to convert them.

      Like you said, I imagine going in so deep on something like that could cause a breakdown, and so the brain tries to protect you from that breakdown at all costs, even shutting down your ability to reason if need be. As the authors explain, this urge we have to soothe cognitive dissonance does actually serve a good function much of the time. It prevents us from losing sleep every night over various bad outcomes and allows us to make peace with our choices without constant regret and second-guessing. There are plenty of situations where that really is the healthiest thing to do (in my opinion). But this mechanism can’t really distinguish between majorly and minorly harmful mistakes; its urge is just to immediately begin the self-justification process any time dissonance appears (my paraphrasing of what I took the authors to mean).

      it’s a snowball effect, where one small change of heart leads to another and then 5 more and then 10

      There is a section later in the book where they try to explain how people can take actions that eventually run counter to their principles. For example, why would a doctor start accepting gifts and then prescribing pills at the behest of a drug company, if he truly considers himself an honest and principled person? How could a medical ethics board allow itself to be taken over by the interests of pharmaceutical companies, if the people on the board genuinely believed they were making ethical decisions the whole time? As you said, it happens step by step, one small change at a time, and the authors give specific examples of such processes, where someone ends up more and more deeply entrenched in a certain path of action and belief.

      Edit: From some of my notes on this book, mainly using quotes from the book:

      Making a choice, when the right decision is not clear, "starts a process of entrapment–action, justification, further action–that increases our intensity and commitment, and may end up taking us far from our original intentions or principles. […] The Milgram experiment shows us how ordinary people can end up doing immoral and harmful things through a chain reaction of behavior and subsequent self-justification. When we, as observers, look at them in puzzlement or dismay, we fail to realize that we are often looking at the end of a long, slow process."

      /Edit


      I know I myself had to be coaxed into shriekingly high-stakes emotions before I finally could start to take ownership of the truths that were always lurking underneath the surface…although I felt much, much happier and healthier following. I wish I could tell everybody afraid of changing their minds that the death they fear is an illusion, that there is quite likely little to nothing to be afraid of, that it will be a net positive in little to no time

      This is a great way of putting it. I have been there as well, and I feel similarly. I think it’s probably impossible to ever 100% escape this tendency (the authors of this book also seem to think so), but I think developing a better relationship with it is definitely possible and worthwhile.