[00:00:14]

Rationally Speaking is a presentation of New York City Skeptics, dedicated to promoting critical thinking, skeptical inquiry, and science education. For more information, please visit us at NYCSkeptics.org. Welcome to Rationally Speaking, the podcast where we explore the borderlands between reason and nonsense. I am your host, Massimo Pigliucci, and with me, as always, is my co-host, Julia Galef. Julia, what's our topic today?

[00:00:48]

Massimo, today we have a guest joining us by phone. I'd like to welcome Carol Tavris. She's a social psychologist who has taught at UCLA, and also an author. She has written for publications including The New York Times and The L.A. Times, is the author of a recently re-released book called Psychobabble and Biobunk, and is also the co-author of another recent volume called Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts.

[00:01:19]

That's co-authored with Elliot Aronson. Carol, welcome. Very happy to be with you guys.

[00:01:25]

Well, so here's my first question. When I first started getting interested in philosophy, one of my favorite philosophers was, and to some extent still is, Aristotle. Aristotle famously said that the thing that differentiates human beings from any other animal is that we are the rational animal. But after reading your book, I have to conclude that we are more accurately described as the rationalizing animal. Is that a fair assessment?

[00:01:52]

That is exactly right. And in fact, that's the term that Elliot Aronson himself used when he was first working with and developing the theory of cognitive dissonance. You know, it is so interesting: philosophy really gave us the field of critical thinking. When my colleagues and I were first working in psychology, writing psychology textbooks, critical thinking was all the rage, and the question was, how can we get students to think critically and scientifically? But in fact, the study of psychology teaches us not just how we can, but why we so often don't: what the barriers are to thinking rationally, critically, and skeptically.

[00:02:35]

And so that's really been my life's interest: communicating what we know best from science that could be of great help to human beings, and then asking why we are so annoyingly resistant to changing our minds and using new information.

[00:02:55]

But do you think that, you know, one of the obvious objections that might come to mind for one of your readers is: well, if we do rationalize all the time while pretending, on the other hand, that we're engaging in rational thinking, doesn't that message sort of undermine the effort to make people think more reasonably, to use reason? Because somebody could take a skeptical position at this point and say, well, you know, the human mind is just not made that way.

[00:03:23]

We're all rationalizing: when you think you're being reasonable and rational, in fact you're just making up stuff as you go.

[00:03:30]

Well, it's a wonderful question, but in fact, I think that if we have an understanding of the mind's biases, the ways we misperceive things, the ways our memories can be flawed, our need to keep our beliefs coherent, and so forth, if we understand the fundamental biases of the mind, then we can compensate for them. We can correct for them. I regard this as exactly the same as having a blind spot in vision. All good drivers had better be aware of that blind spot when backing up, and we learn to correct for it.

[00:04:04]

And I think in the same way, understanding our mind's blind spots makes us aware of them and helps us compensate for them. So let me just say what the rationalizing animal is, what we mean by this. I'm sure you've talked often on your show about biases in perception, such as the confirmation bias: why it is we're designed to see and remember and believe things that confirm what we believe, and to ignore and forget and overlook information that is discrepant with what we believe.

[00:04:42]

This is one of the most common biases in the way humans process information. But when you know that, when you know that after you buy a car you are going to start looking at all the reasons for the wisdom of your decision and ignore any possible evidence that you were wrong, it's a way of helping us think more carefully and skeptically about the decisions we make and how we think about those decisions afterward.

[00:05:17]

One of the things that I think is most important about what we've tried to do in this book, Mistakes Were Made, is to show what happens, how the mind works, after we make a decision, whether a moral decision or a political decision, or take a position on a particular belief, whether it's medical or nutritional or whatever it might be: we then justify, or if you will rationalize, that choice, that decision, that position, and start looking for the evidence that shows us that we were right in that choice and that decision.

[00:05:59]

And the further along that path we go, the more difficult it becomes to change direction. This is not just about beliefs that aren't important to us. Elliot's great contribution in the theory of cognitive dissonance was to show that it is most painful, it's most difficult, we are most motivated to get rid of information we don't like, when it questions something essential in our view of ourselves. If we see ourselves as competent, smart, and kind, we are not going to be happy accepting evidence that we just did something stupid, foolish, and hurtful.

[00:06:41]

That's really the centerpiece of our argument. So it's not rationalizing or lying to other people to get off the hook for something we know we did wrong, so that we won't be punished or lose our job. It's a way we have of lying to ourselves, to preserve our beliefs in our own competence and kindness.

[00:07:05]

Carol, what I thought was especially disturbing about your book, and I mean disturbing in the best possible way,

[00:07:13]

and in the most interesting way, was that it's not only that we look for evidence to confirm that we made the right decision or hold the right belief; we actually try to manufacture that evidence ourselves. If we are cruel to someone, we not only look for evidence that it was the right thing to do, that they deserved our mistreatment, but that actually makes us more likely to mistreat them in the future, to sort of bootstrap our way to feeling like we did the right thing.

[00:07:43]

Yes, that's exactly right. If I am a kind person and a good person, and now I have mistreated another person, well, I wouldn't mistreat somebody who didn't deserve it; therefore, they provoked it, they started it, it's their fault. And so we preserve that justification, and it takes us very far along the path. You see this whether it's between couples or between nations: the more we blame the other person in order to preserve our belief in our own righteousness, the more difficult it becomes for us to own up to our own mistakes, our own contribution to the problem.

[00:08:28]

I got a very touching letter not long ago from a man who told me that he and his brothers and sisters had been engaged in one of these horrible family rifts, you know, fighting over money in the estate and so forth. And he said, when your book first came out, I gave it to the mediator to give to them, to show them why they were so wrong and would they please own up to their mistakes. And, he said, for some reason they paid no attention to me.

[00:08:54]

He said, and then I read your book, and I thought: oh. Oh, maybe I have something to do with this. And it was so interesting, because that's the point of resistance that we really don't want to face in ourselves: oh, maybe I have something to do with this. So it's not just saying, my father was mean and stern to me, and how dare he treat me that way? It's, maybe my behavior as a child was such that it evoked in him the need to be stern with me.

[00:09:31]

You know, when we tell our stories, we kind of write our own responsibility for the event out of them. And that is what is so interesting about how we come to explain and tell these stories. But again, I really do want to emphasize one of the things I've learned in working with Elliot on how cognitive dissonance works and why it is such a powerful mechanism in our minds: this is one of the really few central areas of research in cognitive and social psychology where, when you get how it works, you really can take it away with you and do something about it.

[00:10:12]

Because cognitive dissonance, all it really means is that feeling of conflict when one idea you have clashes with another, when they don't sit consistently together. And, you know, the classic example is a smoker who knows that smoking is harmful. So you either have to give up smoking or justify smoking. That's why the book is about self-justification. And when you see that, what you can do is separate the cognitions. One of our favorite examples in the book is an observation of Shimon Peres's.

[00:10:47]

He was furious when his friend Ronald Reagan went to Bitburg, Germany, where members of the Nazi SS were buried in the cemetery. I don't know if you recall this; there was an awful storm of protest all over the world. What was Reagan doing going to Bitburg, where the SS were buried? But it was a German state visit, and he was going to go. Peres was asked what he felt about Reagan making this visit, and Peres said: when a friend makes a mistake, the friend remains a friend, and the mistake remains a mistake.

[00:11:21]

Yeah, that's what we're really trying to convey here: that we can learn to say this of ourselves, too. When I make a mistake, it doesn't mean I am stupid, incompetent, cruel, an idiot. It means I made a mistake. Right. And so I need to face it and fix it.

[00:11:36]

I want to go back for a second to cognitive dissonance itself. As you said, this is a very well established phenomenon and concept in psychology. But what about the physiological and neural basis of it? What do we know, if anything, about the physiological, neurobiological aspects of cognitive dissonance?

[00:11:56]

Yes, everybody loves this question. Elliot says we've had three thousand experiments on cognitive dissonance, and it's still not enough to convince people that we're really doing science. But sure enough, now with modern technology, they have actually been able to observe the brain on dissonance. And basically, as Drew Westen and his colleagues, for example, have found, the brain is not happy when it is confronted with dissonant information. In one very cute study, they had George Bush supporters and John Kerry supporters who were given dissonant information about their preferred candidate.

[00:12:31]

Things the candidate had done that were stupid, wrong, foolish, that they disagreed with. And what they found is that it is really an unpleasant experience in the brain. The way Drew Westen put it, the reasoning areas of the brain shut down, and they're not restored until consonance is restored, until you can think of some beneficial thing or some justification for your favorite candidate's bad behavior. He said the brain twirls the cognitive kaleidoscope until it puts the pieces, the dissonant pieces, back into harmony.

[00:13:09]

It's a lovely expression, and he said that's exactly what they have been able to observe in brain function. Now, of course, to say that the brain is designed for consonance: this is one of those things from evolutionary psychology that we can say has undoubtedly had survival benefit, and has probably been very healthy and adaptive. We all know we can't run around changing our minds twelve times a day, or asking ourselves, for every single thing we do, what's the data for this?

[00:13:37]

What's the evidence for this? We come up with a set of beliefs and passions that guide our lives, convictions that we follow and that we regard as important. And that's an efficient way to operate in our lives. The concern comes when a belief is finally discredited by an overwhelming amount of evidence, and we can't let the belief go.

[00:14:04]

So this is a question that I had while reading your book, and that one of our commenters had also asked on the Rationally Speaking blog, about the adaptiveness of this behavior. It certainly seems, from the way you describe the phenomenon in the book, that we get sort of entrenched: the more disconfirming evidence we get presented with, the more we dig in our heels and the more we seek reassurance that we were actually right all along, because the stakes are now higher.

[00:14:32]

So it seems like that would actually make us really ill equipped for cases in which we're clearly wrong. And so I guess I'm wondering: do you know what evolutionary psychologists think made this behavior persist?

[00:14:49]

Well, of course, keep in mind that when our brains were first evolving and emerging, lives were fundamentally simpler. One of the benefits of having a set of coherent, motivating beliefs is that they bind us to our tribe and community, and that would therefore have had quite a survival benefit. I mean, why should we have felt attached to our particular nation or tribe or religion or ethnicity, and so forth, for so many centuries? You know, we're cooperative little primates.

[00:15:33]

We live in groups, and we need groups in order to survive. So having a set of beliefs, my group is better than your group, my religious belief is better than your religious belief: these are organizing principles of the mind that give us social identities as well as individual identities. And they keep people from feeling like they're rattling around in an alien universe. Unless, of course, you're following The Hitchhiker's Guide and you want that.

[00:16:05]

But evolutionarily speaking, what social psychologists would say is that all of these universal attributes of the mind, our prejudices toward people who are different, our prejudices toward the enemy, our feelings of ethnocentrism, have all, over the centuries, had a certain survival benefit. In today's very complex world, they very often do not. And so that becomes the task. Fortunately, our brains have also evolved to be flexible when they need to be flexible, to change, to understand ourselves, and to make the changes that we would ideally like to see in ourselves.

[00:16:54]

But isn't part of the problem we're dealing with here also, ironically I suppose, to some extent too much flexibility on the part of the brain in another area? I'm talking about memory, for instance. It seems like the fact that memories are constantly reconstructed, you know, we don't have a tape recording in our brains, we just conjure up events or things we said, means that it's very easy, in fact, to reconstruct things in a way that they did not actually happen.

[00:17:24]

It's almost like we have our own little version of Orwell's Big Brother inside our minds that can really edit things and change our own past experiences as well. To what extent does memory play a role in the ability, or the tendency, of the human mind to rationalize?

[00:17:42]

Yes, this is a wonderful question. And of course, memory research has been so fascinating in recent years, because people do want to believe that whatever memories we have are somehow in there, right? It's just a matter of uprooting them as if they were turnips: we can get in there, we can get out that real memory. It's been a very humbling few decades in the study of memory, for just the reason you say: for us to realize that our memories are fallible, that we construct them, that they are more accurately representations of what we think now about our families or our past than of what actually happened.

[00:18:25]

In the context of our book, the way we phrase it is that memory is our live-in, self-justifying historian. And so we revise our memories to remain consonant, that is, in harmony, with the current stories we tell about our own lives, or about our parents, or about the events that happened to us. And it is why it is so shocking, so stunning, when we're presented with evidence that a memory we have that's so clear is absolutely wrong: that it could not have happened, or that it happened to someone else, or that we added some details that weren't there.

[00:19:05]

Now, this is another example of how we could resist this information and say, no, no, no, no, no, my memory is absolutely perfect and completely accurate. Or we could say, boy, isn't that interesting. I think maybe I will stop arguing with Harold about that memory of when we were four years old. You know, maybe I'll have to be a little more open to the possibility that my memory could be flawed. This is really charming information, useful information, rather than something that needs to be threatening to me. Because maybe, by actually getting the evidence, becoming a real biographer of what happened to me in the past, a better historian,

[00:19:48]

maybe I can come up with a better story about my life, or about my parents' lives, or whatever it might be. I guess I would say this, Massimo, about what our brains are designed to do. Cognitive dissonance is hardwired. The conflict we feel under dissonance is pretty much universal, although the specific things that we feel dissonance about are very much culturally determined. The content is culturally determined, but the mechanism is a universal one.

[00:20:25]

OK, so now we can say: so what? The fact that it's in there, that it's hardwired, tells us nothing at all about our ability to face our mistakes, learn from our mistakes, be willing to admit when we were wrong, be willing to change direction, be willing to say, gee, maybe I put an innocent man in prison, maybe I made a serious mistake in this operation. What can I do to make amends for this?

[00:20:57]

Those are all learned things, learned from our culture, learned from our immediate circumstances and from our values. So, you know, it's like saying, OK, we're biologically designed to be carnivores, but people can choose to be vegetarians, right?

[00:21:13]

Yeah. I was going to say that too many people have this idea that if something is hardwired, if it's biological, it therefore means it's inevitable. It's not. The example of changing bad eating habits, for instance, is a clear one. Yes, we have certain tendencies that are hardwired, but that doesn't mean you cannot control those tendencies and redirect them in a different direction.

[00:21:36]

Carol, I'd like to hear some more about the cultural differences in the way cognitive dissonance and rationalization work. Have you noticed different kinds of beliefs that tend to be especially vulnerable to this kind of phenomenon in different cultures?

[00:21:54]

Well, there has been research on this. For example, Japanese social psychologists have studied dissonance in Japan, where they find that what creates the greatest dissonance there is different from what causes it here. That is, in an individualistically oriented culture such as the United States, if an individual's own competence is questioned or threatened, that's a very dissonance-producing experience, and the person will try to deny the evidence suggesting that they were somehow less than perfectly competent. In Japan, and in cultures that are more group- and other-oriented,

[00:22:34]

the idea that you have lost face in front of your friends or the public or your colleagues would be a more dissonance-producing circumstance. So that's what I mean when I say the content of what creates the dissonance is different. People often ask me, well, are men and women different in this? I mean, aren't women more likely to be unable to reduce dissonance? Think of women who, you know, can't make a decision.

[00:23:00]

And so they stay up all night beating themselves over the head with, I made the wrong decision; did I do the right thing? I guess there's so much to do, and so forth, because everybody knows individuals like this. Their problem is they can't reduce dissonance. They've made a decision, or they've bought a car, and they keep having buyer's remorse. But once again, in Aronson's formulation, what creates the greatest dissonance and the greatest need for self-justification is when we do something that most conflicts with something that is centrally important to our sense of ourselves.

[00:23:39]

I am not a cook in my household. My husband is the cook, and a fabulous one. If I make a lousy meal or burn the cookies, I don't experience dissonance, because it doesn't conflict with my self-concept. So for any given individual, people will vary in how easy it is for them to say, boy, I really screwed up that one, so what can I do to fix it? And in some cases, of course, we're all able to do that.

[00:24:10]

But in some cases it becomes extremely difficult, when it's something really central to how we see ourselves.

[00:24:17]

Now, you point out at some point in the book that even the pros are prone to rationalization. I'm talking about scientists, for instance: people who we think, and who perhaps themselves like to think, are particularly trying to avoid, you know, the pitfalls of bad reasoning, and who pride themselves on their alleged objectivity and so on and so forth. But what you do point out is that individual scientists may not be self-correcting; it is the process of science that is self-correcting. Which reminds me of a view put forth a few years ago by a philosopher of science, whose idea is, in fact, that a major difference between science and other kinds of human activity, in terms of self-correction, is precisely that

[00:25:03]

it is an inherently social activity that heavily depends on people cross-checking each other's work: not just the formal peer review process, but the fact that even once scientific research is published, people keep going back and checking and double-checking the results. That's the nature of the enterprise.

[00:25:22]

Now, if that is the case, it seems to me that it suggests that perhaps the best way we have to deal with our own inability to overcome bias is, in fact, to do in our own little world essentially what scientists themselves do, which is talk to other people who disagree with us.

[00:25:39]

Well, you ask a very important question, which is how we can all learn to think a little more skeptically, critically, and scientifically in our own lives. This is the central issue here, because you don't have to be a scientist to think scientifically, and not all scientists do themselves, as you noted. Of course, the famous example is Abraham Lincoln and his team of rivals: actually appointing people to his cabinet whom he knew to disagree with him on important issues, and being able to listen to what they had to say.

[00:26:19]

Yes, that is a big part of it. It's being able to listen to another person's argument without the temptation to jump in and rebut it, but really to hear or to read what it is they are arguing for, what the source of their evidence is, how good their evidence is, how well they argue for it, and so forth. And I think that, as much as anything, it begins with the attitude that disconfirmation is not negative.

[00:27:06]

That's what people think. I mean, we have this problem all the time with students who think that critical thinking means being critical: I'm critical of her dress, I'm critical of that movie. Meaning it's debunking, it's putting down, it's taking away. Instead, what I think we need to focus on is the ways in which being able to think critically and scientifically is creative and constructive. It gives us new and better ways of doing things, new and better solutions to old problems that have persisted. So that rather than hanging on to some old way of doing things, or some old belief that's past its shelf life, if we're able to be open to disconfirming information, we may actually, goodness, learn something that's useful for us in our own lives, whether it's about nutrition, whether it's about how we do our work, whether it's about a belief that we hold. Being willing to give up certain ideas can have a tremendous benefit.

[00:28:07]

I think that's the attitude that scientists and skeptics most need to convey in the world: that our goal here is not just to tear down people's beliefs and make them give up some belief that is desperately and deeply important to them, but rather to move along to more beneficial things. You know, look at, for example, the people who cling to the notion that vaccines cause autism. They put time and money and effort and passion and belief into this idea.

[00:28:39]

But don't you want to know what does cause autism? What interventions would be better? How about taking that passion, that energy, that fundraising effort, and putting it toward a different explanation? So I think that should be one of the central goals when skeptics and scientists talk about why we care so much about this. For me, why I care so much about this is that the very purpose is not to make people feel bad about what they believe.

[00:29:23]

It's so that we ourselves, along with others, can eventually move to better, more helpful, more constructive ideas.

[00:29:33]

This idea of positive skepticism is to some extent encapsulated in the very phrase that starts every page of Rationally Speaking. The motto we use is from David Hume, and it says, "Truth springs from argument amongst friends."

[00:29:49]

Yes, that's beautiful.

[00:29:52]

So, Carol, I wanted to return briefly to the anecdote you were relating earlier, about the man who wrote to you about giving copies of your book to convince his family that they were wrong. You've been talking a little bit about how to recognize and correct for this bias in ourselves. But what about convincing other people that they're suffering from this bias? Because it seems like you've got kind of a Chinese finger trap situation: the more you try to show them they're wrong, the more they're going to dig in their heels, which is sort of what it sounds like happened with this fellow.

[00:30:25]

Do you have any recommendations for how to gently persuade someone? Well, you know, this is a very big issue in the skeptical movement, and it turned out to be one of the more unexpected consequences of our writing this book: how do we persuade people? We touched on this in the last chapter. Because when you really understand how dissonance works, the first thing you realize is that when you argue with another person, the one thing you can't do is put them in a state of dissonance, meaning you don't want to say anything that conveys the following message.

[00:31:10]

How could you be so stupid as to have thought that? What were you thinking when you spent your life savings on that stupid magazine? OK, well, now what you have done, by phrasing it that way or conveying the message in that way, is that the other person is now immediately in a state of dissonance: I'm a smart and capable and kind person, and now you're telling me that spending my life savings on that magazine was a stupid thing to do?

[00:31:37]

No, it wasn't. I did it out of the goodness and kindness of my heart. I was hoping to make a lot of money for my heirs, and for you, for that matter. And it was really done with the best of intentions. So the person will become even more entrenched in the course of action they took. And I want to tell you that I myself, you know, I'm no genius at this myself. Not long ago I got into an argument with a friend that made it crystal clear to me that I was doing this.

[00:32:07]

In effect, I was talking to her, carefully, about research on vaccines and autism, actually. And I suddenly realized, from her increasing anger with me, that she thought I was dissing her. She thought I was not respecting her knowledge, her experience, the nature of her evidence. It was a real eye-opener for me about the nature of our quarrel. And as soon as I saw that, I pulled back and said, let's start over again here.

[00:32:38]

This is really what I'm trying to say. And it was a very important lesson for me. So the thing is that when you get how this works, you don't want to do the thing that so often happens in the skeptical world: we know the truth, we are the smart ones, and you're idiots to believe the stuff you believe. That is not going to be helpful. What you want to do instead, and I think Massimo, being a philosopher, probably has better ideas about this than I do,

[00:33:12]

is to start from where your common and shared premises are, and from an understanding and acknowledgment that the other person's views are valid and important to that person. You want to respect the person's intentions, kindness, and motives, and keep the focus on the evidence separate from that. This is a topic perhaps for another podcast, but it's a very important one, because it's not an easy thing to do. It's really not easy.

[00:33:54]

No, you're right that this is a major problem for the skeptic movement in general, and in fact, I would say, for educators, for teachers in general. I teach a class on critical thinking every year at a university in New York, and the first task I have is to convince my students that, A, this is going to be important for their lives, and B, this isn't about criticizing whatever beliefs they happen to hold on whatever topic we're going to be talking about.

[00:34:21]

And it's not an easy thing. I wanted to go back for a minute to Aristotle, but in a different way from the way we started. You write in the book at some point that we are constantly reinterpreting the things that happen to us through the filter of our core beliefs. In other words, we are constantly building and writing our own stories, trying to make them into a coherent narrative about our lives, right?

[00:34:52]

Yes. And the whole idea of reducing cognitive dissonance is that we want to write a good story about ourselves. And the reason I'm bringing this up is that it reminded me of Aristotle's conception of eudaimonia, the good life, the happy life in a broad sense of happiness. Aristotle said that you cannot tell whether somebody has had a good life until the very end, and in fact, you know, until after it, because it's his friends and family, the people who knew him, who have

[00:35:21]

the ability to write the entire narrative, the entire thing, and judge whether somebody has had a moral, happy, good life. But in fact, that's what we want to do all the time, right? I mean, I find myself rewriting my own experiences from last year, or from the years before, to make sense of them, which is important to me; otherwise I get distressed about the mistakes that I made, and about, you know, people that I interacted with and perhaps could have had better interactions with, and so on and so forth.

[00:35:51]

But the problem, it seems to me, is that since I'm both the author and the reader of this story, it's much too easy to lose objectivity.

[00:36:00]

Well, first of all, the family of the person in question only ever knows a piece of the story of our life. Our lives aren't open books for everybody. All of us have private lives, thoughts, guilts, worries, regrets that others may never know. So the question, I think, is how we tell a life story that finds a balance: being able to understand where we might have gone wrong, what we might have done wrong, and to face those things,

[00:36:46]

to assess the mistakes that we made and not beat ourselves up about them endlessly, as some people will do. In a way, there are two parts here. One is that we need to be able to reduce dissonance over paths we didn't take that we now think we should have, decisions we made that we now think were bad ones, choices we made that we wish we had not. So task one is to reduce dissonance over those wrong decisions so that we can sleep at night.

[00:37:35]

But the other task, the existential task perhaps (and this is really a matter of philosophy; psychology has nothing to say about this, it's a moral question), is being able to come to terms with the mistakes we made so that we can put the burden down: so that we are honest enough to face them, learn from them, and then let them go, and where necessary, to make the life corrections that allow us to find peace. We tell the story in our book of William Broyles, who suffered from post-traumatic stress disorder after being in Vietnam. He just could not ever accept the horrors he saw there,

[00:38:23]

the horrors that he and his men committed there, until he went back to Vietnam, met the people whom he had seen as his enemy, saw them as human beings, and understood that that was in the past, that life goes on, history moves on, and that he could make peace with himself. So those are the challenges in all of our lives: writing our histories to be as truthful as they can be, admitting our warts, flaws, and failures, not ignoring them, but not letting them destroy our lives either.

[00:39:08]

I think we're almost out of time, but I want to jump in with one last question before we move on to the next section. I am curious about how your theory explains moral decision making, and the sort of slippery slope of moral decision making, because it seems like it would predict that when people do something bad, they would try to reduce the dissonance between that action and their self-conception as a good person, and justify it, and, in the process of justifying it, do something even further along those lines to convince themselves that it was really just fine from the beginning.

[00:39:46]

You're a terrific reader of this book. Exactly right. So it's a slippery slope for moral judgments, and for the moral conscience of a nation as well. The way it works, the metaphor we use in our book is a pyramid. You have two people at the top of a pyramid. Should I cheat or not cheat? Should I support this government's use of torture or not? I mean, after all, we're at war; but on the other hand, torture is wrong, and the Geneva Conventions forbid torture.

[00:40:10]

What stand do I want to take on this issue? The minute we make a decision or take a stand, we cheat or we don't cheat, we go along with the government's use of torture or we do not, whichever decision we make, we will justify it to remain consonant with the belief we've just come to or the behavior we've just done. If we've cheated, we will now say cheating isn't such a bad thing: oh, please, everybody cheats, it's just no big deal.

[00:40:35]

If we resist cheating, we will say cheating is a very big deal. It's immoral, it's wrong, it's not a victimless crime; everybody suffers, and it's not the right thing. Over time, we will find ourselves further justifying the repetition of that action or that belief, because, after all, we just justified this small act of cheating. So now, the next time the opportunity comes along, we are not going to change our minds, because we've already spent that mental energy on justifying it.

[00:41:04]

So is that actually, empirically, what we see: that people over time become more and more extreme in their moral views? Absolutely.

[00:41:13]

And in the study I was just describing, what is really interesting is that you can take people with more or less the same views toward cheating: it's not good, but it's not a terrible thing. But after they have cheated or not cheated, they will be very far apart from each other in their views of cheating, and they will come to believe that they always felt that way. I actually just saw a clip of this in a video news story about a cheating scandal at a business school in Florida, where, you know, two thirds of the kids in one professor's class were found to have cheated on an exam.

[00:41:46]

And they interviewed the kids, and the cheaters all said, oh, please, it's no big deal, everybody cheats; and the non-cheaters said, this is just appalling, these are people going into the business world and they're already cheating in school. So that's how it works. And then, once you have taken that step, it becomes easier and easier to stay on that path, because to get off it, you have to look for the disconfirming evidence that your decision was the wrong one.

[00:42:13]

Fascinating. That's the hard thing to do. And creepy. Fascinating, but creepy. Exactly right. OK, we're going to wrap up this section of the podcast and move on to the Rationally Speaking picks.

[00:42:39]

Welcome back. Every episode, we pick a suggestion for our listeners that has tickled our rational fancy. This time we asked our guest, Carol Tavris, for her suggestion. Carol?

[00:42:49]

Well, I have been reading, and reviewing actually, two wonderful books that have a go at the common notion that sex differences are hardwired in the brain, in this era of what we might call neurosexism, where you wave around fancy MRIs and think you're really doing good science. These books have just been a fabulous corrective: Cordelia Fine's Delusions of Gender, and Rebecca Jordan-Young's Brain Storm: The Flaws in the Science of Sex Differences. These are terrific, interesting, and very important books.

[00:43:26]

And I'm also very partial to my co-author Elliot Aronson's fantastic new autobiography and memoir, for anyone interested in the world of social psychology. His book is called Not by Chance Alone.

[00:43:40]

So if I read the first two books, I will finally learn everything...

[00:43:44]

...that there is not to know about women? That you will. Excellent.

[00:43:52]

Can you think of any good examples from either of the first two books of how the science goes wrong?

[00:44:00]

As I said, let me say I've been looking at the efforts to find sex differences in the brain for many, many years. In 1992, I wrote a book called The Mismeasure of Woman, and people were doing the same kinds of things then: let's find these differences in brain lateralization and in the size of the corpus callosum, and this will explain why women are better at empathy and men are better at science, and so forth.

[00:44:31]

These are flawed assumptions. The differences that are said to be explained are stereotypes. You see, in social psychology, we don't just ask, what do people think they're like? What do they say they're like? Do they think they're more empathic? We want to know how people actually behave, and how they behave is very different in different situations. And so the brain differences that researchers are trying to find don't explain the behavior that we think is interesting. It's a great story.

[00:45:01]

It's a wonderful story. And one of the most important lessons for scientists and critical thinkers is to be wary of such claims: just because they're dressed up in high technology does not mean you're doing good science. We need to bring our critical faculties to assessing neuroscience, just as we do to any other claims that people make.

[00:45:29]

Carol, I just have to tell you this; the world is a small place. I just made a connection: I actually read that book, The Mismeasure of Woman, back when it came out in 1993.

[00:45:39]

Really? I just realized now that you were the author. Oh, well, it is a small world. How interesting.

[00:45:47]

Well, when you've been around for a while, you see these things; they do tend to come around again.

[00:45:54]

I've been wondering all this time how you learned all there is not to know about women. It turns out you started back in 1993.

[00:46:03]

Well, the subtitle of that book is Why Women Are Not the Better Sex, the Inferior Sex, or the Opposite Sex, so it was guaranteed to irritate just about everybody. Well, "not the better sex" was maybe not the smartest marketing move on your part.

[00:46:17]

Yes. Well, Carol, it's been such a pleasure having you on the show. We are just about out of time now, so we're going to wrap up.

[00:46:25]

Thank you both very much. I've so enjoyed talking with you. Thank you. I'd just like to remind our listeners that the Rationally Speaking podcast is brought to you by the New York City Skeptics. We encourage you to check out our website, NYCSkeptics.org, for news of upcoming events and lectures and our conference, the Northeast Conference on Science and Skepticism, which will be coming up in the spring. This concludes another episode of the Rationally Speaking podcast.

[00:46:54]

Join us next time for more explorations on the borderlands between reason and nonsense.

[00:47:06]

The Rationally Speaking podcast is presented by New York City Skeptics. For program notes, links, and to get involved in an online conversation about this and other episodes, please visit rationallyspeakingpodcast.org. This podcast is produced by Benny Pollak and recorded in the heart of Greenwich Village, New York. Our theme, Truth, by Todd Rundgren, is used by permission. Thank you for listening.