[00:00:00]

This summer at Boots, we're with you for whatever the season has in store, whether that's hay fever relief so you can take on a hike without letting the pollen stop you, the latest beauty must-haves so you can glam up for a night out and freshen up the next day, or SPF protection so you can kick back on the beach once the sun shows up. With thousands of lower prices, we're with you for saving summer. Boots. With you for life.

[00:00:30]

Coming up next on passion struck.

[00:00:32]

David Attenborough always comes close to the top of polls of the most revered Britons of all time. Maybe Winston Churchill is number one, but Attenborough is not far behind, because of all of his work using evidence to show the importance of climate change and its impact on the environment. So why is it, when you see such strong evidence, might people not respond to it in the way that they should? It is because of these biases. And these biases are reinforced by the fact that sometimes climate change is a matter of identity and politics rather than science. So one great documentary on climate change was An Inconvenient Truth, and that was laden with facts and figures and evidence. But because it was about Al Gore, this made it seem like a Democrat-versus-Republican issue. So even if you're a Republican who's able to understand data and science, and you're generally rational, now your identity feels threatened, because you think, well, climate change is something that people like them believe, and people like us, we should resist.

[00:01:38]

Welcome to Passion Struck. Hi, I'm your host, John R. Miles, and on the show we decipher the secrets, tips, and guidance of the world's most inspiring people and turn their wisdom into practical advice for you and those around you. Our mission is to help you unlock the power of intentionality so that you can become the best version of yourself. If you're new to the show, I offer advice and answer listener questions on Fridays. We have long-form interviews the rest of the week with guests ranging from astronauts to authors, CEOs, creators, innovators, scientists, military leaders, visionaries, and athletes. Now let's go out there and become passion struck. Hello, everyone, and welcome back to episode 463 of Passion Struck, consistently ranked the number one alternative health podcast, and a heartfelt thank you to each and every one of you who return to the show every week, eager to listen, learn, and discover new ways to live better, to be better, and most importantly, to make a meaningful impact in the world. If you're new to the show, thank you so much for being here, and if you simply want to introduce us to a friend or a family member, we so appreciate it when you do that.

[00:02:45]

We have episode starter packs, which are collections of our fans' favorite episodes that we organize in convenient playlists that give any new listener a great way to get acclimated to everything we do here on the show. Either go to passionstruck.com/starterpacks or Spotify to get started. I'm thrilled to share an incredible milestone we've just achieved. Together, we've officially crossed 40 million downloads. This isn't just a number, it's a testament to the movement we're building, the conversations we're sparking, and the change we're inspiring across the globe. In case you missed my interviews from last week, I had enlightening conversations with Dr. Terry Wahls and Brian Evergreen. Dr. Terry Wahls shared her revolutionary approach to health and wellness, detailing how she defied conventional medical wisdom with the Wahls Protocol. This diet-based treatment transformed her life with multiple sclerosis and offers profound insights into how dietary choices can impact chronic diseases and overall health. Brian Evergreen takes us on a journey through his latest work, Autonomous Transformation: Creating a More Human Future in the Era of Artificial Intelligence. He reveals the critical imperative facing today's leaders: the need to pivot from outdated mechanistic approaches to a new era of human-centered social systems empowered by the latest advances in AI.

[00:03:52]

And if you liked those previous episodes or today's, we would so appreciate you giving them a five-star rating and review. They go such a long way in strengthening the Passion Struck community, where we can help more people to create an intentional life. And I know we and our guests love to hear your feedback. Today we have the distinct pleasure of speaking with Professor Alex Edmans, a true luminary in the world of finance and economics, currently enlightening minds at the London Business School. Alex's journey has taken him from the halls of MIT, where he earned his PhD as a Fulbright scholar, to the trading floors of Morgan Stanley and onto the esteemed faculty of Wharton and the World Economic Forum in Davos. Alex isn't just a scholar, he's a sought-after speaker whose TED talks, including What to Trust in a Post-Truth World, have amassed 3 million views, challenging and reshaping our understanding of business's social responsibilities and the power of a pie-growing mindset. Today, we delve into his compelling new book, May Contain Lies: How Stories, Statistics and Studies Exploit Our Biases, and What We Can Do About It. In this critical work, Alex dismantles the minefield of misinformation that bombards our daily lives.

[00:04:53]

From fabricated tales that tug at our heartstrings to flawed studies that skew public policies and personal beliefs. With vivid examples and rigorous analysis, he unveils the biases that lead us astray and arms us with strategies not just to survive, but to thrive in a world awash with misinformation. Join us as we explore the essence of critical thinking, the importance of challenging the sources we trust, and how to empower ourselves through informed skepticism and personal accountability. Get ready to rethink how you perceive the world and your role in it. Thank you for choosing Passion Struck and choosing me to be your host and guide on your journey to creating an intentional life. Now let that journey begin.


[00:06:06]

I am absolutely thrilled and honored today to have Alex Edmans on Passion Struck. Welcome, Alex.

[00:06:13]

Thanks so much, John. It's great to be here.

[00:06:15]

Alex, you and I were introduced by Katie Milkman, our mutual friend, and I'm so interested to understand how you got involved with the Behavior Change for Good Initiative that she and Angela co-lead.

[00:06:26]

So I was a professor at Wharton at the start of my career. I started in 2007 and I worked in finance, and Katie joined the operations management department a couple of years later. Now, often as an academic, you have your head down; you just focus on only what you're doing. But Katie and I wanted to build some camaraderie within the junior faculty. So both her and I and another professor called Cassie Mogilner organized some junior-faculty-wide events in order to encourage interdisciplinary interaction. And so that's how I knew of Katie's work. Her work is behavioral, and some of my work in finance is also behavioral, suggesting how psychological factors, such as nudges, can affect decisions.

[00:07:06]

Yeah, that is fascinating. And you have a bestselling book before the one we're discussing today. But today we're going to be going through your brand new book, which launches the week that this is coming out, which is May Contain Lies. And I was hoping you could give us a brief overview of what prompted you to write May Contain Lies and what you hope readers will take away from it.

[00:07:28]

So my day job is as a professor of finance. And what professors do is produce research, and that research will get disseminated. Now, for many professors, their main goal is for this to be disseminated through academic journals and to be read by other academics. But I really like research that impacts the real world. So how can this affect how investors allocate their money, how executives run companies? But when I started to do this, I realized that how practitioners would respond to research was often driven by whether they liked its findings rather than whether the research was accurate. So if it was research that accorded with their worldview, they would say, this is the world's best paper. And even I wouldn't call my own papers the world's best papers. And if there was research which contradicted their view of the world, they would say, well, that's just academic research that has no bearing on the real world. So I thought, what are we doing as an academic profession? We are supposed to be producing and disseminating information. But if the way in which the information actually has impact is based on playing on people's existing beliefs, then some great research will never have an impact.

[00:08:41]

And some flimsy research will. So this is why I wanted to highlight the biases that cause readers or practitioners to fall for misinformation, and to separate out what is good research from what is flimsy.

[00:08:53]

Yeah, it's so interesting, because my career was spent primarily in management consulting for firms like Booz and Andersen Consulting, and then in Fortune 50 enterprises. And I can't tell you how many projects I've seen where they were using only research slanted in the way that drives the outcome they're hoping to achieve from the project, and they don't share any research that shows something completely different. My whole point of bringing that up is that it's something we do on a regular basis in all areas of our lives. It's so fascinating.

[00:09:31]

Absolutely. And this is not because people intentionally set out to mislead. They're not bad people, but they're human, and humans have their biases. They believe their particular view of the world. And so it's human that we will latch onto something which accords with our worldview, and if something doesn't, we instinctively think that it's wrong. And we might think, well, we're avoiding spreading misinformation by not publishing stuff that is wrong. So we will tune it out and only present stuff which accords with our worldview, because we think that's genuinely right.

[00:10:01]

So interesting. Well, Alex, you start out the book with a bang by discussing your experience testifying in front of the UK House of Commons select committee on business. And you were put into a really difficult situation. How did you end up discovering that outdated and misleading evidence was being presented by another witness?

[00:10:23]

Yes. So let's describe what it means to be summoned to a select committee. So there was an inquiry into corporate governance, which is the way that companies are run. And I got there early because I was nervous; I wanted to swot up on any questions that the committee might ask me. And so I sat in on the earlier session, but my ears pricked up because the witness in that earlier session mentioned some research which sounded noteworthy. It claimed that the lower the gap between CEO pay and worker pay, the better the company performance. Now, this was music to my ears, because much of my work is on sustainability and responsible business, and it's responsible for businesses to pay their employees fairly. And I want to claim that responsible businesses outperform. So this seemed to be a study which could be another weapon in my arsenal. So I wanted to look it up. And so I went to the witness's written submission, which you have to submit before being called to testify, and I saw the reference to the paper. I looked the paper up myself, and the paper claimed completely the opposite result. It said the lower the gap between CEO and worker pay, the worse the performance.

[00:11:34]

I thought, yeah, I'm nervous because I'm about to testify myself, but I'm not so nervous that I'm going to be misreading this paper. It was there, clear as day. It had completely the opposite result of what the witness claimed. And so I dug a little deeper. I figured out what had happened: the witness was quoting a half-finished paper. The finished version had actually come out, and after going through peer review and correcting its mistakes, it came to completely the opposite result. So I notified the clerk to the select committee about this after my own session, and he seemed appalled. He said I should submit some supplementary evidence highlighting that the initial evidence got overturned. I did. The committee published it. Yet their final report on the inquiry referred to the debunked study as if it were gospel. And so this taught me two things. Number one, it taught me that even sources that we think are reliable, like a government-commissioned study, may be incorrect, because they're undertaken by humans and humans have biases. And number two, you can almost always find research to support whatever position you want to support.

[00:12:47]

Even a half-finished study, when the finished version shows the opposite. So we like to bandy around these phrases: research shows that, studies find that, evidence proves that. But you can always find studies to support whatever you want to support. So the fact that a study shows something doesn't necessarily mean that it's true.

[00:13:10]

Yeah. So when you're faced with situations like that, where you've got studies on both sides of an issue, how do you present the right side of the issue? Or in your opinion, do you need to present both sides and then let the recipients draw their own conclusions?

[00:13:27]

So there's a couple of things that you can do. Firstly, you can try to scrutinize the rigor of the study and not just be fazed by the conclusions. And this might mean thinking, well, is this correlation or is this causation? So we might find a link between CEO pay and company performance, but is it CEO pay that drives company performance, or is it the opposite direction? If a company is already performing poorly, then maybe it can't pay the CEO as much. Or maybe there's a third factor that causes both. Maybe there are certain industries in which CEO pay tends to be higher, and those industries also tend to be better performing. So that could be causing the outcome. And so you might think, well, isn't it tricky to try to think about alternative explanations? But it's not tricky. You don't need a PhD or even a degree in statistics; people do this all the time. When I share studies on LinkedIn, or somebody else shares a study on LinkedIn that people don't like the sound of, there's no shortage of comments as to why it might be flimsy. It could be that it's correlation, but there's no causation.

[00:14:37]

But we suddenly switch off our critical thinking faculties when we see a study that we do like. So I think the first thing that we can do is just apply a sanity check ourselves: are there alternative explanations for the same result? But the second thing that we can do is something that you were mentioning, John, which is try not to be swayed by one particular study, because, as I mentioned, there are studies that can show everything. There was a study published in a reputable journal showing that vaccination causes autism. Rather than just stopping at the study that gives the result we want, let's try to look more generally at other studies. Is there indeed a credible other side? Sometimes it could be as simple as googling for the opposite of what we'd want to be true. Maybe I'd want an excuse this evening to drink lots of red wine. And if I googled "red wine is good for your health", I'm sure I'd find some studies claiming that. If instead I googled "why red wine is bad for your health", let me see whether that throws up any high-quality studies.

[00:15:42]

And I want to dive into the health area in even more detail in a second, given we're an alternative health podcast here. But to really make this apparent for the listeners, I think it's important to go to the 2016 Brexit referendum that you cited in the book, because I think this is a great example of how misinformation not only influenced public opinion, but overall decision making on something that had major societal implications. Can you discuss this example?

[00:16:15]

Absolutely. So, in 2016, the UK had a referendum on whether to leave the European Union, and there were two campaigns. One was Remain, and the other was Brexit, to leave. And one of the big pieces of information which may have affected people's vote was the side of a bus. The Brexit campaign had taken out advertisements on the side of buses in the UK, which said: EU membership costs the UK 350 million pounds per week. Let's fund the National Health Service instead. Really powerful message, right? Health matters to everybody. I'm sure this resonates with the listeners of this podcast. And this is why people thought, well, if we left the EU, we would be able to have a better health service. But that number was completely wrong. So, number one, the actual figure was 250 million, but then there's a huge amount of rebates that the UK gets from the EU. So that 350 million pounds per week is actually 120 million pounds, and that is a third of the original size. Now, you might still say 120 million pounds per week is a lot, but compared to other things the government spends its money on, it's not.

[00:17:28]

And there's lots of benefits, such as free trade and free movement of people. Now, people didn't question this, right? In the past, we knew what the reliable sources of information were. We had to go down to the library and get out an encyclopedia, or you'd go to a doctor for medical advice. You'd never think that the side of a bus is a reasonable source of information. But because this was something that played on people's biases, because people wanted it to be true, their confirmation bias was in action. And they paraded this 350-million-pound number even though it had not been vetted and it had come from an unreliable source.

[00:18:09]

Man, to me, it's just so fascinating. And I look at what happened there and today, what's happening around the world with some of the conflicts we're seeing and how much misinformation there is on both sides, depending on what story the other side wants to convey to its worldwide audience about what is going on. It's so interesting how this impacts things on a regional and global basis.

[00:18:36]

And if people want it to be true, they won't question it. So your incentives to spread misinformation are high, but your ability to do so is also high, because people won't question it.

[00:18:46]

Yeah, absolutely, Alex. I think it's important for people to understand biases because they play a very important role in the beginning of the book. And I'm going to go through each one separately. Can you discuss what confirmation bias is in case a listener is unfamiliar, and how that ends up distorting our perception of information?

[00:19:08]

Certainly. So confirmation bias is the idea that we have a view of the world, and so any piece of evidence that confirms that view, we will accept uncritically. And the basis of this is even neurological. When we see something we like, this activates the striatum; that's the part of the brain that releases dopamine. It just feels good to see something we like. And the flip side to this is that if we see something that challenges our worldview, then we immediately want to dismiss it. And again, neurologically, when we see something we dislike, this triggers the amygdala. That is the part of the brain that activates a fight-or-flight response. We respond to information we don't like as if we're being attacked by a tiger. So let's make this concrete. Let's say you're somebody who strongly believes that climate change is a hoax. If you see some new study come around which says climate change is a hoax perpetrated by certain policymakers, you will accept this. You will tweet this from the rooftops without scrutinizing what it actually says. And then if you saw a study which found the opposite, that climate change is man-made, you might not even read it to begin with.

[00:20:26]

Or even if you did read it, you'd now read it with a critical eye, trying to tear it apart, trying to look at any possible alternative for the conclusions which were drawn. So that's how confirmation bias leads us to respond to information in a biased manner. But it goes even further than that. It will also affect what information we search for to begin with. For example, if I tend to be more right-wing, I will only watch Fox News; if I tend to be more left-wing, I might only watch MSNBC. And so we only get certain parts of information to begin with. We are living in an echo chamber.


[00:21:34]

It's so interesting. I just want to talk about climate change for a second. I'm not sure if you know who David Attenborough is. He's from Britain, and he's been studying the effects on the world for 93 years. And Netflix just did this incredible documentary profiling him, talking about the changes that he has seen throughout his lifetime. And it's just amazing to me, because 93 years ago there were about 2.8 billion people on the earth. And when it all started, only 30% of the world was inhabited, populous areas, whereas 70% of the world was still wild. And you look at where we are today, and we now have more than double that number of people, and now we're occupying almost 70% of the world. And you start seeing how this imbalance starts impacting everything. And it's so interesting to me how, even when you can see through his eyes, and through this very well-made documentary's eyes, how much things have changed, people still won't believe that it's changing, and there's such a different song being sung when there's so much empirical science that shows the changes are happening. To me, it's just baffling.

[00:22:58]

And it's also baffling to me. So David Attenborough always comes close to the top of polls of the most revered Britons of all time. Maybe Winston Churchill is number one, but Attenborough is not far behind, because of all of his work using evidence to show the importance of climate change and its impact on the environment. So why is it, when you see such strong evidence, might people not respond to it in the way that they should? It is because of these biases. And these biases are reinforced by the fact that sometimes climate change is a matter of identity and politics rather than science. So one great documentary on climate change was An Inconvenient Truth, and that was laden with facts and figures and evidence. But because it was about Al Gore, this made it seem like a Democrat-versus-Republican issue. So even if you're a Republican who's able to understand data and science, and you're generally rational, now your identity feels threatened, because you think, well, climate change is something that people like them believe, and people like us, we should resist. And similarly, some other messages will come across this way. There was a message ridiculing Ted Cruz for being a climate change denier and saying 97% of scientists agree that climate change is man-made.

[00:24:17]

But because it poked fun at Ted Cruz, this might lead Republican supporters to stand up for Ted Cruz and stand up for the underdog. And this now seems to be something where, even if you're backed into a corner because the scientific evidence is pointing in one direction, because this is not a debate about evidence but about ideology and whose side you are on, people might tune out the evidence, because their identity is something important to them. So the best way to try to ensure greater climate literacy and climate knowledge is to disentangle the evidence from the ideology. Perhaps sometimes resist the temptation to label the other side as uninformed or deniers or going against science. Instead, just present what the science is and what the evidence is, without linking it to a particular political affiliation. Sometimes it might involve Republicans highlighting the importance of climate change. It might also mean highlighting not just the causes, but the solutions. So if the solution to climate change is taxation and regulation, Republicans might be unwilling to accept climate change as real because they don't want the solution.

[00:25:30]

But if instead the solution is innovation and ingenuity, capturing carbon in strong and deep geological formations, or launching some solar reflectors into the atmosphere to reflect the sun's rays, those are things which will be in accordance with Republican values. And that might cut through more than just presenting the facts.

[00:25:51]

And that, Alex, is a great introduction to the other bias that you bring up in the book. Last year I had the honor of interviewing Marianne Lewis and Wendy Smith, who you might know. Marianne is a professor at the University of Cincinnati; Wendy is at the University of Delaware. Their book was a finalist for the Next Big Idea Club's book of the year. And congratulations to you for your book being nominated as a must-read for the May edition of the Next Big Idea Club. But their book covers both/and thinking, and it really goes into the ramifications of what happens in most of Western society, which has been taught either/or thinking, or, as you bring up in the book, black-and-white thinking. And I think what you just expressed about Republicans: if it was just explained in the terms that you used, taxation, et cetera, that's never going to be accepted by them. But if you start thinking a bit that it's both that and these other things, it makes it such a more palatable discussion. So can you perhaps think of another example of this, where people get stuck in this black-and-white thinking, and how both/and thinking would change their complete rationale for how they're thinking about a topic?

[00:27:05]

Absolutely. So let me explain what black-and-white thinking is to begin with. This is the view that something is either always bad or always good. It contrasts with confirmation bias, because confirmation bias is that you have a given viewpoint and you look for stuff that supports that viewpoint. And you might think, well, that applies to many things: gun control, immigration, abortion; those all have pre-existing viewpoints. But there are many things about which we might not have a pre-existing viewpoint. So let's take food intake. Protein: most people's view is that protein is good, it builds muscle. Fat: most people's view is that fat is bad, it makes you fat; that's why it's called fat. But with carbohydrates, that's a bit more neutral. So you might think, well, people don't have pre-existing views. But what black-and-white thinking means is that people think that carbs are either going to be always good or always bad. There's no middle ground. So even though you don't know which side you're on, you know that it can only be on one side. And an example in which black-and-white thinking was exploited was the Atkins diet.

[00:28:11]

So Robert Atkins came up with a diet which argued that we should have as few carbs as possible. It was really extreme: minimize carbs. Not something nuanced, not something in between, where carbs are fine as long as they're 30% to 50% of your daily calories. He just demonized carbs. So why was his book so successful? Why is it still the bestselling weight-loss book in history, even though it's been debunked by many scientists? Because it plays on black-and-white thinking, our view that something has to be either always good or always bad, and also because it's really simple to implement. We only need to look at the nutrition label of food and look at the carbs line to figure out whether to eat something or not, rather than thinking about whether it's complex carbs or simple carbs, or whether there are certain amounts of carbs that we should be eating and certain amounts that we should not. Notice that had Robert Atkins come to completely the opposite conclusion and proposed a carb-only diet, maybe that would have also gone viral. Why? It's easy to implement. It plays on black-and-white thinking by suggesting we should have as much of something as possible.

[00:29:24]

And indeed, sometimes now the protein diets might be playing into that same role of black-and-white thinking: more is always better. We don't allow for the possibility of diminishing returns or nuances.

[00:29:36]

Yeah, I think that's a great example. And I wanted to hit on this term that you use in the book, the post-truth world, where misinformation is prevalent across various aspects of our life, which is the world that we're living in right now. And staying on this health theme, one of the post-truth-world realities to me is the way that Western medicine has been treating us for decades now based on symptoms, which you can think of as the leaves of a tree, instead of looking at us holistically. And to me, it's so fascinating now, as you get into personalized medicine or functional medicine, how we're finding that oftentimes the best way to approach what's going on in your life has nothing to do with pharmacology and writing a prescription. It has everything to do with behavior change and the lifestyle that you're living. And yet people don't want to accept that, because that's not what Western medicine is really pumping out from an education standpoint. In fact, very few doctors get much background in diet and things like that in what they're being trained to do. So I think this is another one where the biases are really harming people, because there's such a different and intentional way that we could go about making ourselves healthier.

[00:31:03]

Yet it's not what's being prescribed by the majority of the doctors that we see.

[00:31:09]

Well, people want to have easy solutions. So behavior change is difficult: we have to unlearn behaviors that have been with us for maybe 40 or 50 years. Whereas the idea that you can take a pill, that's easy. Or if the change in behavior is something that might cause a little bit of pain but can be gamified, then that's something which is relatively easy to do, because there's a clear target to aim for. So the idea of cutting out carbs, that's a bit like gamification: try to have as few carbs as possible. And so even if it might be challenging at the start because you really like rice or bread, it's something where there is a clear-cut target. Contrast that with certain behavior changes, for example, reducing blood pressure by not getting stressed in particular situations and by managing your emotions. That's something much harder to achieve, and much harder to measure. There's not the gamification element to it, compared to cutting out carbs or taking a couple of pills every day.

[00:32:09]

So, Alex, you've given talks and you've participated in forums all over the world. You've got some great TED talks that have over 3 million views that I'd like to point out. But how do you think that different cultural contexts affect the perception and spread of misinformation?

[00:32:26]

I think they certainly do have a huge effect. Why? Because the cultural context will mean that there is a pre-existing view within a particular country or within a certain organization, and therefore a message might resonate, or not resonate, depending on whether it agrees with that particular viewpoint. So one of my main fields is sustainability, or responsible finance and responsible business, as I've alluded to previously. And this is something which is really good for the planet, and it's really good for people. If I go to some companies, I will typically highlight that message. But in contrast, if I go to investment banks or private equity or maybe law firms, that message might just come across as wishful thinking, as going around in a circle and singing Kumbaya rather than having a commercial knowledge of business. So when I speak to those organizations, I will emphasize slightly different things in the message: how there's a commercial imperative, how evidence suggests, and this is high-quality evidence, that companies that are more sustainable will do better in the long term. So this is not just a way of saving the dolphins. This is a way of making your company commercially and financially successful.

[00:33:39]

Notice this is not a case of chopping and changing the message and saying different things to different people which cannot all be true at the same time. That's inconsistent, and you'll quickly be found out for doing this, particularly since one way I communicate is often in written form, through newspaper articles, and people can easily see if you're contradicting yourself. It's instead to look at the same picture from different perspectives, emphasizing different things about a message. Even if I have 30 minutes to give a talk, I can't cover everything. So I might highlight the commercial imperative more for certain audiences and the social and economic imperative for others. But it's really important to understand the context in which you're speaking. Otherwise your message may not cut through, no matter how rigorous it is in terms of its evidence base.

[00:34:26]

No, it's really important. I've traveled myself to over 40 countries around the world, and it's so interesting, depending on what part of the world you're in, how people perceive what you're saying. I have found in Asia in particular, a lot of times I would go out there and give a briefing to my staff there, and they'd be nodding at me, and I perceived it as: they understand what I'm saying and they're going to comply with it. Which I found was not exactly true. What I found they were doing when they were nodding was acknowledging what I was saying. But in that culture, I found that they didn't feel comfortable contradicting me or challenging some of the things I was saying. So they acknowledged what I was saying, but it didn't mean that they were actually going to carry through with what I wanted them to do, which was a lesson I had to learn.

[00:35:24]

Absolutely, yes. It's just to understand this cultural context and how things have meaning. As I mentioned, we can't take evidence in isolation; evidence has a cultural identity to it. We can't take even simple gestures in isolation. We need to understand what they mean, and it'll be different in different contexts. So one big accusation is that we have a WEIRD view of the world. What does WEIRD mean? It stands for Western, educated, industrialized, rich, and democratic, and that's where a lot of studies are conducted. And so our view of the world might be skewed by what WEIRD people will do, how WEIRD people act, and what a nod or a head shake means from a WEIRD person.

[00:36:05]

Yeah, absolutely. I wanted to go through a couple of the areas of your book that I found fascinating. You have this one section that's called choosing your words and data carefully, and you write this: there's a different shortcut. If a statement is a direct quote, you can simply search for it without having to go through the whole report. And you give this example. Thousands of articles claim that the former General Electric CEO Jack Welch declared that shareholder value is the dumbest idea in the world. You do a Google search, and it quickly tells you that what he actually said is, "On the face of it, shareholder value is the dumbest idea in the world. Shareholder value is a result, not a strategy," which has a completely different meaning. Yet, although that reference wasn't technically false, it's a lie, because they selectively pulled out that information. And I guarantee you this happens all the time.

[00:37:04]

Absolutely. And why do I come up with that example? Actually, this ties into the general theme of the whole book. So the book is called May Contain Lies, and you might think, well, the word lie, that sounds pretty inflammatory. It's a pretty provocative title. Did I choose this for clickbait? No. I want the reader to think about lie more broadly. Normally the word lie is reserved for an outright falsehood, which is why calling somebody a liar is a big step. But to me, a lie is simply the opposite of truth. Somebody can lie to you by saying something which is completely true but ignoring the broader context. I think this is really important, because when you mentioned the post-truth world earlier, John, often people think, well, the solution to a post-truth world is to check the facts. But even if the facts are 100% accurate, they could still be misleading without the context. So Jack Welch absolutely did say shareholder value is the dumbest idea in the world. And why did this go viral? Because there are so many people who are concerned about capitalism that this was catnip to them.

[00:38:12]

Here was a big capitalist who ran GE, who now was turning his back on capitalism. But the full quote, as you say, is: on the face of it, shareholder value is the dumbest idea in the world; shareholder value is a result, not a strategy. What he means by this is that when companies are taking decisions every day, they don't sit there in the office and think, well, how do I maximize shareholder value? They might think, how do I grow? How do I maximize market share? How do I inspire my employees? How do I deliver great products to my customers? And if they do that, they end up creating shareholder value. So shareholder value is a good result, and it is fair to say this company has created shareholder value, so we should laud this company and applaud this company, even if shareholder value was not the day-to-day driver of its decisions. It's something that we can look at after the fact to gauge whether the company has done a good job. But all of these nuances are swept under the carpet if we like the message: capitalism must be overturned.

[00:39:17]

One of the world's greatest capitalists is now turning his back on shareholder value.

[00:39:22]

No, I think it's a great example, and I wanted to highlight the other one that you bring up right after it, because one of the guests I've always wanted to have on this show is Matthew Walker. Anyone who wants to learn anything about how to sleep better looks at Matthew as one of the top two or three experts in the world. And it's incredible. You bring up his bestselling 2017 book, Why We Sleep, and it presents this bar chart showing how more sleep is associated with fewer injuries in teenagers. But what he ends up doing is removing the bar showing that 5 hours of sleep leads to fewer injuries than six or seven, because it doesn't fit his message. And I just couldn't believe it, because I read that book and I never even questioned that data at all.

[00:40:12]

And this is not your fault, because you're human, and very few humans would have questioned this data. Why? The book plays into confirmation bias and black-and-white thinking. Why does the book Why We Sleep play into confirmation bias? Because everybody wants an excuse to stay in bed a little bit longer. You think the eager beavers who wake up at 5:00 a.m., or who burn the midnight oil, will get their comeuppance later, and this book suggests they will. The book also gave the black-and-white impression that more sleep is always better: six is better than five, seven is better than six, and so on and so forth. And because it's by somebody with academic credentials (he puts Doctor Matthew Walker on the front cover), you believe him. So he presents this graph showing that the more you sleep, increasing from six to seven to eight to nine hours, the fewer injuries you get if you're an adolescent. But as you mentioned, there is a bar which shows that if you sleep 5 hours, you actually do even better than if you had 6 hours. So this contradicts his story.

[00:41:18]

He chose to cut it out, which is misleading. It's technically not a lie, because what he presented was absolutely true. But if you're a witness in a criminal trial, you swear to tell not just the truth but the whole truth, and here he didn't tell the whole truth, because he cut out a really important bar. And what is really surprising is that he didn't need to do that. Even if he had kept that bar, most of the chart supported his idea that more sleep is better: six, seven, eight, nine hours all led to fewer and fewer injuries. Yes, five went in the other direction, but mostly it was in his favor. But because of black-and-white thinking, people like to see things as always good or always bad; they often can't handle nuances. So he didn't want to present the full picture, which was a nuanced picture.

[00:42:09]

Fascinating to me. And I now want to jump to chapter six of your book, titled data is not evidence, which deals with causation. And the thing I want to talk about here is smoking cessation. I'm a person who was in information technology, and in some of these Fortune 50 companies I was in charge of all the big data. And to me it is so interesting, when you start interpreting data, how you can make it look however you want it to be interpreted. But you talk about this topic called reverse causation. How does that complicate the interpretation of data? And maybe let's use that smoking cessation example as a way to explain it.

[00:42:52]

This again highlights the importance of going beyond just the truth. Even if something is 100% accurate, it can still be misleading, but this is in a quite different context to what we discussed just a few moments ago. Here, what you have is large-scale data which shows that if you are a smoker and you stop smoking, your chance of death in the next couple of years is actually higher rather than lower. This seems crazy. How can it be that stopping smoking leads to more death? And if you are the pro-smoking lobby, you might say, well, this is an argument for why we shouldn't tax or regulate smoking; actually, stopping smoking does not improve health outcomes. But this is the concern that correlation is not causation. Is it that stopping smoking causes you to have a shorter life, or is it that a shorter lifespan causes you to stop smoking? So when is it that smokers finally quit the habit? It's when their doctors tell them, look, you're in a really bad state of health; your likelihood of lung cancer or any of these other really serious illnesses is really high unless you stop. That will finally get the person to stop.

[00:44:09]

But they would probably have died anyway, and perhaps died even faster had they not stopped smoking. So what this suggests is that it's the likelihood of imminent mortality causing you to stop smoking, rather than stopping smoking causing you to die soon. Now, any listener will know that correlation is not causation, so you might think, well, how do people fall for something like this? Again, when our biases are at play, we will accept the explanation being paraded, no questions asked. And this is true also in my field of sustainability. I would love to believe that sustainable companies always perform better and that it's sustainability that drives their success. But sometimes it could be in the opposite direction: sometimes, if a company is already performing well, then it can start to invest in sustainability. It costs a lot of money to pay your workers more and to cut your carbon emissions. That's why the highest-quality studies will try to disentangle correlation from causation. But you often have flimsy studies that don't do that. They just rely on their one explanation and hope that people will accept it. And this is linked to the title of the chapter you mentioned, John: data is not evidence.

[00:45:27]

So what does that mean? Data is just a collection of facts: high-sustainability companies perform better. What is evidence? Let's go back to a criminal trial. Evidence is something that supports one conclusion and doesn't support other conclusions. Evidence in a criminal trial has to point to one suspect and not other suspects. Now, data just showing that more sustainable companies perform better is not evidence, because it could be that sustainability drives performance, or the alternative suspect could be that performance drives sustainability. But just like a prosecutor or a police officer who's homed in on their one preferred suspect and blinded themselves to the possibility of alternatives, this can also be the problem when we look at data with a preconceived viewpoint.

[00:46:20]

Yeah. Alex, I have to say, I loved the titles that you used for each one of these chapters. Just looking at the table of contents made me want to read more, and that is definitely the case with that last chapter and the next two. So in chapter seven, you lay out a fact is not data. And as I was reading this chapter, I got to this point on the narrative fallacy, and it was so intriguing to me, because I've read a lot of Simon Sinek's books and Malcolm Gladwell's books, and I never thought of it this way: that they're using a tried and tested theme that's actually behind hundreds, if not thousands, of smash hits, where they take a single big idea to make themselves as memorable as possible, and then they find as many examples as they can to illustrate the idea. However, they typically only draw on examples that support their hypothesis. So I was hoping you might be able to explain this narrative fallacy in a little bit more detail and maybe go into more about Isaacson, Sinek, and Gladwell.

[00:47:28]

Certainly. Let's use Simon Sinek as an example. He's clearly been extremely successful, both with his book Start with Why and a few others, and his TED talk, How Great Leaders Inspire Action, which is, I believe, the third most viewed TED talk of all time. What he wants to do is look at what drives success. So he takes a highly successful company, Apple, the first company to reach $1 trillion, and claims that Apple was successful because it started with why. It had this idea: everything we do, we believe in challenging the status quo. They were daring, they were innovative, they had a why. Why do we exist? It is exploration, to break boundaries. And this led to success. But as we know, correlation is not causation. Even if Apple did start with a why, how do we know that caused its success? There could be so many other things that caused Apple's success. Maybe Steve Jobs was just unusually talented, maybe he had a great network of contacts, maybe it was all of those other factors. But the narrative fallacy is the idea that you weave an explanation, a cause-and-effect explanation, when none exists, and you dress this up into a nice story.

[00:48:44]

And so there's the idea that Steve Jobs was somebody who was inspired, perhaps because of his upbringing: he grew up in a household with a craftsman father who taught him about the importance of design, and that inspired him about being different, about breaking boundaries, and he set up Apple. Why is that the preferred explanation that Simon Sinek and Walter Isaacson wrote about? Because it is empowering. If what actually led to Apple's success was Steve Jobs's unique network of contacts, that is not a great message, because if you don't have a Rolodex of contacts, you can't be successful. A book tries to give you the secrets to success, and if those secrets are unattainable, you're not going to buy the book, because you don't have that network of contacts and can't put it into action. But if the secret to success is to start with why, that is empowering. Anybody can come up with a why if you just brainstorm hard enough or have big enough blue-sky thinking sessions. And so this gives you the message that anybody can succeed if you were just to read the book and come up with your why. And notice that Sinek doesn't just give the example of Apple.

[00:49:55]

He claims that Wikipedia beat Microsoft-backed Encarta to become the world's fount of knowledge because it started with why, and a why is what led the Wright brothers to beat Samuel Pierpont Langley to launch the first powered flight. But he never considered any of the alternative suspects. He is the police officer who believes that this person is guilty and focuses the explanation only on that person, ignoring everything else. There could be tons of other reasons for the success of those organizations. But if you've narrowed your focus to one particular explanation and you parade it, that is what might sell books, if indeed readers are willing to believe that a why leads to success, because that's empowering.

[00:50:42]

And for the listeners, I just wanted to say I wish we could go through more of this book, because in every single chapter Alex brings fantastic details and stories like this to life, which is why I wanted him to share a couple of the ones that I found most enduring. I didn't want to have this discussion without talking about some of the solutions. You go through thinking smarter as individuals, as companies, and then as a society, and I wanted to start with thinking smarter as individuals. Something that you know very well, coming from the background that you have, is the peer review process. Why is the peer review process so important?

[00:51:26]

So peer review might seem like an arcane scholarly ritual which doesn't matter to the person on the street, but it does matter. It has real implications. It's just like any sort of kitemark. How can we sleep safely at night? If our locks bear a kitemark showing they're secure, then we know it's something that we can trust. How do we know that something is certified organic? Again, there will be a particular kitemark. And it is the same for research. If a paper is published in a top peer-reviewed scientific journal, this shows that leading scientists have reviewed the study and looked at the methodology to make sure that it is accurate. This contrasts with studies produced by many companies, and many of these are companies that I respect, the likes of McKinsey or BlackRock. They will churn out studies and post them on their websites, but these haven't been vetted by anybody, and so they could be quite inaccurate and misleading. As we explained earlier, one study did a complete 180 on its conclusions after it went through peer review and corrected its mistakes: that was the study on pay gaps and company performance.

[00:52:36]

But it's important also to be realistic: peer review is not perfect. Right now we see this controversy with the work of Francesca Gino at Harvard Business School, who's had some papers retracted which were published but are now seen to be potentially fraudulent. But we must not let the perfect be the enemy of the good. Yes, it's not the case that you can treat one study as gospel just because it's been published in a peer-reviewed journal. This is why, as I mentioned earlier, we should not put too much weight on one particular study. But peer review does increase our confidence, so it's certainly something that, as a reader, we should be looking at. Has the study been peer reviewed? If it has, we're going to put more weight on it, not absolute weight, but more weight than if it has not.

[00:53:23]

Thank you for sharing that. And I want to go into the next chapter, creating organizations that think smarter. It's interesting: a couple of months ago, I was watching a documentary on John F. Kennedy, and I was so intrigued, especially by his actions around Cuba and how he learned from them, and what the repercussions could have been if he hadn't. For those who aren't familiar, pretty quickly after he inherited the presidency, he got involved in the Bay of Pigs, which went disastrously wrong, primarily because he listened to groupthink from his advisors. So when a U-2 plane flew over Cuba and discovered that there were actually nuclear missiles there, he didn't want to repeat that same mistake again. Perhaps with that as a lead-in, Alex, I'll let you explain the rest of the story and why this is so important.

[00:54:22]

Certainly. So this chapter is on the importance of cognitive diversity. Diversity is a very common word nowadays, but often people equate cognitive diversity with demographic diversity. What John F. Kennedy did, after missiles were spotted in Cuba, was set up this executive committee of the National Security Council to deal with the situation. Now, because it was the 1960s, unfortunately it had only white men on it. But despite a lack of demographic diversity, there was a big diversity of thinking. Some of the members of the committee were from the military, and obviously their initial response was, let's bomb the missile sites, let's invade Cuba; they wanted to look at military action. But the cognitive diversity was that there were people on the committee who saw the downsides of using hard power as well as the advantages, and one of them was Kennedy himself. Unlike his predecessor, Eisenhower, who was a highly decorated general, he was not from the same military background. And he had other colleagues who also saw military action not necessarily as the first solution, but more as a last resort. So rather than immediately going in and anchoring on one particular view, of blockading or invading, he said, well, let's call a timeout.

[00:55:43]

Let's not anchor on one particular solution. Let's explore all the different alternatives we have to respond. They came up with six, debated them, and whittled them down to just two: one was the invasion, and the other was the blockade. Then he divided the executive committee into two teams. Each was tasked with focusing on and justifying one of the solutions, then writing up a paper to explain its benefits. The teams then shared their papers with each other, critiqued the other's paper, and responded afterwards to make their case even more strongly and to address some of the concerns. And this is something which just doesn't often happen in corporate life, right? We latch onto a particular solution, and anybody who raises a critique is just not a true believer; you're un-American, or you're not bought into the mission of the company. Whereas here, this was something where people actually wanted to debate, and they wanted dissent, because if you were to disagree and highlight flaws in a particular proposal, it was not because you did not agree with the objective. You just thought that there were ways of achieving it in a different manner.

[00:56:54]

Yeah, I think that's a great example. And it makes me think of the way that Jeff Bezos was running Amazon, in that if he had a critical new initiative that he wanted done, he would have it looked at by multiple small teams, none of whom knew the others were working on it. They would then come back to more of an executive team, each presenting the way they would attack the initiative, and the executives would choose the best way to go from all of them. I think that's what ended up giving them so many successful initiatives, because they looked at different sides of it. I'm not sure of your thoughts on that, but to me, that's the way I think about it.

[00:57:34]

Absolutely. And this is what it truly means to have diversity of thought. Yes, the focus on demographic diversity has some advantages, but often people think that if I have an organization with a mix of people, males and females, whites and non-whites, that is enough. It is far from enough, and it reduces cognitive diversity to some very simple metrics, such as demographics like race and gender. Instead, what matters is not so much hiring a mix of people but making sure that they have the space to come up with different ideas. Often, as a leader, you think, I know best; that's why I'm the leader. Here's my idea; you junior people go and execute it. But what Jeff Bezos was suggesting, and what some of the truly great leaders do, is: here might be the objective. I don't know the best way to achieve it. Let me give you time and space to come up with different ways, starting from a blank sheet of paper, and then look at, out of all the different suggestions, which one we think is most effective. We're going to allow a thousand flowers to bloom first and then choose the one that we're going to go with.

[00:58:37]

Yeah, I think it's a great example of a type of leadership that I think more companies need to practice, which is eyes on but hands off. Meaning, in that example you just gave, Jeff Bezos absolutely is looking for a result, but he doesn't know how to get it, so he's hands off, giving people the autonomy to think about it and to come back with an uninfluenced way to go about it. Well, I want to end the interview on this. The last solution is to create societies that think smarter. And I want to ask you, Alex, looking forward, what are your greatest hopes for how society will handle misinformation?

[00:59:19]

I think it's to encourage people to be more discerning and more critical. Now, critical doesn't mean that you always have to be negative; it means that you're exercising your critical thinking faculties. Just as, if somebody were to sell us a used car, we would think, well, do they have incentives to misrepresent the reliability of this car? It should be the same with data. If somebody's presenting data to us, is it because they're trying to sell something, or because it conforms with a particular viewpoint? We might want to apply the same healthy skepticism to something we are chomping at the bit to believe as to something we're wanting to dismiss. And notice this is quite easy to teach. There's not a single equation in the book; as I mentioned, you don't need a degree in statistics to be discerning about misinformation, just to recognize the alternative suspects and alternative explanations. And I give a couple of potential ways to teach this, even in primary school. Just as we often have logic problems in primary school, we can have statistical literacy logic problems. We have whodunit murder mysteries where there are alternative suspects.

[01:00:26]

And you need to think outside the box, because often it's not the most obvious person who is actually the murderer or the culprit. Similarly, when we see sets of data, the most obvious interpretation isn't necessarily the right one. Maybe it's reverse causality or other factors that are driving the correlation.

[01:00:43]

Well, Alex, I so enjoyed having you on today. If a listener wants to learn more about you, your books, the work that you're doing, where's the best place for them to go?

[01:00:52]

Well, I've really enjoyed the conversation too, John. My website is alexedmans.com. I'm also on X and LinkedIn under A. Edmans. The latest book is called May Contain Lies. All of these are ways to find out more about my work.

[01:01:07]

And just for the audience, if you're watching this, here's a copy of the book. Love the cover design. Thank you so much, and congratulations on its release.

[01:01:16]

Thank you so much, John.

[01:01:17]

What an incredible interview that was with Alex Edmans. I wanted to thank Alex, Katie Milkman, and the University of California Press for the honor of him appearing on today's show. Links to all things Alex will be in the show notes at passionstruck.com. Please use our website links if you purchase any of the books from the guests that we feature here on the show. Videos are on YouTube at both John R. Miles and Passion Struck Clips; please go subscribe and join over a quarter million other subscribers. Advertiser deals and discount codes are in one convenient place at passionstruck.com/deals; please consider supporting those who support the show. I'm at John R. Miles on Twitter, TikTok, Instagram, and Facebook, and you can also find me on LinkedIn. And if you want to expand your courage muscles, then consider joining our weekly Passion Struck challenge, which you can do by joining our ever-growing newsletter community of over 25,000 subscribers. You can sign up for the newsletter at passionstruck.com; it's titled Live Intentionally. Do you want to find out where you stand on your journey to becoming passion struck? Then consider taking the Passion Struck quiz, which you can find on passionstruck.com.

[01:02:19]

It consists of 20 questions and will take you about ten minutes, and it's based on the principles of my new book, Passion Struck. You're about to hear a preview of the Passion Struck podcast interview that I did with Angela Foster, a leading voice in health optimization and biohacking. A former attorney turned health and performance coach, Angela has transformed her life and now helps others do the same through her podcast, High Performance Health. In this episode, we dive deep into her insights on achieving peak physical and mental performance, exploring the latest in biohacking, nutrition, and lifestyle strategies.

[01:02:51]

So when I look at health optimization, and we're trying to help someone optimize their health for high performance and longevity, I have a framework to make it easy to remember, called SHIFT. How do I shift into optimal health? That stands for optimizing your sleep, your hormones, gathering the insights, which is a combination of lab and wearable data. The F is how do I fuel my body, and for that there's an acronym, FLOW, because it's not just the food we eat: it's food, light, oxygen, so how we breathe, and water, hydration. And the final piece of SHIFT is the T: how do I train? But I don't just mean physical activity. How do I train my body and mind?

[01:03:29]

Remember that we rise by lifting others, so share this show with those that you love and care about. And if you found today's episode with Alex Edmans useful, then definitely share it with those who could use his inspiration. In the meantime, do your best to apply what you hear on the show so that you can live what you listen. Until next time, go out there and become passion struck.