[00:00:04]

Welcome to Assembly Required with Stacey Abrams from Crooked Media. I'm your host, Stacey Abrams. You might have come across a lot of content related to the 2024 presidential election lately.

[00:00:20]

In this election, we each face a question: what country do we want to live in?

[00:00:27]

When President Joe Biden shared that he would not seek re-election, Vice President Kamala Harris secured a path to become the Democratic Party's nominee. Days before, Senator JD Vance got the Republican nod to become the GOP ticket's number two. Almost immediately, in both instances, timelines were flooded with memes and TikToks. You think you just fell out of a coconut tree? You exist in the context of all in which you live and what came before you. I'm just living that life. What is surprising to no one is that not all the information popping up on our FYPs or in our feeds has been true, or real, or accurate. And while the election tumult has certainly captured our attention, this phenomenon is not new. In 2019, then President Trump shared a video of then House Speaker Nancy Pelosi that was edited to make her appear to stammer during a news conference.

[00:01:31]

Here is the real video of House Speaker Nancy Pelosi.

[00:01:35]

And then he had a press conference in the Rose Garden with all this sort of visuals.

[00:01:41]

Now, the doctored video in which she appears to be impaired.

[00:01:44]

And then he had a press conference in the Rose Garden with all this sort of visuals.

[00:01:52]

That clip received more than two and a half million views on Facebook.

[00:01:57]

Be it deep fakes or fabricated news or false statistics, the goal is the same: to distract us from what was actually said or done. Tragically, misinformation and disinformation have now reached a fever pitch. So what's the difference and why does it matter? In short, misinformation is getting the facts wrong. No, Elton John didn't sing, Hold Me Closer, Tony Danza, nor did an AI chatbot actually graduate from medical school despite telling its user that it had. And both have been repeated multiple times. But disinformation is worse. It's not a mistake. Those who spread these falsehoods and fabrications want to deceive you. It isn't only manipulated content either. It's any false information that's being deliberately shared to influence your view of something. You can find it online from a random Facebook user, or hear it directly from some of the loudest voices in politics, or have it repeated by your friends and family. From how bleach was supposed to cure COVID to reports of mRNA in chicken feed, it's getting harder and harder to tell truth from fiction and good actors from bad ones trying to trick you. It feels like you can't escape the onslaught.

[00:03:15]

If you're watching, reading, or listening to the news, you find yourself questioning whether what's being reported actually happened. We trust each other and ourselves less and less. Social media has put the dangers of misinformation and disinformation on hyper speed. The overload of content across platforms makes it nearly impossible for most people to navigate false information, real images, or questionable facts. What's just a funny meme to one person might actually seem real to another. The sheer amount of content on social, much of it unregulated and unlabeled, and our habit of doomscrolling, also erodes our ability to accurately discern what we're seeing. In this political moment, the disinformation targeting Vice President Harris has focused our attention. But this was an issue long before she became the nominee, and it's not going away anytime soon. More importantly, disinformation is much harder to spot when the target isn't standing for the highest office in the land. But regardless of how dire this feels, it isn't something we're going to shy away from. Because when there's a problem, we highlight the issue, we talk to people who know a lot about it, and we get to work on what we can do to solve it.

[00:04:37]

And I can't think of anyone better to do this with than Esosa Osa. After witnessing pervasive attacks and disinformation about Secretary Hillary Clinton while working on the 2016 campaign's research team, and then again about voter suppression and voter access while Deputy Executive Director of Fair Fight Action, Esosa has become one of our nation's experts on identifying and stopping dis- and misinformation about candidates and the voting process. Last year, she founded a nonprofit, Onyx Impact. It empowers Black communities to fight against harmful information ecosystems. Welcome, Esosa.

[00:05:17]

Thank you so much for having me. Very excited to be here.

[00:05:20]

I'm glad you are. So let's start at the beginning. You did not begin your career focused on the proliferation of disinformation. You know we both enjoy the comics. So what is your origin story?

[00:05:32]

I was raised in the great state of Ohio by Nigerian parents who were constantly involved, overly involved in our education, or as they would call it, their education. And one of the things that my dad, especially, would always say is that education is so important because information opens doors. And I think that's where I got my maybe problematic level of curiosity that I took with me throughout my career. I started college in 2000, and during the financial crisis, of course, I go into finance to find out more about what exactly is happening, what stocks are, what bonds are, as a fixed income portfolio manager, and left that to become a researcher on the Hillary campaign that she talked about. And then we lost the Republic in spectacular fashion. And so I went back home to Ohio and was curious about Trump voters. And so instead of reading a book like most people would, I went to go run a congressional campaign in Appalachia for 16 months. And then I'd had a lot of that, and had a conversation with my mom. And she was like, What do you really want to do? And I was like, There's this Black lady in Georgia who's really into sci-fi.

[00:06:59]

And I think I might want to go work over there. And with my mom's blessing, I drove down to Georgia with no job and no plan, and started knocking on doors. And eventually, when I was still a volunteer, at one event there were all these little Black girls sitting in the front. And when you walked out, the little Black girls were all looking up in awe of what could be. Little Black girls are my why. And from then on, I have tried to work to create better communities for Black folks, in particular little Black girls, to grow up in. And the rest is history.

[00:07:43]

I did not expect all of that. And that could make an amazing movie. In fact, we could call it Onyx Impact, which is the name of your organization. So what do you do now and why should we be both impressed with it and afraid of you?

[00:07:57]

We map and mitigate disinformation targeting Black communities in particular. And before I founded the organization, I was curious about, as always, who was doing this, and was fairly shocked to learn that across all of the many organizations countering disinformation, not a single one was focused on Black communities, which was pretty crazy to me. So I started Onyx Impact to really try to fill this gap and this void. In less than a year, we have mapped the online and offline disinformation landscape that is targeting Black voters. We've provided strategic guidance to dozens of organizations on how to fight disinformation in Black communities. We've launched national programs to fight this and can't see us slowing down for anything in the future. Disinformation is not going away, and it could very well get worse in our society. And there's a reason why every single disinformation research paper starts with, Please don't do this. This is very bad and very hard to stop. It's brought down not just communities, but countries. And so it's something that we need to be able to be much smarter about and fight back on.

[00:09:12]

Well, at the top of the show, I offered a very pithy explanation about the difference between misinformation and disinformation. Can you talk to us a little bit about the differences, the similarities, and particularly why the three letters at the beginning actually matter?

[00:09:28]

Yeah, absolutely. There's this thing that is called the information disorder spectrum. And there are three parts: misinformation, disinformation, and malinformation. Misinformation is simply incorrect information. Disinformation is a lie, pushed with the intent to deceive. And malinformation is true information that is pushed with the intent to harm. And so in Black communities, you often see this as historical or current harms being constantly pushed with a goal of making it so folks don't go out to vote. You also see this with things like hacked materials or doxing, putting out people's personal information. I think in particular, mis- and disinformation are often used interchangeably, even though they have very different definitions. People, in my opinion, use misinformation when they really mean disinformation to shy away from accountability, to make an easier conversation, if you will. Even though we know that these are intentional acts, by calling it misinformation, it's almost a permission structure for folks to keep intentionally pushing that false information.

[00:10:37]

You and I met in 2018 during my campaign for governor of Georgia, but we really connected most deeply in 2019 at the start of Fair Fight. Voter suppression, voter access, is a prime example of how disinformation, and to your point, malinformation, can affect our fundamental rights. Can you talk a bit about your work on the campaign and at Fair Fight? What have they taught you about this field? And as you think about what's to come, what is one real message you want others to leave with?

[00:11:08]

I think because it's inherently fear-based, disinformation can lead to more serious threats and harassment. I really saw that as it affected Black women on the 2018 campaign, and then it continued in an even more aggressive way leading up to the 2021 runoffs. While I was at Fair Fight and preparing for the 2020 election, there was this leaked video of a senior Trump official who was saying something newsworthy about some issue, and it got played in all these news stories. But the actual video was like an hour and a half, and I decided to watch the full thing. And at the very end, they talked about a new strategy to take specific instances of voting issues or voting misconduct and label them as voter fraud, and then use the presidential platform to amplify them as much as possible. I was like, That sounds bad. And then we did additional research on it and found out that, Oh, not only is it bad, it's incredibly effective, not just with one specific demographic or one specific voting bloc. Everyone is going to be more likely to distrust elections if this is something that happens. And elections are now so intertwined with disinformation.

[00:12:24]

That's what led me to learning more about it and eventually founding Onyx Impact. If there's one takeaway for folks to know, it's that disinformation affects everyone, can affect everyone. It's not something that one political party is somehow more susceptible to, or one income class is more susceptible to. At the very end of the day, our brains work in a very similar way, and that is that the more times we hear something, the more likely we are to believe it's true and to believe other people believe it's true as well. And so the amplification effect is really the underlying power of disinformation.

[00:13:05]

Well, to your point, a lot of disinformation gets spread through traditional news channels, the ones that put out that clip. They didn't necessarily put out the end of the clip that you described, but we know traditional news channels and politicians themselves have often been the purveyors of disinformation, and this has existed forever. So when you think about the 2024 campaign and what we are experiencing, without necessarily focusing on politics, which is where a lot of attention is focused, where else are misinformation and disinformation having an impact?

[00:13:43]

That's a really good question. I think that when you look at the broader levels of distrust in our ecosystem overall, a lot of it is due to an inundation of mis- and disinformation. And a lot of it's due to how we consume news now relative to how we consumed news in the past, with a lot more news coming from apps and social media. And what that means is that the correlation between content and credibility is really crumbling and being replaced by following. Followers, in many ways, determine credibility. And in an environment like that, it is very difficult to discern what is true and what is not. And I think that you see this happen from a health perspective in terms of what we saw with COVID-19 and vaccines. I think you see it from an education perspective, with folks wanting to move children out of public school systems for reasons fueled by disinformation. It's really affecting all aspects of our life, including politics.

[00:14:53]

You mentioned social media as one of the places where disinformation spreads like wildfire, and we know that because it's social media, it's particularly challenging to monitor and combat, because it is so ubiquitous and so pervasive. But we also know there are companies that make money from being the platforms. Can you talk a bit about how these companies are incentivized or not incentivized to push disinformation? And is disinformation more popular on their algorithms?

[00:15:26]

Yes, absolutely. Disinformation is more popular on their algorithms because their algorithms are based on engagement. And so what types of content are going to get folks to engage with it as much as possible? We know that for spreaders or purveyors of disinformation, it's only going to work if people can see things over and over and over and over again. So what better tool than an algorithm that's going to do that for you? I think that social media companies, despite all of the harm that we have seen disinformation cause to our elections, cause to our democracy, have really dropped the ball in terms of mitigating this issue, mitigating the hate and harassment, and doing the voter protection work that they could possibly do. I think there was a Free Press report that X and Meta and YouTube had rolled back 17 different policies that helped keep disinformation in check after the 2020 election. They've completely done away with them heading into the 2024 election. We know that X has not only gutted the teams that fought misinformation, but is being led by someone actively pushing mis- and disinformation and really dangerous AI-generated content.

[00:16:40]

Speaking of AI-generated content, when we think about the landscape of online moderation by tech companies that evolved and then retreated, can you talk a bit about what concerns you would help people keep in mind, and what opportunities there are, with regards to AI and this misinformation, disinformation, and malinformation landscape?

[00:17:02]

One of the biggest concerns is manipulated content. That is something that we need to be constantly on the lookout for as consumers of AI, and something that regulators also need to be very concerned with helping prevent. I think there are just as many opportunities when it comes to artificial intelligence, and especially generative AI, as there are problems or scary things. We have the ability to more quickly discover what issues folks care about and make sure that we can link those folks with the types of remedies that they may need or the types of information that they may need. I think that we have the ability to supercharge voter protection in a lot of ways with the advent of new AI tools, making sure that people get answers. We know the top 20 questions people are going to ask every single election cycle. How do we get that information in front of voters in the fastest way, in the language they need, at the exact date and time? I think there are so many different things from an AI perspective that we absolutely have an opportunity to continue to push on and make into reality.

[00:18:18]

Pivoting from that, I live with a teenager. I call her my borrowed teenager, my niece, and she consumes most of her understanding of news, culture, and world events from social media, which puts her in very good company. I feel a great deal of sympathy for her and my other nieces and nephews because they have so many sources of information coming at them all at once. When I was growing up, we had the major news networks, the local newspaper, and if you knew my parents, you had NPR. You're a generation after me. I'm a Gen Xer, you're a millennial. How did your access to information evolve, and how does that impact how you're thinking about this landscape that you're working in?

[00:19:01]

It's a really interesting question. I think that my access to information probably started with encyclopedias. I would have to physically walk somewhere to get the information that I needed for anything, which is crazy. And then very early on, I think we moved to the most incredible invention, Ask Jeeves. Ask Jeeves became the way to cure your curiosity, even though it would take up the phone lines across the house. And then the Internet and Google came along, making everything much, much easier. And look, I'm not ashamed to admit that I get the vast majority of my news from the comment section of The Shade Room, right? That's how a lot of us engage with social media. And so I think that also really makes me understand how different folks will believe different things that may or may not be true, based on how difficult it is to discern what information is truth, and because of how much divisive content we also see online that really lowers trust more broadly. As I often say, right now we're in a situation where the cost of spreading disinformation is approaching zero across the world, with some bad actors actually being paid to spread disinformation, while the cost of accessing truth is becoming more and more expensive, whether it's from subscriptions or the time that it takes to discern what is true and what is not.

[00:20:25]

We're in a new paradigm, and it's going to be unbelievably important to navigate this.

[00:20:31]

So what does this tell you about how we should curate our media, our sources of information today? We can't wait for World Book or Britannica to put out the new encyclopedia, and generative AI is here and expanding exponentially. When someone's listening to this, when they are thinking about what you've laid out, what can you do? What should we be thinking about? And what do you tell people about how to curate the truth that they need to get to, and avoid the zero-cost spread of misinformation and disinformation?

[00:21:07]

The difference between folks that may have a specific worldview, similar or different than your own, and folks that are in a more dangerous and isolated environment is simply how many different types of news someone engages in. It's less that they are in extremely far-right or extremely far-left spaces. It is that they are only in those spaces. One thing that we can do is make sure that our diet includes more than one type of news source, more than one type of access to information. I think that's one big thing that I always push. One of the things that Onyx Impact is really dedicated to is investment in Black media in particular, which I think we really need to pay attention to. I think the Pew study said that over 64% of Black Americans still get their news from Black sources. If we want to make sure that there's more trust in our ecosystem, I think investing in Black media is going to be a huge part of that. Lastly, you can lean on different types of orgs that can help give you best practices and differentiate good news content, things like PEN America, the News Literacy Project, the National Black Cultural Institute, the Digital Democracy Institute of the Americas, which does a lot of great Latino-based disinformation research.

[00:22:26]

So there are definitely tools out there that folks can lean on as well.

[00:22:30]

You just gave us a set of really thoughtful ways to think about the news that's coming from external sources of information. But sometimes the call, and the bad information, is coming from inside the house. It's coming from our trusted circles. It's family members who are sharing disinformation on the family group chat or who are spreading deep fakes through WhatsApp. What can our listeners do if this is their reality? How can they talk to their loved ones about how not to be a spreader? And what do you do if they won't stop or can't stop whatever it is that's being shared?

[00:23:09]

This is a really good question, one that comes up often, and one that I think folks should be better able to answer because it impacts so many of us so deeply. If it's someone that is just a little too aggressive online, encouraging them to check their sources is one thing that actually works fairly well for 72 hours. I like doing that a lot. But also encourage folks to be very wary of very emotional content online, content that elicits a very negative or positive reaction. Those are the things that especially need to be double-checked before you share them. But if it's someone in your family that's quite deep into one of these rabbit holes, I think the first thing you want to do is start from a place of shared values. And starting with that agreement really disarms folks and gets them a bit more open to the conversation. Second, please do not just go through and fact-check your family members, right? That's not going to be the most helpful. Focus on trying to get them to just be more open to other views, to hold their own views a little less tightly. I think that's a really important step.

[00:24:23]

Remember that you are not out to catch them in a falsehood. You're not out to prove that you're smarter. These are folks that we care about and that we want to move to a better place. And it's not going to be one conversation or five conversations or 10 conversations, right? But if you care about them, you're willing to make that investment. So two is focus on making them just more open to other views. And how do we do that? I think three is, look, curiosity is key. I don't care if you're actually curious about what they have to say or not. You need to come to that conversation with curiosity. That's how you get folks to open up. That's how you get folks to be more willing to ask their own questions. Instead of telling folks, ask them where they're finding their information from, and if they think that there's any other potential cause, using I statements. I feel like I'm in therapy. And then also, look, self-deprecating humor can get you a long way in a lot of these conversations. So be human and remember that you're there because you care about these folks. Starting from a place of shared values, focusing on trying to make them more open to other views, and remembering that curiosity is key, would be how I would approach that particular problem.

[00:25:38]

We are living in a disinformation maelstrom right now. Insults are flying, but you and I know that there's a more pernicious, insidious campaign that's also being waged. You wrote about this with regards to me and several other Black women candidates in 2022. And just recently, Onyx Impact shared five steps. You went from three to five. You shared five steps. Oh, no, I think it's important. I think this is such an important issue. It took five. You had to add two. Because you wanted to talk about how we combat disinformation against Black women candidates. I'm not going to make you list all five right now. We're going to make sure that folks have them. But I do want to focus on a couple of them. What is inoculation messaging? And can you share an example of it?

[00:26:25]

Absolutely. Inoculation messaging is messaging that gets voters prepared for disinformation they may hear and makes it so that they're less likely to, one, believe that disinformation, and more likely to dismiss what they hear as an attempt to try to trick or fool them. If you want to do good inoculation messaging, you start off with the good. Everyone knows Kamala Harris is overqualified. And because of this, folks are going to try to discredit her with racist and sexist tropes that are really, at the end of the day, just a political strategy to undermine her. I can go further, but that's what inoculation messaging is. And when they hear that and they hear the attacks that are related to it, they're more likely to be like, Oh, that sounds like a political strategy to me.

[00:27:16]

In addition to the inoculation messaging, which is stating the good and giving people context, you have a recommended formula that you bake into talking points. It sounds like when I was becoming a manager, it's the compliment sandwich. Can you walk us through that and give me an example?

[00:27:33]

It's so interesting how far the compliment sandwich goes. First is to highlight the good, the positive qualities, and not just highlight it, but make it something that everyone believes. The power of the many is really important. People want to be part of what everyone believes. So everyone believes, and they should believe, that Kamala Harris is overqualified, because she is. Two is to underscore the motivations, highlight the motivations of bad actors. And this is something that is always the case: it's more effective to highlight the motivations of bad actors. They may be grifting, they may think they're going to lose. There are all types of motivations. It's more effective than actually debunking it, because it engages a different part of our brain. So number two, underscore the motivations. Calling the attacks a political strategy is a particularly effective one to use because it's true and it places the fault on political elites as opposed to our friends and neighbors. That's why there's a political strategy to discredit and undermine her. And then three is that you want to prime the audience to see these as tricks, attempts to fool them.

[00:28:43]

Nobody wants to be tricked. Nobody wants to be fooled. But the American people will not be fooled or deterred by age-old tropes. After you do those three, it's always pivot time, right? Because you don't want to spend a ton of time talking about those attacks. You really want to get on the offensive. You want to make sure you're pivoting back to offense, either her qualities or my personal favorite, Project 2025.

[00:29:09]

You talk a lot about not just fighting the bad stuff, but adding a lot of good stuff out there, and about how different forms of media can be additive to the conversation. And one of my favorite lines that you use is, Write and write some more. I'm giving you a soapbox right now to lecture all the good people who assume no one will read what they say. What's your 30-second pep talk?

[00:29:32]

We need to get smarter about how the internet works, people. Let's do it. And we need to start speaking algorithm. The Internet currently sees 3 to 15 times more negative content and articles about Black women in particular. We've got to change that ratio. So you writing articles, creating content that is positive, fundamentally changes how the internet is perceiving Black women. And that imbalance is shaping how online communities view Black women. It's not just about who is going to read it and who is not. It's about shaping how the online ecosystem actually views these folks. That's why it's so incredibly important to write and write some more. That is how we can change the conversations that are going on right now.

[00:30:17]

And you came in right under the wire on that. Well done. You and I have had lots of conversations about this, which is why I wanted you to be on the show. And part of the conversation we've had is the value in having a diverse cohort of defenders, both online and offline. But we saw in 2018, 2019, 2022, that it matters what people hear and see, and it matters how many different types of voices they hear it from. Who has done a good job of using this, of engaging this strategy? And what else would you tell us we need to be thinking about so we can build our own diverse cohort of defenders?

[00:31:01]

Can I throw that question back at you? Who do you think has done a really good job of having a diverse group of defenders?

[00:31:07]

I'm doing a lot of work in DEI right now. That's diversity, equity, and inclusion, and we will hear a lot about it in the next few months. What has really struck me is that as aggressive as the misinformation and disinformation has been from those who oppose DEI, we've seen some really interesting defenders. You've seen Mark Cuban and Jamie Dimon, but you've also seen the leader of Hello Alice, Elizabeth Gore, and you've seen the women who lead Fearless Fund. We've got employment organizations saying, We're going to stand in this space. You've seen the AMA refuse to cater and to kowtow to those who would argue that diversity and health equity is a bad idea. That, to me, is one of those ways we have seen diverse defenders come in, online and offline, and stand up for what we know to be true instead of buying into the disinformation on DEI.

[00:32:06]

That's exactly what we need. It can't just be Black women defending Black women. It can't just be women overall defending women. You need to make sure that it is a representation of America that is defending folks as they are either on TV or online.

[00:32:28]

You and I have worked together, as I said, and following your lectures to me, one of the first rules that you taught me is do not repeat the lie. But when we actually agree that a statement is true, that it's not a bad thing, how do we tackle it? And I'm specifically thinking about the term that's out there right now of DEI hire. It's not a bad thing, but it's being used as an epithet. The rule I'm using right now is to just not take the bait, but don't hide from the truth. Am I right? Am I wrong? And assuming I'm right, how can I be more right?

[00:33:07]

Assuming you're right is probably a good place to start. What's happening with DEI now, and what happened with CRT, is a pink elephant problem. And what I mean by that is the age-old, don't think about a pink elephant. What is everybody thinking about? It is incredibly difficult to make folks forget the first thing they hear about something. And by taking these acronyms that folks are not super familiar with, labeling them as bad, and having that be the first thing people hear, they put us almost on the defensive, trying to work back from that. Sometimes we don't do it well. Sometimes we do it incredibly well. What we have to do is find a way to overwhelm what folks believe the current disinformation is. It should be easier for progressives to do because we have such a hold, excuse me for saying, such a hold on the culture, right? We should be able to leverage that and use that to create a campaign that highlights the motivations of these bad actors, why they're doing this, and also brings a diverse cohort of folks to defend what should absolutely be defended. And then if that doesn't work, another strategy is always to reclaim or redefine the word.

[00:34:25]

We're not that great at this, but we've done it before. One example is Dark Brandon. The right used that for a while, and I think we successfully reclaimed that. Another one is Black jobs, right? And that was one we did immediately. That was one that we didn't even let take off. We heard it in the debate, used in a negative sense, and we immediately used humor to defuse it in a way that I thought was really successful. There are some really powerful folks that have an outsized impact on information algorithms online, and getting some of those folks to talk about DEI in a fundamentally better way could be incredibly powerful for reshaping how not just the internet sees it, but how the majority of folks see it. I'm happy that someone that is very familiar with campaigns is trying to fight for DEI, because that is exactly what we need, as opposed to scholars going on TV and talking about PhD-level courses, which may not work as well.

[00:35:26]

Understood. I will say to anyone thinking about this topic, watch this space. One other piece of secret information about you, it's not a secret, it's just not as well known, is that you have a wicked sense of humor. And earlier you mentioned the importance of using self-deprecating humor as a way to bring friends and family in from the disinformation cloud. What's your favorite example of the use of humor to defuse disinformation?

[00:35:53]

I want to be clear that this is only effective if you are funny. I am really partial to Black jobs because it gave us some light in a really dark place. I will be honest, there is an immigration conversation happening in this country that's quite erosive and destructive. And that part of the debate had the potential to move that conversation into a new space, into Black communities. And the swiftness with which Black folks decided to redefine the term was absolutely incredible. Given where we are now, with the potential to have the President of the United States be a Black job, with folks being vetted for Vice President to be a Black job, the current Vice President being a Black job, I think it was an incredibly freaking awesome example of this. But again, Black comedians have saved us time and time again, and maybe we should be calling on them to save democracy. DEI is one example of how racism and other forms of discrimination get validated by demeaning terms of art and demeaning experiences.

[00:37:12]

And what I find to be one of the most persistent bits of misinformation that I chafe at is the reductive term culture war, which is used to dismiss these fundamental assaults on our liberty and our civil rights. How do you think about these deflections, like culture war, or rejecting issues as partisan, rather than really engaging the core of what's happening? Do you see them as effective distractions? And if so, how do we get more people to pay attention to what's actually happening?

[00:37:44]

I think it's an incredibly powerful and dangerous tool when used in the way it's being used. It's either a race to the bottom of who can make the most reductive term, or it's a strategy of engagement: who can make the more engaging message, the more passionate message, a message that folks are willing to take a second to listen to, that can make them feel as if there's more to this conversation. I think that is something that you are uniquely skilled at doing. I would put more Stacey Abramses out there talking, pushing back on this. That's essentially what you need to fight these reductive terms.

[00:38:23]

That is a very high compliment. You and I both love science fiction, and the cloning wars have not yet taken full effect, but we'll keep an eye out.

[00:38:33]

They're coming.

[00:38:34]

They are coming. I know Onyx Impact focuses on how Black communities are targeted by the scourge of misinformation, disinformation, malinformation. You created this organization because you saw a gap, not only in research, but a gap in action. But I also know that you have been a long-time advocate for multiracial, multi-ethnic coalitions. You come from an immigrant family. You exist in multiple spaces. How can other communities learn from your work and what calls to action can they take with them?

[00:39:13]

Anyone that is part of the general media or societal ecosystem of today is going to be impacted by disinformation. It's incredibly important that we have folks ready to fight it, especially as we are waiting on policy and regulation to help us in this battle. And the second thing is that I do think we need to come to a lot of these conversations with more empathy. I think that a lot of folks who go down these rabbit holes are looking for truth, and they are often manipulated into believing things that are not true. It's very hard for folks to move away from that. It's incredibly important to recognize that when we're dealing with this issue. In terms of what folks should come away with, look, I've said it before, I'm going to say it again. Defend Black women. However much you're doing it right now, do it more. Invest in Black media. And check your sources.

[00:40:22]

Each week, we want to leave the audience with takeaways that we like to call our toolkit. So Esosa, here on Assembly Required, we close each episode with ways for listeners to continue exploring the topic of conversation and to encourage them to do their part to help solve the problem. So first, for those of you who are curious to learn more, check out How False News Can Spread from TED-Ed to learn more about another form of misinformation, circular reporting.

[00:40:51]

There's a quote usually attributed to the writer Mark Twain that goes, A lie can travel halfway around the world while the truth is putting on its shoes. Funny thing about that: there's reason to doubt that Mark Twain ever said this at all, thus, ironically, proving the point. And today the quote, whoever said it, is truer than ever before.

[00:41:18]

Esosa, you gave us tons of tools already in the interview. What's the most important thing we can do as individuals to stop the spread of misinformation or dispel myths that we see online?

[00:41:28]

Lead with the truth. Make sure that the first thing that folks hear is accurate information, even if you're trying to refute or debunk. It is incredibly important to have that be an overarching call: to make sure that we are always, regardless of whether we're journalists, posting on social media, anchors, or content creators, leading with the truth forever and always.

[00:41:58]

For listeners who want to go further and support organizations that are working every day to combat disinformation, Esosa Osa, tell us the website for Onyx Impact.

[00:42:08]

Onyximpact.org.

[00:42:10]

There we go. As a reminder, I want to hear from you. If you have questions about navigating news sources, send us an email at assemblyrequired@crookedmedia.com or leave us a voicemail. You and your question might be featured on the pod. Our number is 213-293-9509. That's all from us today at Assembly Required with Stacey Abrams. Talk to you next week. Assembly Required with Stacey Abrams is a Crooked Media production. Our lead show producer is Stephen Roberts, and our associate producer is Paulina Velasco. Kira Palavi Steve is our video producer. Our theme song is by Vasilis Fotopoulos. Thank you to Matt DeGroot, Kyle Seglin, Tyler Buzer, and Samantha Slosberg for production support. Our executive producers are Katie Long, Madeleine Haeringer, and me, Stacey Abrams.