[00:00:00]

From the New York Times, I'm Sabrina Tavernise, and this is The Daily. A disturbing new problem is sweeping American schools. Students are using artificial intelligence to create sexually explicit images of their classmates and then share them, often without the girls even knowing. Today, my colleague, Natasha Singer, on the rise of deepfake nudes and one girl's fight to stop them. It's Friday, June seventh. So, Natasha, for the past year, you've been reporting on artificial intelligence in American schools, and your most recent reporting has uncovered a pretty new and very disturbing trend involving AI. Tell me what you've been finding.

[00:01:13]

It's been a really striking year on my beat because we're seeing a rapidly spreading new form of peer sexual exploitation and harassment in schools, which is called deepfake nudes. What that means is that we're seeing middle and high school boys across the United States using these widely available nudification apps, which allow people to surreptitiously take clothed photos of girls and women and, without their knowledge or consent, use these apps to remove their clothes. Imagine a ninth grader who has posed for a photo in a dress at, say, a high school football game or her high school prom. When these boys use these apps, it looks like she has posed for a nude photo at those events. The thing is that boys have been sexually harassing girls online for decades, including using Photoshop to make fake nude pictures. But the difference here is that generative AI is an exponentially more powerful tool. First, it makes the images look realistic. Second, it enables the mass production of these fake nude images of teen girls.

[00:02:29]

Very, very quickly. Interesting.

[00:02:31]

One expert I interviewed pointed out to me that one boy with his phone, in the course of an afternoon, can victimize 40 girls, and then the images are out there. The possibility of harming many girls instantly and permanently is the concern here.

[00:02:54]

Okay, so this is a very disturbing and very new problem. Where did your reporting on this problem start?

[00:03:02]

So one recent afternoon, Sydney Harper, a producer at The Daily, and I traveled to Westfield, New Jersey.

[00:03:08]

Do you want to tell us a little bit about what we're looking at, what we're seeing?

[00:03:13]

We are driving through downtown Westfield, which is a small, affluent community of a few thousand people. It's this idyllic, tiny suburb outside of Newark. Manicured lawns. Very manicured. Definitely. Landscapers employed. With manicured lawns and lush dogwood trees in full bloom. Here we are at the Manis'. Then we went to see Francesca Mani and her mom, Dorota.

[00:03:44]

Oh, my God. I thought you guys were coming at 4:00.

[00:03:49]

You said 3:30.

[00:03:50]

Did I? Come in. Give me three seconds.

[00:03:53]

Dorota was surprised to see us at first because she'd been expecting us a little later in the day. Hi, Francesca. Hi. It was like a half hour after school was over, and Francesca was in the back, sun-tanning outside with the dog. Hi. I'm Sydney.

[00:04:08]

Nice to meet you.

[00:04:10]

After we said our hellos, we sat in the living room, and Francesca started to tell us more about herself.

[00:04:16]

I'm Francesca Mani. I'm 15 years old, and I'm a sophomore at Westfield High School.

[00:04:22]

Francesca is a typical teen in many ways.

[00:04:24]

I love playing sports. My main sport is fencing. I love hanging out.

[00:04:31]

She plays sports. She is training in fencing. She's really gregarious, and she's really confident. Francesca began to tell us what happened to her at school last October. What was your first inkling that something had happened that had to do with the girls in your class?

[00:04:50]

So I found out that there was a group of kids in my school, and they were taking pictures of girls' faces and putting them on AI-generated pictures. It wasn't rumors, but it was word of mouth. It was like the game of telephone was going around.

[00:05:09]

She found out that boys in her class had created these AI-generated fake nude images of some of the girls in her class.

[00:05:17]

It was in history class, the second period, and I was just sitting down, and then I came back from the bathroom and the girls were talking about it. I was like, Hey, what's going on? And then people were talking about it, and I was like, Oh. We were all super surprised.

[00:05:30]

The girls hadn't seen the images. They'd heard about them from boys who had seen them and who knew who was in them. Francesca said the girls made a Snapchat group to discuss what they should do, and then they decided that some girls needed to go down to the principal's office and notify administrators about these deepfake nude images. And what did you tell the principal?

[00:05:49]

We were like, there's this group of kids taking pictures of girls' faces and putting them on AI-generated bodies. And we were just worried, and we didn't know if it was one of us. So we just went down, even for the sake of the other girls, to tell the principal.

[00:06:05]

So the school isn't even aware these images exist until Francesca and her friends notify administrators.

[00:06:09]

That's right. So then Francesca goes back to class and goes about her school day.

[00:06:15]

The day went on, but a lot of girls were worried. A lot of girls were crying. I didn't really think it was going to be me. So I was doing my own thing. I was going to my classes. There was a lot of counseling or something, but I needed to stay on top of my work.

[00:06:32]

At this point, Francesca isn't thinking that any of this is going to affect her directly. But as time goes on, she begins to hear something troubling.

[00:06:40]

It was during fourth period. I was going to go print something out with one of my friends, and I see one of my other friends, and I was talking to them, and they said that they thought there was one of me.

[00:06:53]

She heard that she might be one of the students who had these deepfake nude photos made of her. So as she's worried about this, the school begins to announce the names of girls in her class over the loudspeaker and ask them to come down to the office.

[00:07:12]

And I hear my name over the intercom telling me that I have to go down to the principal's office. And at that moment, I knew it was one of me because I think all of the girls were getting called down from the intercom. So I went down, and then the principal told me I was one of the victims, and that's how I found out.

[00:07:36]

And how did you feel about that?

[00:07:38]

So basically, in the beginning, I felt shocked because I didn't think it could happen to me. And that's the funny part, because it could happen to anyone. I'm sad, too, because it's surprising because I didn't think my classmates could do this to me.

[00:07:53]

Francesca is upset and also feeling like there's a double standard, like the girls are not being treated by the school with the same sensitivity and privacy as the boys who allegedly made these fake nudes.

[00:08:05]

What I found weird is that they announced the victims over the intercom while the boys that were getting investigated for it were pulled out of class privately. So no one could know who it was. But we girls had to get called over the intercom, which I think is violating our privacy.

[00:08:25]

Who heard that announcement?

[00:08:26]

Everyone. Because class had just finished, and usually people speak over the intercom, so everyone heard it while they were in the hallway.

[00:08:35]

And as she walks out of the principal's office...

[00:08:38]

So I'm walking down the hallway to my math class, and I see this group of boys laughing at this group of girls crying about the situation. And I'm just like, Are you kidding me?

[00:08:50]

The first thing she sees is a group of girls standing together crying and a group of boys laughing at them.

[00:08:56]

So that's when I started boiling. I was super mad. And I just thought it was not fair because it's a serious thing. Why are you laughing at it?

[00:09:04]

And what did it make you want to do when you were mad?

[00:09:07]

Well, I wanted it to stop because girls shouldn't be laughed at because of something that had happened to them, and especially by a guy. I really don't think that's fair.

[00:09:23]

It is a pivotal moment for Francesca. She sees the girls crying, and she talks about how she was just pissed off, and she wanted to do something about it. She says to her mom, How are you going to help me do something about this?

[00:09:41]

I was like, Mom, this is what happened at school. We need to do something because I'm super mad this happened to me and to other girls, and I think we could do something to fix it.

[00:09:54]

What does Francesca's mom do when she hears what happened to her daughter and her friends?

[00:09:58]

Well, the first thing her mom, Dorota, sets out to do is find out what consequences there will be for the boys who did this.

[00:10:06]

One of the vice principals called me and informed me that, yes, there has been a situation where boys created nude images, but the boy has been suspended. So I asked the legitimate question of: for how long?

[00:10:25]

Dorota says she learned from the school that it's primarily one boy who has made and circulated these images and that he will be suspended for just one day.

[00:10:35]

There was no consternation, no We're thinking about it, we're still investigating, we're going to decide. Just that, for now, he has been suspended.

[00:10:42]

Dorota has had several conversations with administrators about what has happened and what they're doing. And she is beginning to feel that they're not taking it seriously and they're not doing enough.

[00:10:52]

She's like, But don't worry, he is gone. I'm sitting there and I'm saying, gone to work? Gilligan's Island? He's going to be back on Wednesday. Again, I'm composed at this point. I said, Well, he's back on Wednesday, isn't he? Do you think as a mother, as a woman, do you think this is the right approach? She says, Well, this is what it is. This is what we have decided. I said, Well, then I want to inform you that I'm not going to let it go.

[00:11:22]

The combination of the fact that, A, the school has announced the names of the girls over the loudspeaker, further humiliating them and violating their privacy, and, B, the boy who instigated this has been suspended for one day, and maybe is taking another day off, and then he will go back to school with the girls he did this to. These things in combination make Dorota feel that the school is not doing enough to protect the girls and is not going to do enough to punish the boys.

[00:11:54]

That's how the whole ordeal started.

[00:11:58]

It completely activates her. One of the first things Dorota does is write a letter to the local online village news website, and she explains what happened to Francesca, and she says something like, Am I the only one who thinks that the school's response is inadequate? She puts her phone number on this letter to the editor of the local Westfield publication.

[00:12:27]

I think more than 300 people contacted me. It was like a hotline. From older women to young mothers, to parents of Westfield High School students, to teens, to counselors, some teachers. We all connected together through that.

[00:12:43]

She hears from the parents of other girls. She hears from the parents of girls who have been subject to other kinds of sexual harassment in schools. She hears from local politicians, and she decides that she's going to hold a town meeting about this in her own living room. Wow.

[00:12:57]

We put out armchairs and dining chairs. It was a full room of beautiful people that came to support us.

[00:13:05]

She has this meeting, and dozens of folks come, including the mayor and other parents and local legislators, and they talk about what needs to be done.

[00:13:16]

Then we need to put code of conduct, AI policies, and bullying policies in place, and there should be just one approach. If you've done the wrong thing, you've done the wrong thing. It's as simple as that.

[00:13:28]

What comes out of that meeting is Dorota and Francesca decide they want to push school districts nationwide to update their policies to better protect girls from deepfakes and prevent this from happening again.

[00:13:45]

How does the school respond to all of this?

[00:13:48]

When I reached out to the Westfield Public School district, they said they had opened an immediate investigation into the incident and consulted with police, but that they couldn't comment on the specifics of the case or talk about any disciplinary actions against the boys for privacy reasons because the students were under 18. When I asked them earlier this week whether they changed any of their policies to address AI abuse, the district declined to comment.

[00:14:17]

So, Natasha, the story of the Manis, it really shows how unprepared schools can be. It seems like the school was completely blindsided by this and really fumbled it. But I guess in the school's defense, this is all really new, right?

[00:14:32]

I think that's an important point. This is brand-new AI technology, and maybe school administrators, principals, superintendents didn't even know that deepfake nudes existed or that there are apps you can use specifically to nudify photos of women. You could say they weren't prepared for this technological abuse. I spoke to parents in a number of districts, and I reached out to schools in a number of states. Mostly, the way that Westfield High handled this is typical. The typical response was, there will be minimal discipline for the guy or guys who did this. Girls, this is just a fact of life. Or like, Girls, yeah, this is really bad, but it will pass. There did not seem to be an understanding in many districts that actually this could be devastating for mental health and also that it could have long-term consequences for these girls.

[00:15:37]

I guess the thing that's bothering me is it seems pretty clear that what these boys are doing is wrong.

[00:15:45]

But is it illegal? I mean, are they breaking the law when they're doing this?

[00:15:50]

You would naturally think that it should be illegal for anyone to make deepfake nudes of 12- and 13-year-old girls. But the answer to this question is a lot more complicated than you might think.

[00:16:19]

We'll be right back.

[00:16:25]

Hi, I'm Clare Toeniskoetter. I'm one of the many names you hear in the list of credits on The Daily every week. A big part of my job as a producer is talking to my colleagues, to New York Times reporters, to get their expertise on the news. But we also want to explore the human side of the news. And so another big part of my job is talking to people about how they're experiencing what's happening in the world. That can mean walking up to people on the street, making cold calls, and spending months making sure we represent all sides of the story. Whether it's about what shapes our political identities or how we're coping with crises, we always feel like there's more to learn from these conversations. We often hear from listeners that these types of stories are what make The Daily special, and we want to keep bringing them to you. We can't do that without subscriber support. If you haven't subscribed to The New York Times, you can do that at nytimes.com/subscribe. And thanks.

[00:17:20]

So, Natasha, you talked about the question of legality as being very complicated. What do you mean by that?

[00:17:26]

Well, what I mean is that there are different laws that could cover this. And so let me break it down. There's a whole section of the law that outlaws the possession and distribution of what's called child sexual abuse material. And that's explicit video or photos or images of underage children engaged in sexual acts. But that doesn't mean that every photo of an unclothed child is automatically illegal. For example, a photo of a naked child in and of itself may not be illegal. Like if you think of a parent taking a photo of their toddler in a bathtub. But if the image shows a child in a way that is sexually suggestive, or shows a child engaging in explicit sexual conduct, that is illegal.

[00:18:14]

That's the red line.

[00:18:15]

The big question, as we're starting to see more and more of these deepfakes, is where do these images fall on the spectrum of material? If an image shows a 17-year-old without clothes on, does that meet the federal definition of illegal material? This came up in my reporting because there is a case in a high school where boys made these deepfake nudes of their female classmates. And according to the police reports, local police heard about it from the parents of the girls who complained, but they didn't hear about it from the school. And so the police detective went to the school and said, You know, these images fall under the child sexual abuse material statute, and you, as a school, are mandatorily required to report such images. And eventually, the school district does report the deepfake nudes. But when I reached out to ask the school district about this, they sent me a note saying that they had reported the deepfake nudes of female students out of an abundance of caution, because the school district's lawyer said, These were fake, they weren't real, and maybe it wasn't necessary to report them.

[00:19:31]

So even within school districts, there is confusion about the legality or illegality of these deepfake nude images.

[00:19:46]

Okay, so some of these deepfakes are actually illegal: sexually explicit AI-generated images of minors. But the status of some images is actually still pretty unclear. What is being done to clear up that gray area, to make it more black and white?

[00:20:04]

A number of families whose daughters have been subject to these deepfake nudes want to see a standard law or policies to protect their daughters and other people's daughters. You see that some families are lobbying for states to pass laws to prohibit these images. The Manis are one family who have been lobbying for New Jersey to pass a law to prohibit these deepfakes. But many states are introducing new bills, and there are different approaches to doing this. Some states are expanding their laws on child sexual abuse material to include these AI-generated nudes and make those illegal. Got it. Then there's another approach where states have laws on revenge porn, which cover people who have taken consensual sexual images with their partners. When they break up, if they're pissed off, it's illegal to post those images online without the other partner's consent. States are taking these revenge porn laws and adding AI-generated nudes. The third approach is to come up with entirely new laws that are specific to these deepfake nudes and to prohibit the possession, production, or distribution of these AI-generated, sexually explicit images of minors, and sometimes of both minors and adults.

[00:21:38]

You're talking here about state laws. Is there anything happening on the federal front?

[00:21:43]

To be clear, federal law on child sexual abuse material applies here. It covers computer-generated and real images. The FBI has recently said that it covers these deep fake AI-generated images as well. There are also efforts to address this more directly. The White House issued an executive order last fall, making it a priority to find ways to stop AI from producing these abusive images. Congress also recently introduced bills to prohibit these AI-generated nude images and to allow victims to sue. But these efforts are not very far along. Really, the momentum for change is happening primarily at the state level right now. It's important for states to do this because these cases are often prosecuted at the state and local level.

[00:22:36]

Okay, so you talked about the third approach, creating entirely new laws. What does that process look like on the ground? What are you seeing pop up around the country?

[00:22:45]

We've seen, over the last two years, lawmakers in two dozen states introducing these bills that are specific to AI-generated deepfake nudes. These bills really vary in content and in the severity of the punishments. Massachusetts lawmakers are hammering out a new bill that would criminalize the sharing of these explicit images. But it takes a different approach to adults and minors. It would be a crime for adults to make these deepfake nudes of minors. But if it's teen boys making the same images, they could first be sentenced to a diversion program, which means that they'd have to learn about what's problematic about these images. They'd have to learn about the responsible use of generative AI. They'd have to learn what the punishments are. So Massachusetts is taking a more graduated approach. By comparison, Louisiana passed a law. And under the new Louisiana law, anybody who knowingly makes or shares or sells these sexually explicit deepfake nudes of minors could face a minimum prison sentence of 5 to 10 years. Oh, wow.

[00:24:02]

Pretty tough.

[00:24:03]

Right. It raises the question of, should a teen boy who makes images of a teen girl face the same punishment as a 50-year-old pedophile who makes these images?

[00:24:14]

Right. Important question, and that is very tricky.

[00:24:17]

It is really tricky. It came up in my reporting because at the end of last year, police officers in the Miami area arrested two middle school boys for allegedly making and sharing these simulated nude images of two female classmates who were 12 and 13. And according to the police documents we got, the boys were charged with third-degree felonies under a 2022 state law which prohibits altering sexual depictions without consent. And so the question is, what happens if they're convicted? Maybe they could be sent to a remediation program like the one that the Massachusetts lawmakers are working on, or maybe they would have to register as sex offenders for something they did when they were 13 years old, and that would follow them for the rest of their lives. There's a really complicated question about, as new laws are being passed, how students who are doing this in middle and high school should be treated as opposed to pedophiles.

[00:25:20]

So, Natasha, at the end of the day, we're left with a pretty complicated picture here and with a very hard question. This technology has resulted in real harm for young girls and their families. But at the same time, addressing that harm raises the question of criminalizing young boys.

[00:25:39]

I think that's the crucial question, Sabrina. It underscores how schools and parents and lawmakers and law enforcement are just in the opening stages of figuring out how to deal with AI abuse. I'd come to believe, based on my reporting over the last year, that it's actually part of a much larger problem: young people's increasingly toxic relationship with technology. Children and teens' relationship with technology changed during the pandemic.

[00:26:14]

We all know that.

[00:26:16]

Many kids and teens began spending more and more time online, and some kids became much more isolated. Schools tell me they believe that this has led to a spike in cyberbullying, and some psychologists blame it on kids' compulsive use of social media. I think the technology issue is also connected to a whole constellation of problems that schools are dealing with post-pandemic, like increased depression among students and increased absenteeism. I think that this is not just a story about AI abuse in schools. It's a story about how technology is fundamentally changing childhood, adolescence, and education. It's a story about how the grownups, whether it's school administrators or state lawmakers, parents or members of Congress, are all racing to catch up.

[00:27:17]

Natasha, thank you.

[00:27:19]

Thank you, Sabrina.

[00:27:30]

We'll be right back. Here's what else you should know today. On Thursday, an Israeli airstrike hit a United Nations school complex in the central Gaza Strip that had become a shelter for thousands of displaced Palestinians and, Israel said, Hamas militants. Gaza health officials said 40 people were killed in the attack, including 14 children and nine women. The Israeli military said that its forces had targeted a group of about 30 militants using three classrooms as a base, including some who had taken part in the October seventh attacks. And Stephen Bannon, a longtime advisor to former President Donald Trump, was told by a federal judge to surrender to authorities by July first to start serving a four-month prison term. Bannon was sentenced in October 2022 for contempt of Congress after he disobeyed a subpoena to give testimony to the House committee that investigated the January sixth, 2021 attack on the Capitol. Bannon also faces a trial later this year on charges of misusing money that he helped raise for a group backing Trump's border wall. A reminder: we'll be sharing a new episode of The Interview tomorrow.

[00:28:55]

Those few days that we shot the pivotal scenes in the movie, I had to call home a lot. I really was a tad unhinged.

[00:29:06]

This week, Lulu Garcia-Navarro talks with comedy legend Julia Louis-Dreyfus about what it was like to go to much darker places in her latest film. Today's episode was produced by Sydney Harper and Shannon Lin. It was edited by Marc Georges, contains original music by Marion Lozano, Alicia Baetou, and Dan Powell, and was engineered by Chris Wood. Our theme music is by Jim Brunberg and Ben Landsverk of Wonderly. That's it for The Daily. I'm Sabrina Tavernise. See you on Monday.