ncmec – Techdirt

The Many Reasons Why NCMEC’s Board Is Failing Its Mission, From A NCMEC Insider

from the blowing-the-whistle dept

Yesterday we posted our latest podcast, with guest Don McGowan, former board member at NCMEC (the National Center on Missing and Exploited Children) and former general counsel or chief legal officer at Bungie and the Pokemon Company (where he would sometimes disagree with our coverage). In the podcast, he goes into great detail about why he left the NCMEC board, and why he felt the board had become rotten, captured by interests not aligned with the underlying mission of NCMEC, and more focused on making it look like they’re protecting kids than actually protecting kids.

Multiple people reached out to me last night after listening to it, noting that McGowan’s whistleblowing here is both explosive and extremely important for more people to know about. NCMEC is an important organization, and the work that it does is fundamental to actually helping to protect children. But its board has apparently been captured by extremists who support political positions and ideologies at odds with that mission.

Therefore, after receiving a few requests for a transcript, I put one together, and have highlighted some of the key points throughout.

The whole thing is incredibly damning and worth either listening to or, now, reading:

Mike Masnick:
Hello and welcome to the Techdirt Podcast. I’m Mike Masnick. A few months ago on the podcast, we had Shelby Grossman and Riana Pfefferkorn from Stanford talking about their really incredible and detailed report on the CyberTipline, its opportunities and its challenges. As we noted in talking about that, it really highlighted both some of how important the CyberTipline is, but also how there were a bunch of challenges not necessarily because of the CyberTipline itself, or NCMEC, or anyone in the process, but just the basic realities of how the CyberTipline works, how the Fourth Amendment works and laws around that. The CyberTipline, of course, is run by NCMEC, the National Center for Missing and Exploited Children, and they do some really great work with the CyberTipline being one example of a few.

But over the years, I’ve had a few moments where I’ve grown somewhat frustrated by some aspects of NCMEC, including and maybe especially around its advocacy, in particular on some bills that I think were really problematic and actually, I think, put people in danger. For example, NCMEC advocated vigorously on behalf of FOSTA, which was a very problematic bill that became law and which I think has been a complete disaster since then, putting people’s lives at risk.

There are reports suggesting that many people have died because of this law. As far as I can tell, NCMEC has never commented on what a failure FOSTA has been and how it almost certainly did real harm to some of the people that NCMEC claims to want to protect. Similarly, NCMEC has advocated on behalf of KOSA, the Kids Online Safety Act that we’ve discussed many times and how it put many kids at risk, especially LGBTQ kids, by the nature of the way that KOSA is written. I’ve long wished that NCMEC would just focus on the actual good work that it does in the world rather than pushing for dodgy legislation.

So it caught my attention recently when Don McGowan wrote a thread on Blue Sky about quitting the NCMEC board. McGowan is a well-known lawyer who was most recently the general counsel at the video game company Bungie, and before that at the Pokemon company, and has also worked at Microsoft over the years. In his thread, he wrote about leaving NCMEC’s board for a variety of reasons regarding both the advocacy that the organization does and also some of the advocacy that it refuses to do, such as its refusal to come out against Project 2025 and its plan for opening up child labor laws to enable more kids to take dangerous jobs.

He also noted that in all the media interviews he’s done since leaving Bungie, few have asked him about this. So that struck me as something of a challenge to have him come on and talk about exactly this. So Don, welcome to the podcast.

Don McGowan
Thanks, Mike.

Mike Masnick
So I wanted to start out with the baseline of making it clear that I think both of us agree that NCMEC does some really good important work that does in fact save lives. So this is not a trashing of…

Don McGowan
That’s incredibly correct. I do my best when I go off about NCMEC to try and draw a bifurcation between the organization and its staff, and the board. My off-going is against its board, which I think has been entirely captured by MAGA positions, and which uses the organization to make sure that no criticism gets drawn to those positions where it can, and to avoid taking action. It’s not against the CyberTipline, or any of the Code Adam work that the organization does, or any of the other great stuff that it does to help actual kids at actual risk. And if this was a video podcast, you folks listening to it would see I am drinking from my NCMEC mug, or my NCMEC tumbler, that was given to me by NCMEC for my time as a board member.

Mike Masnick
And how long were you on the board for?

Don McGowan
I was on the board for, well, I had a little hiatus in the middle because my first few years were on the board as a rep of Pokemon. And then after that, when I stopped being at Pokemon, I stopped being on the board for a few months. And then I went back on as just a regular civilian board member for about three more years. So my total years there were seven. I started my association with NCMEC during Pokémon Go.

Mike Masnick
That makes sense. I was looking at it, and it’s a fairly large board. And so how is the board constructed?

Don McGowan
In the charity world, there are two types of boards: working boards and fundraising boards. The NCMEC board is more of a fundraising board than a working board. As for how one gets on it, well, I’m not on actual over-the-air radio, so I can use the technical term: it’s got a lot of people that are the usual DC-area cop fuckers. And a lot of people who want to be law enforcement adjacent. And some people who are there because their organizations have a relationship with NCMEC, like I was when I was at Pokemon. Although somewhat amusingly, Pokemon didn’t want me to talk about that association publicly, which will be a story in my upcoming book.

Mike Masnick
Wow, okay, okay.

Don McGowan
Yeah, I’m writing a book about my Pokemon years.

Mike Masnick
Interesting, very interesting. That will be something to look forward to. So let’s talk about the sort of advocacy that NCMEC has done. And I think, in my experience, as I mentioned in the intro, it really came to my attention when NCMEC came out very strongly in favor of FOSTA…

Don McGowan
I want to speak to that for a second, I’m sorry to interrupt you. You mentioned FOSTA at the jump. I was involved in the NCMEC work on FOSTA. I barely remember any of it because it was long ago, but I was involved. And I say this to say, feel free to take shots at it because I wouldn’t want you to get to the end and be like, ‘oh shit, I didn’t know, and I took shots at him right to his face.’ Nope, do it.

Mike Masnick
Okay. Okay. Excellent.

Don McGowan
I will spend my life expiating the sins of what I did in my past, and that’s a big one.

Mike Masnick
Okay, so it struck me as surprising, right? I mean, I had been aware of NCMEC. And in fact, a long time ago, back in the early 2000s, I’d had a conversation with a board member at NCMEC, someone who was sort of explaining how NCMEC was functioning. And that person had indicated to me some dysfunction, but I hadn’t seen them really engage as much on the policy side outside of things directly related to NCMEC. Like I understood advocacy around things related to the CyberTipline. And, there was, as we mentioned, this report that Stanford did earlier this year, which recommended some legislative changes to help the CyberTipline, one of which was actually voted on and signed by the President a few weeks after that report came out.

That kind of advocacy, I totally get and make sense. It’s like, ‘how do we make the CyberTipline more effective, get around some of these problems that were discussed…’

Don McGowan
And that was a very good report. As somebody who knows how the sausage gets made, I read that report and I was like, damn, these folks, they did well.

Mike Masnick
Yeah. They put in a lot of work. I know they spent time at NCMEC for a few days and were watching over the shoulders of people working on the CyberTipline, understanding all of that, talking to people on all sides, all the different stakeholders. So that was great. And that kind of advocacy I get.

What surprised me specifically when the FOSTA situation came out was having, I think at the time it was NCMEC’s general counsel, go and testify before Congress that FOSTA was like this perfect solution and necessary and really, really helpful. And I felt personally that kind of misrepresented the law.

And I was kind of wondering why NCMEC, given the position it’s in, which is that it is a private nonprofit, but it is the only organization that is authorized by law to process child sexual abuse material as part of the CyberTipline, which leads to some people and some courts occasionally giving it quasi-governmental status. But it’s in this sort of unique position. I thought it was odd that it would then go out and publicly advocate for a law that seemed slightly tangential to its overall mission and what it was working on. And then that it was taking such an extreme position on it that went against what a whole bunch of other civil society folks were saying in raising their concerns about this law. So I know that you said that was a long time ago and you might not remember the specifics…

Don McGowan
I’ll go back in. I’m hitting the memory banks as you were talking, and I’ve got some details.

Mike Masnick
So I’m curious if there was anything that you were aware of at the time that sort of led NCMEC to decide that they were going to go public and advocate for a bill like FOSTA?

Don McGowan
I’ll come at this a little bit obliquely. So, NCMEC is, as an organization, driven by its board. And aspects of NCMEC’s board are somewhat difficult to unpack unless you know the personalities of the humans sitting in the room and or if you’ve been in the room. And that’s always a shitty thing to say because you don’t get to be like, ‘you don’t understand because you’re not there.’ But there’s a little bit of it, except I was there, so I can tell you. How this stuff came about is: go back to what FOSTA was supposed to be and pretend you don’t understand what it turned into.

Mike Masnick
Okay.

Don McGowan
Okay, now remember, at the time it was a bill to cut down on human trafficking for the sex trade. And if it was that, I mean, one, protecting children is never a vice in American politics, and two, if it was that, that would have been a great thing to support.

Mike Masnick
Yes.

Don McGowan
And so you had people going, speaking up in support of it, who were speaking from the perspective of what they thought it would be. Now you and a lot of the civil society groups that spoke to it understood the actual mechanics of the law and what it would… You had a little bit of seeing into the future that you could do. You’re a little Nostradamus sometimes, Mike. A lot of us try to be, some of us succeed, a lot of us don’t.

I remember, you know, that was one of the things with this bill: especially at that time, and to a certain degree even today, NCMEC has no technical expertise in the building. They have a relationship with Thorn, which is Ashton Kutcher’s child protection charity. And Thorn does a lot of their technical work and carries the technical load in that space in a way that NCMEC’s just not set up to. And I chaired their tech committee for a few years, right? And I actually co-chaired it with a guy who was a marketing manager at Charter. And he considers himself the tech brains of NCMEC, and he’s a marketing manager for an ISP.

So there was a guy in there, a guy on the board who ended up no longer being on the board, who was advocating for this geolocation app where, you know, you’re walking down the street and it’ll ping your phone and say, a child was abducted here. And he thought this was such a fantastic idea, because it’d be great for awareness. I’m like, why would anybody put this on their phone? This guy styled himself as the technical expert, right? So think about that. A guy who thinks that app is the greatest idea ever and should be the technical focus of the organization is out there trying to set the guidelines.

This was a guy who we… there were a few of us who actually had a bingo game during board meetings around at what point Lou was going to bring up porn. And then we would work the word bingo into the next thing we said, and that would be how you would win bingo. If Lou mentioned porn at a time that you were ready to talk, you’d work it in and you’d win. If you’d picked that slot in the Squares game, you had first right to claim it for your victory, right? So we’re dealing with that level of predictability: you could write a script from this guy’s focus on this issue, which is fairly tangential to NCMEC’s mission. And so you had people setting that as a priority.

And so obviously, FOSTA was red meat to them.

Mike Masnick
Right. I mean, this is the problem with so many bills, right? You position them in one way. And if you don’t understand the mechanisms of how they’re actually gonna work, the bill sounds good. And even people today, the same with KOSA, right? You look at it and on its face, it sounds good. People want kids to be safe online. People want to stop sex trafficking. So these bills sound good. I guess I had assumed, apparently incorrectly, that NCMEC would have more sophistication than that regarding sort of the nature of these bills.

Don McGowan
You’d hope. There were traditionally a couple of board members from Facebook. And they were fairly displeased when NCMEC took that public position because that sort of happened without a lot of us knowing what was going on.

Mike Masnick
Interesting. The other thing that I had seen and I had written about this at the time and, maybe it was a little bit conspiracy theoryish on my part. But I did notice that the person who was chair of the board at that time was a lobbyist who happened to be lobbying for all of the major motion picture studios in Hollywood.

Don McGowan
Because that was Manus’s year, right?

Mike Masnick
Yeah, it was Manus. And that was coming out of what had been revealed… I mean, all of these things connect in such weird ways, but it had been revealed through the Sony Pictures hack years earlier that there had been this Project Goliath plan by some of the major motion picture studios to focus on sex trafficking as a way to pass laws that would undermine Section 230 and thereby harm Google. And so there’s this, you know, corkboard with red strings on it, where you could connect this Hollywood lobbyist at NCMEC to the bill that Hollywood had been talking about a few years earlier as its plan to get back at Google. I don’t know if you…

Don McGowan
Now, I’m going to interrupt, Mike, because speaking of red meat, Mike and I have had some discussions over time. I have a slightly different attitude towards Section 230 than Mike does. I’m not going into that because it’s orthogonal to today’s conversation. But I can tell you, if there was an Always Sunny in Philadelphia murder stringboard going on anywhere, I never saw it. I don’t think that was Manus doing client work on the board. Manus was always very much a two-solitudes guy, right? And the streams never crossed.

Mike Masnick
So let’s get to this thread that you wrote on Bluesky. You posted about Project 2025, which…

Don McGowan
Well, yeah, I didn’t specifically mean to be talking about Project 2025. It was more the thing that underlies that section of Project 2025, which is the let’s-let-kids-do-labor stuff.

Mike Masnick
Right. So on the off chance that listeners aren’t familiar with Project 2025, very quickly, it’s the Heritage Foundation’s plan for a new Trump administration. It’s basically a whole bunch of ex-Trump admin people, and they have this whole plan to like, these are going to be the policies, these are going to be the people that we’re going to put in place. Many, many of the policies are horrible in all sorts of ways. But specifically, the part that you called out, which is in there and is somewhat shocking, is this idea of changing child labor laws to allow for more kids’ access to dangerous jobs.

Don McGowan
It is to facilitate kids doing work that we thought we were done with in the 1800s.

Mike Masnick
Yes. And specifically, the way it’s framed is really kind of incredible because it says “some young adults show an interest in inherently dangerous jobs,

Don McGowan
The children yearn for the mines…

Mike Masnick
… but current rules forbid many young people, even if their family is running the business.” So, it’s basically: can you exploit your kids in the mines, doing such dangerous jobs? And so they want to say, with parental consent and proper training, young adults should be allowed to work in more dangerous occupations. So you called this out in particular. Do you want to describe what your view was on…

Don McGowan
Sure, so I’ll speak to that. I come at it from a slightly unusual perspective, which I’m not going to go deeply into except to say that when I was a child, my old man was a miner. He ran mining companies. And so I somewhat, legendarily to my friends at least, was airlifted into northern Canada and left by the side of a lake for two months to help my old man find places to mine.

Mike Masnick
My goodness. Wow. That is quite a background story.

Don McGowan
That’s exactly it. So I got a background story of living off the land in Northern Canada as a 12 year old. So speaking of dangerous jobs, I feel like I got a perspective. But so I remember, because obviously this Project 2025 thing didn’t come out of nowhere. And I think we’re all aware that there’s been some amount of permissiveness coming into labor laws, especially in what I always euphemistically refer to (because it bugs the crap out of them) as Central America, otherwise known as the middle of the country, and not what we usually refer to as Central America. But so in the middle part of the country, there’s been some relaxation of labor laws. I noticed this while I was still on the NCMEC board and put forward… We had this board platform for board discussions. It’s called Boardable. Anybody who’s worked on a board may know it. And it’s basically a discussion board back and forth. And so I found one of these bills and I was like, hey, shouldn’t we be taking a stand on this?

And somebody came back and said, ‘why?’ And I said, well, it’s right there in the fucking name. National Center for Missing and Exploited Children, to which a different board member, the aforementioned comms manager at Charter Communications, came back and was like, ‘well, there’s no sexual exploitation in this bill.’ I was like, ‘Oh, I didn’t really know that the org’s name was the National Center for Sexually Missing and Sexually Exploited Children. Do we only help families after hurricanes when the hurricane raped the child?’

Mike Masnick
Oh gosh.

Don McGowan
And that got another board member to say, ‘you know, listen, like you’re going off on one of your crusades again,’ which I’ll come back to in a second. ‘But, you need to understand, I want my kids to be able to get a paper route. So it’ll teach kids responsibility.’ And I was like, ‘One, no, you don’t. You’re a corporate executive. And two, we all know this isn’t about paper routes. This is about teenagers working in meatpacking plants. And they’re gonna lose their fingers, like they’re gonna lose their arms, because it’s gonna get cut off in the meat cutting machines. And we should be taking a stand on this. I’m sorry that many of you support a political party that thinks it’s politically expedient to do this. But we should, this is exactly the kind of thing we should care about.’

If we have a policy and advocacy committee, and we did at the time, I don’t know if they still do, but if we have a policy and advocacy committee, this is exactly the type of thing on which the nation’s legislators would look to us for guidance. We should provide it. To which the answer was me getting a call from the chair of the board saying, ‘Hey dude, step off. Be a little more collegial if you would please.’ I was like, ‘all right, fine.’

So at that point, that was when I reached the conclusion. NCMEC has three board meetings a year, and we were coming up on the April board meeting of last year. And I sort of put in my head like, okay, let’s see how this board meeting goes. It’ll probably be my last one at this point.

I mentioned a few minutes ago how people made a reference to me going on one of my crusades. I had a separate issue that I was fairly vocal on, which is, as I would describe it: I think it’s terrible that we have a political party in this country that has decided it’s politically expedient to set aside a group of children, namely trans kids, for state-sponsored political persecution. We should care about this and we should be speaking out about it. I’m sorry some of you don’t like this. I’m sorry trans kids make you feel icky. But this is the kind of thing on which we should be providing moral leadership to the legislators of the country. And we should be saying wrong is wrong.

You know, there was a fair amount of disagreement at that last board meeting, that April board meeting. NCMEC got a grant from the state of Texas, and the grant was subject to return if the money was unused or misused. And I was like, we got to find out what “misused” means. If the state of Texas has decided there is an entire population of children that should not receive any support… and if we use this money and some of it goes to their benefit, they may want their money back. And one of the board members looks over and he goes, ‘Thunk! You know what that is? That’s the sound of you beating a dead horse.’

Mike Masnick
Wow.

Don McGowan
I was like, OK, you know what? In my mind, I said, I just quit. Didn’t speak for the rest of the meeting. Let the meeting end. Said goodbye to everybody. Walked out. Never went back. Flew back to Seattle. Got home, walked in, wrote up a note of resignation to the general counsel of NCMEC. Went to that Boardable software, posted a note saying, ‘This was my last board meeting. There’s going to be a lot of you that are going to be happy to see the back of me. What you may not realize is I’m just as happy to see the back of you.’ Send. Send resignation. Peace out.

Mike Masnick
Wow.

Don McGowan
I mean, NCMEC has data in its stat banks that say some of the kids most at risk in the world are trans kids. And they ignore that data.

One other thing, a thing that underlay my feelings about the child labor issue is, I mean, who are the kids that are gonna get put in the meatpacking plants? Two categories of them. One is kids of undocumented people who’ve been brought in and have to pay off the debt to the coyote that brought them across the border. And the second one, and I hate to say it because we said at the jump, NCMEC does a lot of great work. There’s another group of people in America who does amazing things, and that’s foster parents. But there are some really shitty foster parents out there who get the kids from the foster agencies so they can use them for sex trafficking and so they can use them for labor. Give those kids what I’ll euphemistically call access to a meatpacking plant, and now they can be put to work in what we would traditionally understand to be slavery.

And that’s exactly what I expect to see happen, is those bad foster parents who are there to use these kids as a way to make money for themselves and wreck those kids’ lives. We need the state to protect them, because those kids literally have nobody else. And instead, those state legislatures have decided to make it easier for those kids to be persecuted.

You can probably even hear it in my voice, but if you can’t, I’ll say it. And that’s the kind of thing that makes me just outraged to my core that a group of people who receive 90% of their budget from the United States federal government can’t bring themselves to say slavery is bad.

Mike Masnick
Yeah. That’s astounding. We talked about the states in the center of America. Arkansas in particular, I had written something about this because it was kind of incredible: at the very same time that they were pushing a law to allow kids to work in meatpacking plants, right after there had been some scandals regarding kids getting hurt in meatpacking plants. So the governor there was pushing this law to allow kids to work in meatpacking plants. At that very same time, she was also pushing a law for kids’ internet safety. And so I couldn’t believe that they couldn’t put two and two together. They were saying we have to protect the children, we have to put in place all these laws to protect them from Facebook, but then turn around and send them into meatpacking plants to work. And the disconnect….

Don McGowan
Yeah. Protecting children takes on a very specific meaning in these circles.

It’s protect the children from the things we don’t want the children to see and have access to. Right? That was an, and I don’t even like using the word, insight. That was a thing I discovered spending time among them. I now can process how they think. And it gives me, you know how I mentioned earlier that you’re kind of a Nostradamus? It gives me the ability to see around some corners.

Mike Masnick
I sort of understood that there were activists who felt that way and definitely pushed kid safety as an excuse to keep certain information away from kids. I totally understood that. What is kind of shocking to me in this discussion is recognizing that those people are on the NCMEC board.

Don McGowan
That’s the thing, they go somewhere. And a lot of them, the personality trait that goes along with it, is that they’re all cop fuckers.

I sit slightly on the left. I’m certainly going to tell you I was to the window to the walls yesterday. Certainly, you’ve mentioned earlier that I use a lot of Bluesky. I actually blew up my Twitter account one day picking a fight with the alt-right. And so I now spend my time on a niche left-focused microblogging site. It’s fun. But the idea of a lot of these folks… like I’m a lot more cop positive than a lot of people I know, because I’ve spent time with the cops that are doing the work at NCMEC, right? Like I have sympathies for those folks because I know what their job is and what they’re going through. I mean, these are cops who are out there trying to save kids from sex trafficking, right? There are liaison officers at NCMEC from most of the three-letter agencies. And you know, until I started this podcast, I had pretty good relationships with them. I hope I still will afterwards…

Mike Masnick
I think that’s sort of an important point that is worth underlining, and often gets lost in these discussions, which is, and we started out the podcast by talking about how there is good work that NCMEC does. And a lot of that is coordinating with law enforcement, doing the actual good things that you want law enforcement to do. This is not a universal condemnation of either law enforcement or NCMEC itself.

But within that, that opens up opportunities for people with sort of problematic viewpoints to abuse that system to their advantage. And that has always been my problem with… like going back to FOSTA. It was presented as, you know, you can present these things in a good way. Like stopping sex trafficking is obviously a good and important and virtuous goal overall.

But when you have people whose real mission, and this was true of many of the backers of FOSTA, was not to stop sex trafficking, but to stop all pornography, all adult content entirely, and as part of that, to end encryption and a bunch of other things. And they were using FOSTA and some of these other laws as a kind of stalking horse to begin that process. And they were not subtle about this. I mean, the National Center, what is it? Oh gosh, what’s NCOSE? It’s, they have a name that sounds kind of like NCMEC, the National Center on Sexual Exploitation, I think, which is this nonprofit that, I think they used to be called like the Moral Majority or something like that, who had a very strong belief that there should be no pornography anywhere ever. And they were huge supporters and lobbied very strongly in favor of FOSTA and were quite open.

I mean, somehow I’m on their mailing list. So they send me all their press releases, in which they are very explicit that this was Step One in their goal of ending all pornography everywhere. They have a very, very strong view on things, but the fact is that they sort of wrap it around this idea of ‘we’re stopping sex trafficking’ when it’s really this other thing. But I had always sort of mentally separated out that there are groups like that, which you know where they’re coming from and you know are staffed by crazy people. And I had hoped that NCMEC just wouldn’t have been captured by that side of things.

Don McGowan
So I’m going to speak obliquely to that by going to an entirely different topic, but you’ll see why I draw the connection in a minute. So a few years ago, we were under the regime of the host of Celebrity Apprentice, and there was a government policy to put children in cages. One might wonder, where was NCMEC on that?

Mike Masnick
That’s a good question.

Don McGowan
I’m about to give you the answer to that question.

Where NCMEC was on that was there were a collection of us inside who put some fairly significant pressure on the CEO at the time, who was a guy named John Clark, who had come to NCMEC after being the head of the US Marshals, to say, ‘hey, apparently the government is having trouble finding these kids’ families. Isn’t that what we specialize in? Let’s go do that.’

And so John eventually, you know, there were a number of board members at the time. I was not as alone as I was at the end, but there were a number of board members at the time who sort of shared that opinion and asked John like what the hell’s going on? And he finally came back and said, which is true, ‘we can only go where we’re asked.’ Right. Kind of like, you know, vampires and lost boys. We can’t come in the house unless you ask us. We couldn’t go because we weren’t asked. And we were like, well, please go start sending correspondence to the heads of the agencies to tell them: ‘We would like to become involved in this and help you find the families for these kids.’

Almost immediately, a board schism developed on that. And the schism came from people who, many of them, turned out to be my adversaries in the later discussions around the child labor issue. Their stalking horse for it was, ‘listen, our budget is funded by Congress. Congress is implementing these policies. Let’s not piss them off.’ And so we punched through that. And John Clark finally reached out to DHS and all the various agencies. And their answer was, ‘Yeah, thanks. We’ll ask you if we need you. If we’re interested, we’ll call. Did we call? No. So we don’t want your help. Thanks. If we need you, we’ll reach out.’

So that’s where NCMEC was on that issue: those of us on the board who saw things a little differently than the other folks pushed to try and get involved, but you know, obviously there was no discussion of let’s take a public stance. Because the people who had the number of votes on the board would have overpowered taking a public stance.

I remember there were some very strong advocates, some people who are my very good friends, on the board who stepped off after that because they were like, ‘You know what? I don’t want to be involved here anymore because my advocacy and my money can be used better elsewhere. And my time can be better used than flying into DC for these meetings.’ And so, sort of obliquely, I think that’s the answer to the ‘where were they on a lot of these issues?’ Why would they take such a strong stance on FOSTA? Well, because you have board members who were ideologically aligned with the people who were proposing the bills.

And so, what I said earlier about the board member who was obsessed with porn, right? Do we think that guy was ever going to vote against us taking a stance in favor of FOSTA? No. And, you know, I mentioned earlier, one of the things I’m doing now is expiating my sins. In particular, it’s the, did I not see it? Like, did I get played by these motherfuckers? And I think I did. Right? And that’s a terrible thing to realize about yourself, that I got played by people. And I got played, you know, there were some of them that I still like as humans, and I have to sort of wrap my head around that, you know what, I got played by these motherfuckers. Because they had an ulterior motive, but they didn’t know how to achieve it. But what they did know is how to use me to achieve it.

Mike Masnick
Huh. I mean, that’s a little harsh on yourself…

Don McGowan
But it’s true. So, you know, that’s one of the reasons I took such a ‘let me start burning it down’ approach around KOSA. Because, you know what? I’m not gonna get played twice. You know, how did George W. Bush put it? Fool me once, won’t get fooled again.

Mike Masnick
Right. It’s interesting as I’m thinking about it… the argument that because so much of the funding for NCMEC comes from Congress… one sort of throughline in all of this is that NCMEC is unwilling to challenge child exploitation that is effectively blessed by the government, and is only willing to challenge it when the problem comes from outside the government. And some of that could be tied to the fact that so much of its funding comes from the government.

Don McGowan
I think that’s accurate.

And that to me is almost an argument of, then shut the fuck up about everything.

Mike Masnick
Yeah, yeah. I mean, that’s sort of how I feel…

Don McGowan
If you’re only ever gonna support government policy, shut the fuck up…

Mike Masnick
Yeah, because that leads you to bad places.

Don McGowan
Yeah, don’t try and pretend you’re taking a stance. ‘Yeah, we take stances on issues that protect kids, so long as the government likes them.’ Or more directly, so long as the Republican Party likes them. This is the point where I say out loud, I was around NCMEC for the QAnon years. Those fuckers did nothing for actual child protection. Right? The QAnon people don’t care about children.

What they care about is being able to say they care about children.

And so, you know, they never cared about actual children because the actual children who were at highest risk during those years were the children of undocumented immigrants. And they could not have wanted more to round those kids up, put them in cages, and send them back outside of the United States.

Mike Masnick
Are you suggesting that the board was sort of QAnon captured itself or just afraid to…

Don McGowan
I mean, we didn’t take any public stances against QAnon. And that was the one that made me start first thinking like, huh, what the fuck is going on here? Like, why? We all know these people aren’t actually helping actual kids. Why are we not saying anything about that?

And it’s funny, given what we all think, I’ll tell you, some of the directors most aware of the problem were the ones from Facebook. Given what the internet thinks… It’s funny, I mentioned my Bluesky habit, and you mentioned my Bluesky habit a little while ago. Jess Miers showed up on Bluesky and started to talk about tech issues and immediately got hammered by a collection of scolds who all thought they knew better than she did about the issue that she had spent her career working on, and chased her away. And so, you know, I sort of watched that and I was like, okay, that’s what happens to the people who are willing to stand up and say, ‘hey, let’s look at this,’ because it had the same name as an issue that these people cared about. And, you know, we all use words based on our experience of them. And not everybody always uses the word in a dictionary definition, et cetera. We are all at risk of that particular problem, I guess is where I’m going.

Mike Masnick
Fair enough. So I think to sort of round out the conversation, is there anything that can be done? Again, noting the good that NCMEC has done and the importance of the CyberTipline? Is there a way to fix NCMEC?

Don McGowan
I think there is. I think the challenge is, as a private nonprofit, it’s always going to have to have a board. And its board members are going to be the people who are attracted to those kinds of boards. And especially when it’s a fundraising board, it’s going to be people with a certain political bent. Not always, but it’s likely to be, especially given how law enforcement adjacent it is. And so I would say what should happen is the organization should be properly captured by the federal government: become an agency of the federal government, stand as an independent agency of the federal government, similar to the way so many of the boards, et cetera, are statutorily set up to stand separate from the federal government, so you don’t end up with the agency worried about, for example, ‘oh crap, we shouldn’t protect kids in meatpacking plants because we might lose our congressional funding.’ Hypothecate the funding, to use the technical term, and then the funding is just there, hypothecated, not subject to annual reauthorization.

Right now, I think that’s one of the biggest issues with NCMEC, is that it has its annual funding resolution that has to go through, and one of the great sponsors of NCMEC was a person who isn’t always loved on the internet, but Senator John McCain. Senator John McCain could always be counted upon to sponsor the NCMEC funding resolution. And somebody would sponsor it in the House.

I’ll tell you the other thing. When I was at Pokemon and I was handling government affairs for that company, you’d think the Pokemon name would do it, but there was not a member of Congress on Earth that wouldn’t take a meeting with a member of the board of NCMEC. Right? That organization carries a lot of weight in DC. Democrats right through Republicans, nobody wouldn’t take my meeting. The entire organization is under the halo of, well, I said earlier, I’ve got people that I still like there, even though I have to remember that they weren’t always aligned with the way I think. That organization sits under the halo of John Walsh.

John Walsh, who is, by the way, one, one of the greatest people I know, and two, exactly like that in real life. If you’ve seen him on America’s Most Wanted or any of his other shows, he is exactly like that. My wife and I sometimes look at each other and go, ‘get those dirt bags!’ Because that was the great John Walsh expression, ‘dirt bags!’ That man in my mind is one of the good men on earth. His wife is one of the great Americans. I think segregating out the organization and its mission from the organization and its public persona is the way to rescue and save it.

Mike Masnick
We could go down a rabbit hole, which is not worth it. I just want to note that like, the challenge of making it actually an independent government agency is then, especially for the CyberTipline, you start to run into Fourth Amendment issues over…

Don McGowan
The Fourth Amendment issues are already there. There’s already case law on NCMEC and the Fourth Amendment.

Mike Masnick
Yes, but not in every circuit, so there’s a possibility that it will… but yes, you’re right that they are already dealing with some of that, because it has been determined by Neil Gorsuch, in fact, that they’re a wing of the US government…

Don McGowan
And so I’ll say two things to that. One, if you think Supreme Court Justice Neil Gorsuch would say something different… and two, as is the issue for so many private organizations, the fact that it’s not in every circuit doesn’t leave anybody thinking that it’s not going to be. Any circuit court decision, take it from a guy who was general counsel slash chief legal officer for 20 years, any circuit court decision is the law of America.

Because you don’t pray for circuit splits. Because circuit splits are expensive.

Mike Masnick
That is absolutely true.

All right. Well, I will let you go. But this has been a…

Don McGowan
Thanks for giving me the chance to come on, Mike. I really appreciate…

Mike Masnick
Yeah, it’s a really fascinating discussion, a really interesting look into NCMEC. And I hope that this gets sorted out, because I would like the organization to continue to do the good work that it does. And it worries me when they start promoting this nonsense or not protesting against other nonsense.

Don McGowan
This stuff needs to have somebody who is on the inside who’s willing to talk.

Mike Masnick
Yeah. Yeah. And so thank you so much for being willing to step up and to explain it.

Don McGowan
Can I take one last minute of this to do a thing that I would appreciate being able to do?

To the folks out there who were hurt by FOSTA, I don’t have another way to put it, I’m sorry. I will spend my life trying to fix that sin.

Mike Masnick
Well, thank you. Thank you for saying that. Thanks again for doing this and for speaking out and hopefully making more people aware of all this.

Don McGowan
Thanks to you, Mike.

Mike Masnick
And thanks to everyone for listening as well. And we will be back next week with another podcast.

Filed Under: don mcgowan, fosta, kosa, protect the children
Companies: ncmec

The REPORT Act: Enhancing Online Child Safety Without the Usual Congressional Nonsense

from the a-good-bill?-for-the-children? dept

For years and years, Congress has been pushing a parade of horrible “protect the children online” bills that seem to somehow get progressively worse each time. I’m not going through the entire list of them, because it’s virtually endless.

One of the most frustrating things about those bills, and the pomp and circumstance around them, is that they ignore the simpler, more direct things that Congress could do that would actually help.

Just last week, we wrote about the Stanford Internet Observatory’s big report on the challenges facing the CyberTipline, run by the National Center for Missing & Exploited Children (NCMEC). We wrote two separate posts about the report (and also discussed it on the latest episode of our new podcast, Ctrl-Alt-Speech) because there was so much useful information in there. As we noted, there are real challenges in making the reporting of child sexual abuse material (CSAM) work better, and it’s not because people don’t want to help. It’s actually because of a set of complex issues that are not easily solvable (read the report or my articles for more details).

But there were still a few clear steps that could be taken by Congress to help.

This week, the REPORT Act passed Congress, and it includes… a bunch of those straightforward, common sense things that should help improve the CyberTipline process. The key bit is allowing the CyberTipline to modernize a bit, including allowing it to use cloud storage. To date, no cloud storage vendors could work with NCMEC, out of a fear that they’d face criminal liability for “hosting CSAM.”

This bill fixes that, and should enable NCMEC to make use of some better tools and systems, including better classifiers, which are becoming increasingly important.
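
To make the “better classifiers” point slightly more concrete, here is a minimal, purely illustrative sketch of the simplest detection step platforms already run before filing CyberTipline reports: checking a file’s hash against a list of hashes of previously confirmed CSAM. This is not NCMEC’s actual tooling, and production systems rely on perceptual hashes (PhotoDNA-style) and machine-learning classifiers rather than plain cryptographic digests; the names here are invented for the example.

```python
import hashlib

# Hypothetical hash list of previously confirmed material, of the sort shared
# with platforms. Real deployments use large sets of perceptual hashes, not
# SHA-256 digests; this placeholder entry will never match anything real.
KNOWN_HASHES: set[str] = {"0" * 64}

def sha256_of_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: str) -> bool:
    """True if the file's hash appears on the known-hash list; at a real
    platform, a hit would typically trigger internal review and a report."""
    return sha256_of_file(path) in KNOWN_HASHES
```

The only point of the sketch is to show why the storage and tooling constraints matter: keeping lists like this, and the more sophisticated classifiers mentioned above, current and usable is far easier on modern cloud infrastructure than on the systems NCMEC was previously limited to.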

There are also provisions letting victims and parents of victims report CSAM involving the child directly to NCMEC, which can be immensely helpful in trying to stop the spread of some content (and in focusing some law enforcement responses).

There are also some technical fixes that require platforms to retain certain records for a longer period of time. This was another important point that was highlighted in the Stanford report. Given the flow of information and prioritization, sometimes by the time law enforcement realized it should get a warrant to get more info from a platform, the platform would have already deleted it as required under existing law. Now that time period is extended to give law enforcement a bit more time.

The one bit that we’ll have to see how it works is that it extends the reporting requirements for social media to include violations of 18 USC 1591, which is the law against sex trafficking. Senator Marsha Blackburn, who is the co-author of the bill, is claiming that this means that “big tech companies will now be required to report when children are being trafficked, groomed or enticed by predators.”


So, it’s possible I’m misreading the law (and how it works with existing laws…) but I see nothing limiting this to “big tech.” It appears to apply to any “electronic communication service provider or remote computing service.”

Also, given that Marsha Blackburn appears to consider “grooming” to include things like LGBTQ content in schools, I worried that this was going to be a backdoor bill to making all internet websites have to “report” such content to NCMEC, which would flood their systems with utter nonsense. Thankfully, 1591 seems to include some pretty specific definitions of sex trafficking that do not match up with Blackburn’s definition. So she’ll get the PR victory among nonsense peddlers for pretending that it will lead to the reporting of the non-grooming that she insists is grooming.

And, of course, while this bill was actually good (and it’s surprising to see Blackburn on a good internet bill!) it’s not going to stop her from continuing to push KOSA and other nonsense moral panic “protect the children” bills that will actually do real harm.

Filed Under: csam, cybertipline, jon ossoff, marsha blackburn, modernization, report act, sex trafficking
Companies: ncmec

The Problems Of The NCMEC CyberTipline Apply To All Stakeholders

from the no-easy-answers dept

The failures of the NCMEC CyberTipline to combat child sexual abuse material (CSAM) as effectively as it could are extremely frustrating. But as you look at the details, you realize there just aren’t any particularly easy fixes. While there are a few areas that could improve things at the margin, the deeper you look, the more challenging the whole setup is. There aren’t any easy answers.

And that sucks, because Congress and the media often expect easy answers to complex problems. And that might not be possible.

This is the second post about the Stanford Internet Observatory’s report on the NCMEC CyberTipline, which is the somewhat useful, but tragically limited, main way that investigations of child sexual abuse material (CSAM) online are done. In the first post, we discussed the structure of the system, and how the incentive structure regarding law enforcement is a big part of what’s making the system less impactful than it otherwise might be.

In this post, I want to dig in a little more about the specific challenges in making the CyberTipline work better.

The Constitution

I’m not saying that the Constitution is a problem, but it represents a challenge here. In the first post, I briefly mentioned Jeff Kosseff’s important article about how the Fourth Amendment and the structure of NCMEC makes things tricky, but it’s worth digging in a bit here to understand the details.

The US government set up NCMEC as a private non-profit in part because if it were a government agency doing this work, there would be significant Fourth Amendment concerns about whether the evidence it receives was collected without a warrant. If it’s a government agency, then the law cannot require companies to hand over the info without a warrant.

So, Congress did a kind of two-step dance here: they set up this “private” non-profit, and then created a law that requires companies that come across CSAM online to report it to the organization. And all of this seems to rely on a kind of fiction that if we pretend NCMEC isn’t a government agent, then there’s no 4th Amendment issue.

From the Stanford report:

The government agent doctrine explains why Section 2258A allows, but does not require, online platforms to search for CSAM. Indeed, the statute includes an express disclaimer that it does not require any affirmative searching or monitoring. Many U.S. platforms nevertheless proactively monitor their services for CSAM, yielding millions of CyberTipline reports per year. Those searches’ legality hinges on their voluntariness. The Fourth Amendment prohibits unreasonable searches and seizures by the government; warrantless searches are typically considered unreasonable. The Fourth Amendment doesn’t generally bind private parties, however the government may not sidestep the Fourth Amendment by making a private entity conduct a search that it could not constitutionally do itself. If a private party acts as the government’s “instrument or agent” rather than “on his own initiative” in conducting a search, then the Fourth Amendment does apply to the search. That’s the case where a statute either mandates a private party to search or “so strongly encourages a private party to conduct a search that the search is not primarily the result of private initiative.” And it’s also true in situations where, with the government’s knowledge or acquiescence, a private actor carries out a search primarily to assist the government rather than to further its own purposes, though this is a case-by-case analysis for which the factors evaluated vary by court.

Without a warrant, searches by government agents are generally unconstitutional. The usual remedy for an unconstitutional search is for a court to throw out all evidence obtained as a result of it (the so-called “exclusionary rule”). If a platform acts as a government agent when searching a user’s account for CSAM, there is a risk that the resulting evidence could not be introduced against the user in court, making a conviction (or plea bargain) harder for the prosecution to obtain. This is why Section 2258A does not and could not require online platforms to search for CSAM: it would be unconstitutional and self-defeating.

In CSAM cases involving CyberTipline reports, defendants have tried unsuccessfully to characterize platforms as government agents whose searches were compelled by Section 2258A and/or by particular government agencies or investigators. But courts, pointing to the statute’s express disclaimer language (and, often, the testimony of investigators and platform employees), have repeatedly held that platforms are not government agents and their CSAM searches were voluntary choices motivated mainly by their own business interests in keeping such repellent material off their services.

So, it’s quite important that the service providers that are finding and reporting CSAM are not seen as agents of the government. It would destroy the ability to use that evidence in prosecuting cases. That’s important. And, as the report notes, it’s also why it would be a terrible idea to require social media to proactively try to hunt down CSAM. If the government required it, it would effectively light all that evidence on fire and prevent using it for prosecution.

That said, the courts (including in a ruling by Neil Gorsuch while he was on the appeals court) have made it clear that, while platforms may not be government agents, it’s pretty damn clear that NCMEC and the CyberTipline are. And that creates some difficulties.

In a landmark case called Ackerman, one federal appeals court held that NCMEC is a “governmental entity or agent.” Writing for the Tenth Circuit panel, then-judge Neil Gorsuch concluded that NCMEC counts as a government entity in light of NCMEC’s authorizing statutes and the functions Congress gave it to perform, particularly its CyberTipline functions. Even if NCMEC isn’t itself a governmental entity, the court continued, it acted as an agent of the government in opening and viewing the defendant’s email and four attached images that the online platform had (as required) reported to NCMEC. The court ruled that those actions by NCMEC were a warrantless search that rendered the images inadmissible as evidence. Ackerman followed a trial court-level decision, Keith, which had also deemed NCMEC a government agent: its review of reported images served law enforcement interests, it operated the CyberTipline for public not private interests, and the government exerts control over NCMEC including its funding and legal obligations. As an appellate-level decision, Ackerman carries more weight than Keith, but both have proved influential.

The private search doctrine is the other Fourth Amendment doctrine commonly raised in CSAM cases. It determines what the government or its agents may view without a warrant upon receiving a CyberTipline report from a platform. As said, the Fourth Amendment generally does not apply to searches by private parties. “If a private party conducted an initial search independent of any agency relationship with the government,” the private search doctrine allows law enforcement (or NCMEC) to repeat the same search so long as they do not exceed the original private search’s scope. Thus, if a platform reports CSAM that its searches had flagged, NCMEC and law enforcement may open and view the files without a warrant so long as someone at the platform had done so already. The CyberTipline form lets the reporting platform indicate which attached files it has reviewed, if any, and which files were publicly available.

For files that were not opened by the platform (such as where a CyberTipline submission is automated without any human review), Ackerman and a 2021 Ninth Circuit case called Wilson hold that the private search exception does not apply, meaning the government or its agents (i.e., NCMEC) may not open the unopened files without a warrant. Wilson disagreed with the position, adopted by two other appeals-court decisions, that investigators’ warrantless opening of unopened files is permissible if the files are hash matches for files that had previously been viewed and confirmed as CSAM by platform personnel. Ackerman concluded by predicting that law enforcement “will struggle not at all to obtain warrants to open emails when the facts in hand suggest, as they surely did here, that a crime against a child has taken place.”

To sum up: Online platforms’ compliance with their CyberTipline reporting obligations does not convert them into government agents so long as they act voluntarily in searching their platforms for CSAM. That voluntariness is crucial to maintaining the legal viability of the millions of reports platforms make to the CyberTipline each year. This imperative shapes the interactions between platforms and U.S.-based legislatures, law enforcement, and NCMEC. Government authorities must avoid crossing the line into telling or impermissibly pressuring platforms to search for CSAM or what to search for and report. Similarly, platforms have an incentive to maintain their CSAM searches’ independence from government influence and to justify those searches on rationales “separate from assisting law enforcement.” When platforms (voluntarily) report suspected CSAM to the CyberTipline, Ackerman and Wilson interpret the private search doctrine to let law enforcement and NCMEC warrantlessly open and view only user files that had first been opened by platform personnel before submitting the tip or were publicly available.

This is all pretty important in making sure that the whole system stays on the right side of the 4th Amendment. As much as some people really want to force social media companies to proactively search for and report CSAM, mandating that creates real problems under the 4th Amendment.
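
Reduced to its decision logic, the private search doctrine as the report summarizes it (per Ackerman and Wilson) amounts to a simple per-file check. The sketch below is only an illustration of that rule as described above, not anything NCMEC or law enforcement actually runs, and the field names are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class ReportedFile:
    """One file attached to a hypothetical CyberTipline report."""
    viewed_by_platform: bool   # did platform personnel actually open it?
    publicly_available: bool   # was the file publicly accessible?

def warrant_needed(f: ReportedFile) -> bool:
    """Per Ackerman and Wilson as summarized in the Stanford report:
    NCMEC or law enforcement may view a reported file without a warrant only
    if the platform's own (private, voluntary) search already opened it, or
    it was publicly available. Anything else, including an unopened file that
    merely hash-matches previously confirmed CSAM, requires a warrant first."""
    return not (f.viewed_by_platform or f.publicly_available)

# An automated, hash-match-only report with no human review at the platform:
print(warrant_needed(ReportedFile(viewed_by_platform=False, publicly_available=False)))  # True
```

Note that this encodes the Ackerman/Wilson view; as the report notes, two other appeals-court decisions have allowed warrantless opening of unopened hash-matched files, so the rule is not yet uniform across circuits.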

As for the NCMEC and law enforcement side of things, the requirement to get a warrant for unopened communications remains important. But, as noted below, sometimes law enforcement doesn’t want to get a warrant. If you’ve been reading Techdirt for any length of time, this shouldn’t surprise you. We see all sorts of areas where law enforcement refuses to take that basic step of getting a warrant.

Understanding that framing is important to understanding the rest of this, including exploring where each of the stakeholders fall down. Let’s start with the biggest problem of all: where law enforcement fails.

Law Enforcement

In the first article on this report, we noted that the incentive structure has made it such that law enforcement often tries to evade this entire process. It doesn’t want to go through the process of getting warrants some of the time. Officers don’t want to associate with the ICAC task forces because they feel it puts too much of a burden on them, and that if they don’t take care of it, someone else on the task force will. And sometimes they don’t want to deal with CyberTipline reports because they’re afraid that if they’re too slow after getting a report, they might face liability.

Most of these issues seem to boil down to law enforcement not wanting to do its job.

But the report details some of the other challenges for law enforcement. And it starts with just how many reports are coming in:

Almost across the board law enforcement expressed stress over their inability to fully investigate all CyberTipline reports due to constraints in time and resources. An ICAC Task Force officer said “You have a stack [of CyberTipline reports] on your desk and you have to be ok with not getting to it all today. There is a kid in there, it’s really quite horrible.” A single Task Force detective focused on internet crimes against children may be personally responsible for 2,000 CyberTipline reports each year. That detective is responsible for working through all of their tips and either sending them out to affiliates or investigating them personally. This process involves reading the tip, assessing whether a crime was committed, and determining jurisdiction; just determining jurisdiction might necessitate multiple subpoenas. Some reports are sent out to affiliates and some are fully investigated by detectives at the Task Force.

An officer at a Task Force with a relatively high CyberTipline report arrest rate said “we are stretched incredibly thin like everyone.” An officer in a local police department said they were personally responsible for 240 reports a year, and that all of them were actionable. When asked if they felt overwhelmed by this volume, they said yes. While some tips involve self-generated content requiring only outreach to the child, many necessitate numerous search warrants. Another officer, operating in a city with a population of 100,000, reported receiving 18–50 CyberTipline reports annually, actively investigating around 12 at any given time. “You have to manage that between other egregious crimes like homicides,” they said. This report will not extensively cover the issue of volume and law enforcement capacity, as this challenge is already well-documented and detailed in the 2021 U.S. Department of Homeland Security commissioned report, in Cullen et al., and in a 2020 Government Accountability Office report. “People think this is a one-in-a-million thing,” a Task Force officer said. “What they don’t know is that this is a crime of secrecy, and could be happening at four of your neighbors’ houses.”

And of course, making social media platforms more liable doesn’t help to fix much here. At best, it makes it worse because it encourages even more reporting by the platforms, which only further overloads law enforcement.

Given all those reports the cops are receiving, you’d hope they had a good system for managing them. But your hope would not be fulfilled:

Law enforcement pick a certain percentage of reports to investigate. The selection is not done in a very scientific way—one respondent described it as “They hold their finger up in the air to feel the wind.” An ICAC Task Force officer said triage is more of an art than a science. They said that with experience you get a feel for whether a case will have legs, but that you can never be certain, and yet you still have to prioritize something.

That seems less than ideal.

Another problem, though, is that a lot of the reports are not prosecutable at all. Because of the incentives discussed in the first post, apparently certain known memes get reported to the CyberTipline quite frequently, and police feel they just clog up the system. But because the platforms fear significant liability if they don’t report those memes, they keep reporting them.

U.S. law requires that platforms report this content if they find it, and that NCMEC send every report to law enforcement. When NCMEC knows a report contains viral content or memes they will label it “informational,” a category that U.S. law enforcement typically interpret as meaning the report can be ignored, but not all such reports get labeled “informational.” Additionally there are an abundance of “age difficult” reports that are unlikely to lead to prosecution. Law enforcement may have policies requiring some level of investigation or at least processing into all noninformational reports. Consequently, officers often feel inundated with reports unlikely to result in prosecution. In this scenario, neither the platforms, NCMEC, nor law enforcement agencies feel comfortable explicitly ignoring certain types of reports. An employee from a platform that is relatively new to NCMEC reporting expressed the belief that “It’s best to over-report, that’s what we think.”

At best, this seems to annoy law enforcement, but it’s a function of how the system works:

An officer expressed frustration over platforms submitting CyberTipline reports that, in their view, obviously involve adults: “Tech companies have the ability to […] determine with a high level of certainty if it’s an adult, and they need to stop sending [tips of adults].” This respondent also expressed a desire that NCMEC do more filtering in this regard. While NCMEC could probably do this to some extent, they are again limited by the fact that they cannot view an image if the platform did not check the “reviewed” box (Figure 5.3 on page 26). NCMEC’s inability to use cloud services also makes it difficult for them to use machine learning age classifiers. When we asked NCMEC about the hurdles they face, they raised the “firehose of I’ll just report everything” problem.

Again, this all seems pretty messy. Of course you want companies to report anything they find that might be CSAM. And, of course, you want NCMEC to pass those reports on to law enforcement. But the end result is overwhelmed law enforcement with no clear triage process, stuck dealing with a lot of reports that were filed out of an abundance of caution but which are not at all useful to investigators.

And, of course, there are other challenges that policymakers probably don’t think about. For example: how do you deal with hacked accounts? How much information is it right for the company to share with law enforcement?

One law enforcement officer provided an interesting example of a type of report he found frustrating: he said he frequently gets reports from one platform where an account was hacked and then used to share CSAM. This platform provided the dates of multiple password changes in the report, which the officer interpreted as indicating the account had been hacked. Despite this, they felt obligated to investigate the original account holder. In a recent incident they described, they were correct that the account had been hacked. They expressed that if the platform explicitly stated their suspicion in the narrative section of the report, such as by saying something like “we think this account may have been hacked,” they would then feel comfortable de-prioritizing these tips. We subsequently learned from another respondent that this platform provides time stamps for password changes for all of their reports, putting the burden on law enforcement to assess whether the password changes were of normal frequency, or whether they reflected suspicious activity.

With that said, the officer raised a valid issue: whether platforms should include their interpretation of the information they are reporting. One platform employee we interviewed who had previously worked in law enforcement acknowledged that they would have found the platform’s unwillingness to explicitly state their hunch frustrating as well. However, in their current role they also would not have been comfortable sharing a hunch in a tip: “I have preached to the team that anything they report to NCMEC, including contextual information, needs to be 100% accurate and devoid of personal interpretation as much as possible, in part because it may be quoted in legal process and case reports down the line.” They said if a platform states one thing in a tip, but law enforcement discovers that is not the case, that could make it more difficult for law enforcement to prosecute, and could even ruin their case. Relatedly, a former platform employee said some platforms believe if they provide detailed information in their reports courts may find the reports inadmissible. Another platform employee said they avoid sharing such hunches for fear of it creating “some degree of liability [even if ] not legal liability” if they get it wrong

The report details how local prosecutors are also loath to bring cases, because it’s tricky to find a jury who can handle a CSAM case:

It is not just police chiefs who may shy away from CSAM cases. An assistant U.S. attorney said that potential jurors will disqualify themselves from jury duty to avoid having to think about and potentially view CSAM. As a result, it can take longer than normal to find a sufficient number of jurors, deterring prosecutors from taking such cases to trial. There is a tricky balance to strike in how much content to show jurors, but viewing content may be necessary. While there are many tools to mitigate the effect of viewing CSAM for law enforcement and platform moderators, in this case the goal is to ensure that those viewing the content understand the horror. The assistant U.S. attorney said that they receive victim consent before showing the content in the context of a trial. Judges may also not want to view content, and may not need to if the content is not contested, but seeing it can be important as it may shape sentencing decisions.

There are also issues outside the US with law enforcement. As noted in the first article, NCMEC has become the de facto global reporting center, because so many companies are based in the US and report there. And the CyberTipline tries to share out to foreign law enforcement too, but that’s difficult:

For example, in the European Union, companies’ legal ability to voluntarily scan for CSAM required the passage of a special exception to the EU’s so-called “ePrivacy Directive”. Plus, against a background where companies are supposed to retain personal data no longer than reasonably necessary, EU member states’ data retention laws have repeatedly been struck down on privacy grounds by the courts for retention periods as short as four or ten weeks (as in Germany) and as long as a year (as in France). As a result, even if a CyberTipline report had an IP address that was linked to a specific individual and their physical address at the time of the report, it may not be possible to retrieve that information after some amount of time.

Law enforcement agencies abroad have varying approaches to CyberTipline reports and triage. Some law enforcement agencies will say if they get 500 CyberTipline reports a year, that will be 500 cases. Another country might receive 40,000 CyberTipline reports that led to just 150 search warrants. In some countries the rate of tips leading to arrests is lower than in the U.S. Some countries may find that many of their CyberTipline reports are not violations of domestic law. The age of consent may be lower than in the U.S., for example. In 2021 Belgium received about 15,000 CyberTipline reports, but only 40% contained content that violated Belgian law.

And in lower income countries, the problems can be even worse, including confusion about how the entire CyberTipline process works.

We interviewed two individuals in Mexico who outlined a litany of obstacles to investigating CyberTipline reports even where a child is known to be in imminent danger. Mexican federal law enforcement have a small team of people who work to process the reports (in 2023 Mexico received 717,468 tips), and there is little rotation. There are people on this team who have been viewing CyberTipline reports day in and day out for a decade. One respondent suggested that recent laws in Mexico have resulted in most CyberTipline reports needing to be investigated at the state level, but many states lack the know-how to investigate these tips. Mexico also has rules that require only specific professionals to assess the age of individuals in media, and it can take months to receive assessments from these individuals, which is required even if the image is of a toddler

The investigator also noted that judges often will not admit CyberTipline reports as evidence because they were provided proactively and not via a court order as part of an investigation. They may not understand that legally U.S. platforms must report content to NCMEC and that the tips are not an extrajudicial invasion of privacy. As a result, officers may need a court order to obtain information that they already have in the CyberTipline report, confusing platforms who receive requests for data they put in a report a year ago. This issue is not unique to Mexico; NCMEC staff told us that they see “jaws drop” in other countries during trainings when they inform participants about U.S. federal law that requires platforms to report CSAM.

NCMEC Itself

The report also details some of the limitations of NCMEC and the CyberTipline itself, some of which are legally required (and where it seems like the law should be updated).

There appears to be a big issue with repeat reports, where NCMEC needs to “deconflict” them, but has limited technology to do so:

Improvements to the entity matching process would improve CyberTipline report prioritization processes and detection, but implementation is not always as straightforward as it might appear. The current automated entity matching process is based solely on exact matches. Introducing fuzzy matching, which would catch similarity between, for example, bobsmithlovescats1 and bobsmithlovescats2, could be useful in identifying situations where a user, after suspension, creates a new account with an only slightly altered username. With a more expansive entity matching system, a law enforcement officer proposed that tips could gain higher priority if certain identifiers are found across multiple tips. This process, however, may also require an analyst in the loop to assess whether a fuzzy match is meaningful.

It is common to hear of instances where detectives received dozens of separate tips for the same offender. For instance, the Belgium Federal Police noted receiving over 500 distinct CyberTipline reports about a single offender within a span of five months. This situation can arise when a platform automatically submits a tip each time a user attempts to upload CSAM; if the same individual tries to upload the same CSAM 60 times, it could result in 60 separate tips. Complications also arise if the offender uses a Virtual Private Network (VPN); the tips may be distributed across different law enforcement agencies. One respondent told us that a major challenge is ensuring that all tips concerning the same offender are directed to the same agency and that the detective handling them is aware that these numerous tips pertain to a single individual.
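As a very rough illustration of the fuzzy matching idea in that excerpt, here's a minimal sketch using Python's standard-library difflib. The usernames are the ones from the report's example, the threshold is arbitrary, and (as the report says) a real system would still want an analyst in the loop:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two account identifiers."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

a, b = "bobsmithlovescats1", "bobsmithlovescats2"

print(a == b)                      # False: exact entity matching sees nothing
print(round(similarity(a, b), 2))  # ~0.94: clearly worth an analyst's look

THRESHOLD = 0.9  # arbitrary, illustrative cutoff
if similarity(a, b) >= THRESHOLD:
    print("possible same-user match; queue for analyst review")
```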

As the report notes, there are a variety of challenges, both economic and legal, in enabling NCMEC to upgrade its technology:

First, NCMEC operates with a limited budget and as a nonprofit they may not be able to compete with industry salaries for qualified technical staff. The status quo may be “understandable given resource constraints, but the pace at which industry moves is a mismatch with NCMEC’s pace.” Additionally, NCMEC must also balance prioritizing improving the CyberTipline’s technical infrastructure with the need to maintain the existing infrastructure, review tips, or execute other non-Tipline projects at the organization. Finally, NCMEC is feeding information to law enforcement, which work within bureaucracies that are also slow to update their technology. A change in how NCMEC reports CyberTipline information may also require law enforcement agencies to change or adjust their systems for receiving that information.

NCMEC also faces another technical constraint not shared with most technology companies: because the CyberTipline processes harmful and illegal content, it cannot be housed on commercially available cloud services. While NCMEC has limited legal liability for hosting CSAM, other entities currently do not, which constrains NCMEC’s ability to work with outside vendors. Inability to transfer data to cloud services makes some of NCMEC’s work more resource intensive and therefore stymies some technical developments. Cloud services provide access to proprietary machine learning models, hardware-accelerated machine learning training and inference, on-demand resource availability and easier to use services. For example, with CyberTipline files in the cloud, NCMEC could more easily conduct facial recognition at scale and match photos from the missing children side of their work with CyberTipline files. Access to cloud services could potentially allow for scaled detection of AI-generated images and more generally make it easier for NCMEC to take advantage of existing machine learning classifiers. Moving millions of CSAM files to cloud services is not without risks, and reasonable people disagree about whether the benefits outweigh the risks. For example, using a cloud facial recognition service would mean that a third party service likely has access to the image. There are a number of pending bills in Congress that, if passed, would enable NCMEC to use cloud services for the CyberTipline while providing the necessary legal protections to the cloud hosting providers.

Platforms

And, yes, there are some concerns about the platforms. But while public discussion seems to focus almost exclusively on where people think that platforms have failed to take this issue seriously, the report suggests the failures of platforms are much more limited.

The report notes that it’s a bit tricky to get platforms up and running with CyberTipline reporting, and that even though NCMEC will do some onboarding, that help is quite limited in order to avoid some of the 4th Amendment concerns discussed above.

And, again, some of the problem with onboarding is due to outdated tech on NCMEC’s side. I mean… XML? Really?

Once NCMEC provides a platform with an API key and the corresponding manual, integrating their workflow with the reporting API can still present challenges. The API is XML-based, which requires considerably more code to integrate with than simpler JSON-based APIs and may be unfamiliar to younger developers. NCMEC is aware that this is an issue. “Surprisingly large companies are using the manual form,” one respondent said. One respondent at a small platform had a more moderate view; he thought the API was fine and the documentation “good.” But another respondent called the API “crap.”
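For a sense of what that integration difference looks like in practice, here's a toy comparison of the same hypothetical report payload as JSON versus XML. To be clear, the field names are invented for illustration and are not NCMEC's actual schema; the point is just the extra ceremony XML demands:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical report fields -- not the real CyberTipline schema.
report = {"incidentType": "csam", "fileViewedByCompany": "true",
          "ipAddress": "203.0.113.7"}

# JSON: one call and you're done.
json_payload = json.dumps(report)

# XML: build a tree, element by element, then serialize it.
root = ET.Element("report")
for key, value in report.items():
    ET.SubElement(root, key).text = value
xml_payload = ET.tostring(root, encoding="unicode")

print(json_payload)  # {"incidentType": "csam", ...}
print(xml_payload)   # <report><incidentType>csam</incidentType>...</report>
```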

There are also challenges under the law about what needs to be reported. As noted above and in the first article, that can often lead to over-reporting. But it can also make things difficult for companies trying to make determinations.

Platforms will additionally face policy decisions. While prohibiting illegal content is a standard approach, platforms often lack specific guidelines for moderators on how to interpret nuanced legal terms such as “lascivious exhibition.” This term is crucial for differentiating between, for example, an innocent photo of a baby in a bathtub, and a similar photo that appears designed to show the baby in a way that would be sexually arousing to a certain type of viewer. Trust and safety employees will need to develop these policies and train moderators.

And, of course, as has been widely discussed elsewhere, it’s not great that platforms have to hire human beings and expose them to this kind of content.

However, the biggest issue with reporting seems to be not a company’s unwillingness to report, but how much information it passes along. And here again, the problem is not so much a lack of cooperation as it is the incentives.

Memes and viral content pose a huge challenge for CyberTipline stakeholders. In the best case scenario, a platform checks the “Potential Meme” box and NCMEC automatically sends the report to an ICAC Task Force as “informational,” which appears to mean that no one at the Task Force needs to look at the report.

In practice, a platform may not check the “Potential Meme” box (possibly due to fixable process issues or minor changes in the image that change the hash value) and also not check the “File Viewed by Company” box. In this case NCMEC is unable to view the file, due to the Ackerman and Wilson decisions as discussed in Chapter 3. A Task Force could view the file without a search warrant and realize it is a meme, but even in that scenario it takes several minutes to close out the report. At many Task Forces there are multiple fields that have to be entered to close the report, and if Task Forces are receiving hundreds of reports of memes this becomes hugely time consuming. Sometimes, however, law enforcement may not realize the report is a meme until they have invested valuable time into getting a search warrant to view the report.

NCMEC recently introduced the ability for platforms to “batch report” memes after receiving confirmation from NCMEC that that meme is not actionable. This lets NCMEC label the whole batch as informational, which reduces the burden on law enforcement.

We heard about an example where a platform classified a meme as CSAM, but NCMEC (and at least one law enforcement officer we spoke to about this meme) did not classify it as CSAM. NCMEC told the platform they did not classify the meme as CSAM, but according to NCMEC the platform said because they do consider it CSAM they were going to continue to report it. Because the platform is not consistently checking the “Potential Meme” box, law enforcement are still receiving it at scale and spending substantial time closing out these reports.

There is a related challenge when a platform neglects to mark content as “viral”. Most viral images are shared in outrage, not with an intent to harm. However, these viral images can be very graphic. The omission of the “viral” label can lead law enforcement to mistakenly prioritize these cases, unaware that the surge in reports stems from multiple individuals sharing the same image in dismay.

We spoke to one platform employee about the general challenge of a platform deeming a meme CSAM while NCMEC or law enforcement agencies disagree. They noted that everyone is doing their best to apply the Dost test. Additionally, there is no mechanism to get an assurance that a file is not CSAM: “No one blesses you and says you’ve done what you need to do. It’s a very unsettling place to be.” They added that different juries might come to different conclusions about what counts as CSAM, and if a platform fails to report a file that is later deemed CSAM the platform could be fined $300,000 and face significant public backlash: “The incentive is to make smart, conservative decisions.”
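Put in overly simple terms, the “Potential Meme” routing described a few paragraphs up looks something like the sketch below. The box names come from the report; the function and the label strings are my own invention, and the real pipeline is far messier:

```python
def route_report(potential_meme: bool, in_confirmed_meme_batch: bool,
                 file_viewed_by_company: bool) -> str:
    """Toy sketch of the triage path the report describes."""
    if potential_meme or in_confirmed_meme_batch:
        # NCMEC can label these "informational"; Task Forces can skip them.
        return "informational"
    if not file_viewed_by_company:
        # NCMEC can't open the file; law enforcement may need a warrant
        # just to discover it's a meme they've seen a thousand times.
        return "forward unopened"
    return "review and forward"
```

When those boxes go unchecked, the same meme keeps landing in the “forward unopened” bucket, which is exactly the time sink the Task Forces are describing.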

This is all pretty fascinating, and suggests that while there may be ways to improve things, it’s difficult to structure things right and make the incentives align properly.

And, again, the same incentives pressure the platforms to just overreport, no matter what:

Once a platform integrates with NCMEC’s CyberTipline reporting API, they are incentivized to overreport. Consider an explicit image of a 22-year-old who looks like they could be 17: if a platform identified the content internally but did not file a report and it turned out to be a 17-year-old, they may have broken the law. In such cases, they will err on the side of caution and report the image. Platform incentives are to report any content that they think is violative of the law, even if it has a low probability of prosecution. This conservative approach will also lead to reports from what Meta describes as “non-malicious users”—for instance, individuals sharing CSAM in outrage. Although such reports could theoretically yield new findings, such as uncovering previously unknown content, it is more likely that they overload the system with extraneous reports
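The asymmetry is easy to see with a toy expected-cost calculation. The $300,000 figure is the fine mentioned above; the probability is an arbitrary number I made up for illustration, and none of the downstream costs to NCMEC or law enforcement show up in the platform's math, which is precisely the problem:

```python
# Toy expected-cost comparison for a single borderline image.
p_actually_a_minor = 0.05         # platform's (invented) guess that the person is 17, not 22
fine_for_missed_report = 300_000  # potential penalty for failing to report
cost_of_filing_report = 0         # no direct cost to the platform for reporting

expected_cost_not_reporting = p_actually_a_minor * fine_for_missed_report
expected_cost_reporting = cost_of_filing_report

print(expected_cost_not_reporting)  # 15000.0
print(expected_cost_reporting)      # 0 -> reporting always "wins" for the platform
```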

All in all, the real lesson to be taken from this report is that this shit is super complicated, like all of trust & safety, and tradeoffs abound. But here it’s way more fraught than in most cases, in terms of the seriousness of the issue, the potential for real harm, and the potentially destructive criminal penalties involved.

The report has some recommendations, though they mostly seem to deal with things at the margins: increase funding for NCMEC, allow it to update its technology (and hire the staff to do so), and have some more information to help platforms get set up.

Of course, what’s notable is that this does not include things like “make platforms liable for any mistake they make.” This is because, as the report shows, most platforms seem to take this stuff pretty seriously already, and the liability is already very clear, to the point that they are often over-reporting to avoid it, and that’s actually making the results worse, because they’re overwhelming both NCMEC and law enforcement.

All in all, this report is a hugely important contribution to this discussion, and provides a ton of real-world information about the CyberTipline that was previously known basically only to people working on it, leaving many observers, media, and policymakers in the dark.

It would be nice if Congress read this report and understood the issues. However, when it comes to things like CSAM, expecting anyone to bother with reading a big report and understanding the tradeoffs and nuances is probably asking too much.

Filed Under: csam, cybertipline, incentives, overreporting
Companies: ncmec

Our Online Child Abuse Reporting System Is Overwhelmed, Because The Incentives Are Screwed Up & No One Seems To Be Able To Fix Them

from the mismatched-incentives-are-the-root-of-all-problems dept

The system meant to stop online child exploitation is failing — and misaligned incentives are to blame. Unfortunately, today’s political solutions, like KOSA and STOP CSAM, don’t even begin to grapple with any of this. Instead, they prefer to put in place solutions that could make the incentives even worse.

The Stanford Internet Observatory has spent the last few months doing a very deep dive on how the CyberTipline works (and where it struggles). It has released a big and important report detailing its findings. In writing up this post about it, I kept adding more and more, to the point that I finally decided it made sense to split it up into two separate posts to keep things manageable.

This first post covers the higher level issue: what the system is, why it works the way it does, and how the incentive structure of the system is completely messed up (even if it was done with good intentions), and how that’s contributed to the problem. A follow-up post will cover the more specific challenges facing NCMEC itself, law enforcement, and the internet platforms themselves (who often take the blame for CSAM, when that seems extremely misguided).

There is a lot of misinformation out there about the best way to fight and stop the creation and spread of child sexual abuse material (CSAM). It’s unfortunate because it’s a very real and very serious problem. Yet the discussion about it is often so disconnected from reality as to be not just unhelpful, but potentially harmful.

In the US, the system that was set up is the CyberTipline, which is run by NCMEC, the National Center on Missing and Exploited Children. It’s a private non-profit; however, it has a close connection with the US government, which helped create it. At times, there has been some confusion about whether or not NCMEC is a government agent. The entire setup was designed to keep it non-governmental, to avoid any 4th Amendment issues with the information it collects, but courts haven’t always seen it that way, which makes things tricky (even as the 4th Amendment protections at stake are important).

And while the system was designed for the US, it has become a de facto global system, since so many of the companies are US-based, and NCMEC will, when it can, send relevant details to foreign law enforcement as well (though, as the report details, that doesn’t always work well).

The main role CyberTipline has taken on is coordination. It takes in reports of CSAM (mostly, but not entirely, from internet platforms) and then, when relevant, hands off the necessary details to the (hopefully) correct law enforcement agency to handle things.

Companies that host user-generated content have certain legal requirements to report CSAM to the CyberTipline. As we discussed in a recent podcast, this role as a “mandatory reporter” is important in providing useful information to allow law enforcement to step in and actually stop abusive behavior. Because of the “government agent” issue, it would be unconstitutional to require social media platforms to proactively search for and identify CSAM (though many do use tools to do this). However, if they do find some, they must report it.

Unfortunately, the mandatory reporting has also allowed the media and politicians to use the number of reports sent in by social media companies in a misleading manner, suggesting that the mere fact that these companies find and report to NCMEC means that they’re not doing enough to stop CSAM on their platforms.

This is problematic because it creates a dangerous incentive, suggesting that internet services should actually not report CSAM they found, as politicians and the media will falsely portray a lot of reports as being a sign of a failure by the platforms to take this seriously. The reality is that the failure to take things seriously comes from the small number of platforms (Hi Telegram!) who don’t report CSAM at all.

Some of us on the outside have thought that the real issue was on the receiving end: NCMEC and law enforcement haven’t been able to do enough that is productive with the reports they get. It seemed convenient for the media and politicians to just blame social media companies for doing what they’re supposed to do (reporting CSAM), while ignoring that what happens on the back end of the system might be the real problem. That’s why things like Senator Ron Wyden’s Invest in Child Safety Act seemed like a better approach than things like KOSA or the STOP CSAM Act.

That’s because the approach of KOSA/STOP CSAM and some other bills is basically to add liability to social media companies. (These companies already do a ton to prevent CSAM from appearing on the platform and alert law enforcement via the CyberTipline when they do find stuff.) But that’s useless if those receiving the reports aren’t able to do much with them.

What becomes clear from this report is that while there are absolutely failures on the law enforcement side, some of that is effectively baked into the incentive structure of the system.

In short, the report shows that the CyberTipline is very helpful in engaging law enforcement to stop some child sexual abuse, but it’s not as helpful as it might otherwise be:

Estimates of how many CyberTipline reports lead to arrests in the U.S. range from 5% to 7.6%.

This number may sound low, but I’ve been told it’s not as bad as it sounds. First of all, when a large number of the reports are for content that is overseas and not in the US, it’s more difficult for law enforcement here to do much about it (though, again, the report details some suggestions on how to improve this). Second, some of the content may be very old, where the victim was identified years (or even decades) ago, and where there’s less that law enforcement can do today. Third, there is a question of prioritization, with it being a higher priority to target those directly abusing children. But, still, as the report notes, almost everyone thinks that the arrest number could go higher if there were more resources in place:

Empirically, it is unknown what percent of reports, if fully investigated, would lead to the discovery of a person conducting hands-on abuse of a child. On the one hand, as an employee of a U.S. federal department said, “Not all tips need to lead to prosecution […] it’s like a 911 system.” On the other hand, there is a sense from our respondents—who hold a wide array of beliefs about law enforcement—that this number should be higher. There is a perception that more than 5% of reports, if fully investigated, would lead to the discovery of hands-on abuse.

The report definitely suggests that if NCMEC had more resources dedicated to the CyberTipline, it could be more effective:

NCMEC has faced challenges in rapidly implementing technological improvements that would aid law enforcement in triage. NCMEC faces resource constraints that impact salaries, leading to difficulties in retaining personnel who are often poached by industry trust and safety teams.

There appear to be opportunities to enrich CyberTipline reports with external data that could help law enforcement more accurately triage tips, but NCMEC lacks sufficient technical staff to implement these infrastructure improvements in a timely manner. Data privacy concerns also affect the speed of this work.

But, before we get into the specific areas where things can be improved in the follow-up post, I thought it was important to highlight how the incentives of this system contribute to the problem, where there isn’t necessarily an easy solution.

While companies (Meta, mainly, since it represents, by a very wide margin, the largest number of reports to the CyberTipline) keep getting blamed for failing to stop CSAM because of their large number of reports, most companies have very strong incentives to report anything they find. This is because the cost of not reporting something they should have reported is massive (criminal penalties), whereas the cost of over-reporting is nothing to the companies. That means there’s an issue with overreporting.

Of course, there is a real cost here. CyberTipline employees get overwhelmed, and that can mean that reports that should get prioritized and passed on to law enforcement don’t. So you can argue that while the cost of over-reporting is “nothing” to the companies, the cost to victims and society at large can be quite large.

That’s an important mismatch.

But the broken incentives go further as well. When NCMEC hands off reports to law enforcement, they often go through a local ICAC (Internet Crimes Against Children) task force, which will help triage them and find the right state or local law enforcement agency to handle the report. Law enforcement agencies that are “affiliated” with ICACs receive special training on how to handle reports from the CyberTipline. But, apparently, at least some of them feel that it’s just too much work or too burdensome to investigate. That means some law enforcement agencies are choosing not to affiliate with their local ICACs to avoid the added work. Even worse, some agencies have “unaffiliated” themselves from their local ICAC because they just don’t want to deal with it.

In some cases, there are even reports of law enforcement unaffiliating with an ICAC out of a fear of facing liability for not investigating an abused child quickly enough.

A former Task Force officer described the barriers to training more local Task Force affiliates. In some cases local law enforcement perceive that becoming a Task Force affiliate is expensive, but in fact the training is free. In other cases local law enforcement are hesitant to become a Task Force affiliate because they will be sent CyberTipline reports to investigate, and they may already feel like they have enough on their plate. Still other Task Force affiliates may choose to unaffiliate, perceiving that the CyberTipline reports they were previously investigating will still get investigated at the Task Force, which further burdens the Task Force. Unaffiliating may also reduce fear of liability for failing to promptly investigate a report that would have led to the discovery of a child actively being abused, but the alternative is that the report may never be investigated at all.

[….]

This liability fear stems from a case where six months lapsed between the regional Task Force receiving NCMEC’s report and the city’s police department arresting a suspect (the abused children’s foster parent). In the interim, neither of the law enforcement agencies notified child protective services about the abuse as required by state law. The resulting lawsuit against the two police departments and the state was settled for $10.5 million. Rather than face expensive liability for failing to prioritize CyberTipline reports ahead of all other open cases, even homicide or missing children, the agency might instead opt to unaffiliate from the ICAC Task Force.

This is… infuriating. Cops choosing not to affiliate (i.e., not to get the training that would let them help) or removing themselves from an ICAC task force, because they’re afraid they might get sued if they fail to save kids from abuse quickly enough, is ridiculous. It’s yet another example of cops running away rather than doing the job they’re supposed to be doing, but which they claim they have no obligation to do.

That’s just one problem of many in the report, which we’ll get into in the second post. But, on the whole, it seems pretty clear that with the incentives this far out of whack, bills like KOSA or STOP CSAM aren’t going to be of much help. Actually tackling the underlying issues (the funding, the technology, and, most of all, the incentive structures) is what’s necessary.

Filed Under: csam, cybertipline, icac, incentives, kosa, law enforcement, liability, stop csam
Companies: ncmec

As Congress Grandstands Nonsense ‘Kid Safety’ Bills, Senator Wyden Reintroduces Legislation That Would Actually Help Deal With Kid Exploitation Online

from the no-one-will-pay-attention,-because-this-is-useful,-but-boring dept

As you’ve likely heard, this morning the Senate did one of its semi-regular hearings in which it drags tech CEOs in front of clueless Senators who make nonsense pronouncements in hopes of getting a viral clip to show up on the very social media they’re pretending to demonize, but which they rely on to pretend to their base that they’re leading the culture war moral panic against social media.

Meanwhile, Senator Ron Wyden has (yet again) released a bill that will get little (if any) attention, but which actually seeks to help protect children. Reps. Eshoo and Fitzpatrick have introduced the companion bill in the House.

As we’ve discussed multiple times, all evidence suggests that the internet companies are actually doing an awful lot to stop child exploitation online, which involves tracking it down, reporting it to NCMEC, and putting in place tools to automate and block such exploitation content from ever seeing the light of day. The real problem seems to be that after the content is reported to NCMEC, nothing happens.

Wyden’s bill aims to fix that part. The actual part where the system seems to fall down and fail to protect kids online. The part about what happens after the companies report such content, and NCMEC and the DOJ fail to take any action:

The Invest in Child Safety Act would direct more than $5 billion in mandatory funding to investigate and target the predators and abusers who create and share child sexual abuse material online. It also directs substantial new funding for community-based efforts to prevent children from becoming victims in the first place. The legislation would also create a new office within the U.S. Department of Justice (DOJ) to coordinate efforts across federal agencies, after the DOJ refused to comply with a 2008 law requiring coordination and reporting of those efforts.

“The federal government has a responsibility and moral obligation to protect children from exploitation online, but right now it’s failing in large part because of a lack of funding and coordination,” Wyden said. “It’s time for a new approach to find child predators, prosecute these monsters, and help protect children from becoming victims in the first place – and that’s why we are introducing the Invest in Child Safety Act.”

The bill includes a ton of pretty clear, obvious, common-sense approaches to dealing with the actual crimes going on and actually stepping in to protect children, rather than just grandstanding about it and pretending that if only Mark Zuckerberg nerded harder, he’d magically prevent child exploitation.

Of course, doing basic stuff like this isn’t the kind of thing that gets headlines, and so it won’t get even a fraction of the attention that terrible, unconstitutional, problematic bills like KOSA, EARN IT, STOP CSAM and others will get. Indeed, after doing a quick search online, I can find exactly no articles about Wyden’s bill. Dealing with actual problems isn’t the kind of thing this Congress does, nor something that the media cares about.

Having a show trial to pretend that terrible bills are great makes headlines. Actually presenting a bill that provides real tools to help… gets ignored.

Filed Under: anna eshoo, brian fitzpatrick, child exploitation, child safety, doj, funding, ncmec, ron wyden
Companies: ncmec

Breaking Encryption To Aid Client-Side Scanning Isn’t The Solution To The CSAM Problem

from the undermining-security-to-generate-band-aids dept

Plenty of legislators and law enforcement officials seem to believe there’s only one acceptable solution to the CSAM (child sexual abuse material) problem: breaking encryption.

They may state some support for encryption, but when it comes to this particular problem, many of these officials seem to believe everyone’s security should be compromised just so a small percentage of internet users can be more easily observed and identified. They tend to talk around the encryption issue, focusing on client-side scanning of user content — a rhetorical tactic that willfully ignores the fact that client-side scanning would necessitate the elimination of one end of end-to-end encryption to make this scanning possible.

The issue at the center of these debates often short-circuits the debate itself. Since children are the victims, many people reason no sacrifice (even if it’s a government imposition) is too great. Those who argue against encryption-breaking mandates are treated as though they’d rather aid and abet child exploitation than allow governments to do whatever they want in response to the problem.

Plenty of heat has been directed Meta’s way in recent years, due to its planned implementation of end-to-end encryption for Facebook Messenger users. And that’s where the misrepresentation of the issue begins. Legislators and law enforcement officials claim the millions of CSAM reports from Facebook will dwindle to almost nothing if Messenger is encrypted, preventing Meta from seeing users’ communications.

This excellent post by cybersecurity expert Susan Landau for Lawfare punctures holes in these assertions, pointing out that the “millions” of reports Facebook generates annually are hardly indicative of widespread sexual abuse of children.

Yes, the transition of CSAM sharing to online communication services has resulted in a massive increase in reports to NCMEC (National Center for Missing and Exploited Children).

The organization received 29 million reports of online sexual exploitation in 2021, a 10-fold increase over a decade earlier. Meanwhile the number of video files reported to NCMEC increased over 40 percent between 2020 and 2021.

But that doesn’t necessarily mean there are more children being exploited than ever before. Nor does it mean Facebook sees more CSAM than other online services, despite its massive user base.

Understanding the meaning of the NCMEC numbers requires careful examination. Facebook found that over 90 percent of the reports the company filed with NCMEC in October and November 2021 were “the same as or visually similar to previously reported content.” Half of the reports were based on just six videos.

As Landau is careful to point out, that doesn’t mean the situation is acceptable. It just means tossing around phrases like “29 million reports” doesn’t necessarily mean millions of children are being exploited or millions of users are sharing CSAM via these services.

Then there’s the uncomfortable fact that a sizable percentage of the content reported to NCMEC doesn’t actually involve any exploitation of minors by adults. Landau quotes from Laura Draper’s 2022 report on CSAM and the rise of encrypted services. In that report, Draper points out that some of the reported content is generated by minors for other minors: i.e., sexting.

Draper observed that CSAE consists of four types of activities exacerbated by internet access: (a) CSAM, which is the sharing of photos or videos of child sexual abuse imagery; (b) perceived first-person (PFP) material, which is nude imagery taken by children of themselves and then shared, often much more widely than the child intended; (c) internet-enabled child sex trafficking; and (d) live online sexual abuse of children.

While these images are considered “child porn” (to use an antiquated term), they are not actually images taken by sexual abusers, which means they aren’t actually CSAM, even if they’re treated as such by NCMEC and reported as such by communication services. In these cases, Landau suggests more education of minors to inform them of the unintended consequences of these actions, first and foremost being that they can’t control who these images are shared with once they’ve shared them with anyone else.

The rest of the actions on that list are indeed extremely disturbing. But, as Landau (and Draper) suggest, there are better solutions already available that don’t involve undermining user security by removing encryption or undermining their privacy by subjecting them to client-side scanning.

[C]onsider the particularly horrific crime in which there is live streaming of a child being sexually abused according to requests made by a customer. The actual act of abuse often occurs abroad. In such cases, aspects of the case can be investigated even in the presence of E2EE. First, the video stream is high bandwidth from the abuser to the customer but very low bandwidth the other way, with only an occasional verbal or written request. Such traffic stands out from normal communications; it looks neither like a usual video communication nor a showing of a film. And the fact that the trafficker must publicly advertise for customers provides law enforcement another route for investigation.
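To make the traffic-analysis point slightly more concrete, here's a toy sketch of the kind of asymmetry heuristic Landau is describing. The flow records and thresholds are invented for illustration; real network forensics is done by investigators with lawful access to flow metadata, and is far more involved than this:

```python
# Each flow: (bytes_sent_by_host_a, bytes_sent_by_host_b, duration_in_seconds)
flows = [
    (900_000_000, 200_000, 3_600),      # long, one-way stream with occasional replies
    (500_000_000, 450_000_000, 3_600),  # ordinary two-way video call
    (2_000_000, 1_500_000, 120),        # short browsing session
]

def looks_like_one_way_stream(a_bytes: int, b_bytes: int, seconds: int,
                              ratio: float = 100.0, min_seconds: int = 600) -> bool:
    """Flag long-lived flows where one side sends vastly more data than the other."""
    if seconds < min_seconds:
        return False
    hi, lo = max(a_bytes, b_bytes), min(a_bytes, b_bytes)
    return lo == 0 or hi / lo >= ratio

for flow in flows:
    print(flow, looks_like_one_way_stream(*flow))  # only the first flow is flagged
```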

Unfortunately, government officials tend to portray E2EE as the root of the CSAM problem, rather than just something that exists alongside a preexisting problem. Without a doubt, encryption can pose problems for investigators. But there are a plethora of options available that don’t necessitate making everyone less safe and secure just because abusers use encrypted services in order to avoid immediate detection.

Current processes need work as well. As invaluable as NCMEC is, it’s also contributing to a completely different problem. Hash matching is helpful but it’s not infallible. Hash collisions (where two different images generate identical hashes) are possible. Malicious actors could create false collisions to implicate innocent people or hide their sharing of illicit material. False positives do happen. Unfortunately, at least one law enforcement agency is treating the people on the receiving end of erroneous flagging as criminal suspects.

Responding to an information request from ICCL, the Irish police reported that NCMEC had provided 4,192 referrals in 2020. Of these, 409 of the cases were actionable and 265 cases were completed. Another 471 referrals were “Not Child Abuse Material.” The Irish police nonetheless stored “(1) suspect email address, (2) suspect screen name, [and] (3) suspect IP address.” Now 471 people have police records because a computer program incorrectly flagged them as having CSAM.
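To see how a computer program ends up incorrectly flagging someone, it helps to look at the difference between exact hash matching and the perceptual-style matching that tools like PhotoDNA rely on conceptually. The sketch below is a toy: the 64-bit hashes and the threshold are invented, and PhotoDNA's actual algorithm and parameters are not public. The point is that threshold-based matching is what catches re-encoded copies of known images, and it's also what opens the door to false positives:

```python
import hashlib

# Exact matching: any change to the file changes the hash completely.
a = hashlib.sha256(b"image bytes").hexdigest()
b = hashlib.sha256(b"image bytes, recompressed").hexdigest()
print(a == b)  # False: exact hashes only catch byte-identical files

# Perceptual-style matching (toy 64-bit hashes, invented for illustration):
# visually similar images get similar bit patterns, compared by Hamming distance.
def hamming(x: int, y: int) -> int:
    return bin(x ^ y).count("1")

known_hash   = 0xF0F0_1234_ABCD_0011  # hypothetical database entry
recompressed = 0xF0F0_1234_ABCD_0013  # same image, slightly altered
unlucky      = 0xF0F2_1234_ABCD_0011  # different image that happens to land close

THRESHOLD = 8  # illustrative cutoff
for h in (recompressed, unlucky):
    print(hamming(known_hash, h) <= THRESHOLD)  # True, True -> the second is a false positive
```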

Stripping encryption and forcing service providers to engage in client-side scanning will only increase the number of false positives. But much of what’s being proposed — both overseas and here in the United States — takes the short-sighted view that encryption must go if children are to be saved. To come up with better solutions, legislators and law enforcement need to be able to see past the barriers that immediately present themselves. Rather than focus on short-term hurdles, they need to recognize online communication methods will always be in a state of fluctuation. What appears to be the right thing to do now may become utterly worthless in the near future.

Think differently. Think long term. Think about protecting the privacy and security of all members of society—children and adults alike. By failing to consider the big picture, the U.K. Online Safety Act has taken a dangerous, short-term approach to a complex societal problem. The EU and U.S. have the chance to avoid the U.K.’s folly; they should do so. The EU proposal and the U.S. bills are not sensible ways to approach the public policy concerns of online abetting of CSAE. Nor are these reasonable approaches in view of the cyber threats our society faces. The bills should be abandoned, and we should pursue other ways of protecting both children and adults.

The right solution now isn’t to make everyone less safe and secure. Free world governments shouldn’t be in such a hurry to introduce mandates that lend themselves to abuse by government entities and that can be used to justify even more abusive surveillance methods deployed by autocrats and serial human rights abusers. Yes, the problem is important and should be of utmost concern. But that doesn’t mean governments should, for all intents and purposes, outlaw encryption just because it seems to be the quickest, easiest solution to a problem that’s often misrepresented and misperceived.

Filed Under: client side scanning, csam, encryption, susan landau
Companies: meta, ncmec

Elon’s ‘Zero Tolerance’ Policy On CSAM Apparently Does Not Apply To Conspiracy Theorist Accounts He Likes

from the not-how-it-works dept

You may recall that early on in Elon’s ownership of Twitter, he insisted that “removing child exploitation is priority #1” while exhorting his supporters to “reply in the comments” if they saw any.

Leaving aside that this is a ridiculously terrible process for having people report potential CSAM (Child Sexual Abuse Material) or, as some people prefer, CSEM (with the E standing for “exploitation”), there was little to no evidence of this actually being put into practice. Most of the people (one person told me everyone) who worked on the CSAM team were let go or left. Ella Irwin, who headed up trust & safety until she resigned two months ago (as far as I can tell no replacement has been named), made a bunch of statements about how the company was treating CSAM, but there was almost no evidence backing that up.

There were multiple reports of the CSAM mitigation process falling apart. There were reports of CSAM on the platform remaining up for months. Perhaps even worse (and risking serious legal consequences), the company claimed it had suspended 400k accounts, but reported only 8k to law enforcement, even though such reporting is required by law. Oh, and apparently Twitter’s implementation of PhotoDNA broke at some point, which is again incredibly serious, as PhotoDNA (for all its problems) remains a key tool for large sites in fighting known CSAM.

And yet the company still claims (on a Twitter-branded page, because apparently no one actually planned for the “X” transition) that it has a “zero tolerance” policy for CSAM.

The key parts of that page say both “We have a zero-tolerance child sexual exploitation policy on Twitter” and “Regardless of the intent, viewing, sharing, or linking to child sexual exploitation material contributes to the re-victimization of the depicted children.”

Anyway, that all leads up to the following. One of the small group of vocal and popular utter nonsense peddlers on the site, dom_lucre, had his account suspended. A bunch of other nonsense peddlers started wringing their hands about this, fearing that Musk was going soft and was now going to start banning “conservative” accounts. In response, Elon just came out and said that the account had posted CSAM, that only Twitter “CSE” staff had seen it, and that after removing the tweets in question, it had reinstated that guy’s account.

It’s worth noting that this person was among the hand-picked accounts who received money during Elon’s recent pay-for-stanning rollout.

Almost everything about this statement is problematic, and it’s the kind of statement any lawyer would have a heart attack over if Elon were their client. First off, blaming Twitter’s legacy code is getting old and less and less believable each time he does it. He could just say “we fired everyone who understood how stuff worked,” but he can’t quite get there.

Second, posting “the reason” for a suspension is, like in so many cases having to do with trust & safety, trickier and involves more nuances than Elon would ever think through. Just to scratch the surface, sometimes telling users why they were suspended can create more problems, as users try to “litigate” their suspension. It can also alert abusive users to who may have reported them, leading to further abuse. Posting the reason publicly can lead to even more issues, including the potential risk of defamation claims.

But, even more importantly, it’s not a zero tolerance policy if you reinstate the account. It really seems like an “Elon’s inner circle tolerance policy.”

The claim that the only people who saw the images were the CSE team seems… unlikely. Internet sleuths have sniffed out a bunch of replies to his now deleted post (which was up for four days on an account with hundreds of thousands of followers), suggesting that the content was very much seen.

Also, there are big questions about what process Twitter followed here, since deleting the content, telling the world about who was suspended for what, and then reinstating the account are not what one would consider normal. Did Twitter send the content to NCMEC? Did it report it to any other law enforcement? These seem like pretty big questions.

On top of that, viewing that content on Twitter itself could potentially expose users to criminal liability. This whole thing is a huge mess, with a guy in charge who seems to understand literally none of this.

He’s now making Twitter a massive risk to use. At a time when the company is begging advertisers to put their ads on the site, I can’t see how Elon choosing to reinstate someone who posted CSAM, which was left on the site for days, is going to win them back.

Filed Under: content moderation, csam, dom lucre, elon musk, zero tolerance
Companies: ncmec, twitter, x

Whatever Problem EARN IT Is Trying To Solve, It Doesn't

from the that-seems-like-a-problem dept

I’ve already talked about the potential 1st Amendment problems with the EARN IT Act and the potential 4th Amendment problems with it as well. But a recent post by Riana Pfefferkorn at Stanford raises an even bigger issue in all of this: what actual problem is EARN IT trying to solve?

This sounds like a simple question with a potentially simple answer, but the reality, once you start to dig in, suggests that either (1) the backers of EARN IT don’t actually know, or (2) if they do know, they know what they actually want is unconstitutional.

Supporters of EARN IT will say, simply, the problem they’re trying to solve is the prevalence of child sexual abuse material (CSAM) online. And, that is a real problem (unlike some other moral panics, CSAM is a legitimate, large, and extraordinarily serious problem). But… CSAM is already very, very illegal. So, if you dig in a little further, supporters of EARN IT will say that the problem they’re really trying to solve is that… internet companies don’t take CSAM seriously enough. But the law (18 USC 2258A) already has pretty strict requirements for websites to report any CSAM they find to NCMEC (the National Center for Missing & Exploited Children) — and they do. NCMEC reported that it received almost 21.4 million reports of CSAM from websites. Ironically, many supporters of EARN IT point to these numbers as proof that the websites aren’t doing enough, while also saying it proves they don’t have any incentive to report — which makes no sense at all.

So… is the problem that those 21.4 million reports didn’t result in the DOJ prosecuting enough abusers? If so… isn’t the problem somewhere between NCMEC and the DOJ? Because the DOJ can already prosecute for CSAM and Section 230 doesn’t get in the way of that (it does not immunize against federal criminal law). And, as Riana noted in her article, this very same Senate Committee just recently heard about how the FBI actually knew about an actual serial child sex abuser named Larry Nassar, and turned a blind eye.

And, if NCMEC is the problem (namely in that it can’t process the reports fast enough), then this bill doesn’t help at all there either, because the bill doesn’t give NCMEC any more funding. And, if the senators are correct that this bill would increase the reports to NCMEC (though it’s not clear why that would work), wouldn’t that just make it even more difficult for NCMEC to sort through the reports and alert law enforcement?

So… is the problem that companies aren’t reporting enough CSAM? If you read the sponsors’ myths and facts document, they make this claim — but, again, the law (with really serious penalties) already requires them to report any CSAM. Taking away Section 230 protections won’t change that. Reading between the lines of the “myths and facts” document, they seem to really be saying that the problem is that not every internet service proactively scans every bit of content, but as we’ve discussed that can’t be the problem, because if that is the problem, EARN IT has a massive 4th Amendment problem that will enable actual child sex abusers to suppress evidence!

Basically, if you look step by step through the potential problems that supporters of the bill claim it tries to solve, you immediately realize it doesn’t actually solve any of them. And, for nearly all of the potential problems, it seems like there’s a much more efficient and effective solution which EARN IT does not do. Riana’s post has a handy dandy table walking down each of these paths, but I wanted to make it even clearer, and felt that a table isn’t the best way to walk through this. So here is her chart, rewritten (all credit to her brilliant work):

If online services don’t report CSAM, in violation of 2258A, and the real problem is large-scale, widespread, pervasive noncompliance by numerous providers that knowingly host CSAM without removing or reporting it (NOT just occasional isolated incidents), then there’s a very long list of potential remedies:

If that’s the actual problem (which supporters imply, but when you try to get them to say it outright they hem and haw and won’t admit it), then it seems like any of the above list would actually be helpful here. And the real question we should be asking is why hasn’t the DOJ done anything here?

But what does EARN IT actually do?

Okay, so maybe the supporters will say (as they sometimes admit) that most websites out there actually do report CSAM under 2258A, but there are still some providers who don’t report it, and these are occasional, isolated instances of failure to report by multiple providers, OR repeated failure to report by a particular rogue provider (NOT a large-scale problem across the whole tech industry). If anything, that seems more probable than the first version, which doesn’t seem to be supported by any facts. However, here again, there are a bunch of tools in the regulator’s toolbox to deal with this problem:

Again, what it comes down to in this scenario is that the DOJ is not doing its job. The law is on the books, and the penalties can be pretty stiff (the first failure to report is $150,000 and each subsequent failure is another $300,000). If it’s true that providers are not doing enough here, such penalties would add up to quite a lot, and the question again should be: why isn’t the DOJ enforcing the law?
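
To give a rough sense of the scale involved, here’s a quick back-of-the-envelope sketch (my own illustration; only the statutory dollar figures come from the description above, and the violation counts are made up):

```python
# Back-of-the-envelope illustration of how 2258A's failure-to-report
# penalties accumulate. The $150,000 / $300,000 figures are the statutory
# amounts described above; the violation counts are hypothetical.

def total_penalty(unreported_incidents: int) -> int:
    """First knowing failure to report: $150,000; each one after that: $300,000."""
    if unreported_incidents <= 0:
        return 0
    return 150_000 + (unreported_incidents - 1) * 300_000

for n in (1, 10, 100):
    print(f"{n:>3} unreported incidents -> ${total_penalty(n):,}")
# 1 -> $150,000; 10 -> $2,850,000; 100 -> $29,850,000
```

In other words, if noncompliance were anywhere near as widespread as EARN IT’s backers claim, the DOJ is already sitting on an enormous amount of deterrence it has simply chosen not to use.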

But instead of exploring that, here’s what EARN IT actually does:

Okay, so next up, Riana points out that maybe it’s possible that the DOJ does regular investigations of websites failing to report CSAM in violation of 2258A, but those investigations are consistently resolved without charges or fines and do not become public. Then, there’s a pretty simple option for Congress:

But, instead, here’s what Congress is doing with EARN IT (stop me if you’ve heard this one before):

Okay, okay, so maybe the reality is that the DOJ does in fact criminally prosecute websites for 2258A violations, but the reason there is no public record of any such prosecution ever is that all such court records are under seal. This would be… odd, first of all, given that the DOJ loves to publicize prosecutions, especially over CSAM. But, again, here’s what Congress could do:

But, instead, here’s what EARN IT does:

So, maybe the real problem is simply that the DOJ seems to be ignoring any effort to enforce violations of 2258A. If that’s the case, Congress has tools in its toolbox:

Instead, EARN IT…

So… that’s basically all the possible permutations if the problem is — as some supporters claim repeatedly — that companies are regularly violating 2258A and not reporting CSAM that they find. And, in almost every case, the real questions then should be why isn’t the DOJ enforcing the law? And there are lots of ways that Congress should deal with that. But EARN IT does literally none of them.

About the only thing that supporters of EARN IT have claimed in response to this point is that, because EARN IT allows for state AGs and civil suits, it is “adding more cops to the beat” to take on failures to report under 2258A. But… that’s kinda weird. Because wouldn’t it make a hell of a lot more sense to first find out why the existing cops don’t bother? Because no one has done that. And, worse, when it comes to the civil suits, this response basically means “the DOJ doesn’t care to help victims of CSAM, so we’re leaving it up to them to take matters into their own hands.” And that doesn’t seem like a reasonable solution no matter how you look at it.

If anything, it looks like Congress putting the burden for the DOJ’s perpetual failings… on the victims of CSAM. Yikes!

Of course, there are other possible problems here as well, and Riana details them in the chart. In these cases, the problems aren’t with failure to report CSAM, but elsewhere in the process. So… if websites do properly report CSAM to NCMEC’s CyberTipline, perhaps the problem is that CSAM isn’t being taken down promptly enough or reported to NCMEC “as soon as reasonably possible” as required by 2258A(a)(1)(A)(i).

Well, then, as Riana notes, there are a few things Congress could do:

Instead, what EARN IT actually does is…

Okay, so if companies are reporting to NCMEC in compliance with 2258A, perhaps the problem is the volume of reports is so high that NCMEC is overwhelmed.

Well, then, the possible solutions from Congress would seem to be:

But, what EARN IT does is…

Okay, so maybe the websites do properly report CSAM to NCMEC, and NCMEC is able to properly alert the DOJ to the CSAM such that the DOJ should be able to go prosecute the actual abusers, but the DOJ doesn’t act on the reports providers make, and doesn’t make its own mandatory reports to Congress about internet crimes against children. That would be horrifying, but again, it would seem like there’s a pretty clear course of action for Congress:

All of those would help, if this is the problem, but instead, here’s what EARN IT actually does:

You might sense a pattern here.

And finally, perhaps websites do report CSAM in compliance with 2258A to NCMEC’s CyberTipline, and maybe NCMEC does relay important information to the DOJ… and horrifyingly, perhaps federal law enforcement is failing child sex abuse victims just as the FBI turned a blind eye to Larry Nassar’s abuse of dozens of child gymnasts for years.

Well, then it seems fairly obvious what Congress should do:

But here’s what EARN IT does in that situation:

As Riana summarizes:

No matter what the problem with online CSAM is, EARN IT isn’t going to fix it. It’s only going to make things worse, both for child victims and for everyone who uses the internet. The truth about EARN IT is that either there isn’t a serious noncompliance problem among providers that’s pervasive enough to merit a new law, but Congress just can’t resist using Section 230 as a political punching bag to harm all internet users in the name of sticking it to Big Tech… or there is a problem, but the DOJ is asleep at the wheel — and EARN IT is a concession that Congress no longer expects them to do their jobs.

Either option should be shameful and embarrassing for the bill’s supporters to admit. Instead, this horrible legislation, if it passes, will be hailed as a bipartisan victory that shows Congress can still come together across the aisle to get things done. Apparently, harming Americans’ rights online while making CSAM prosecutions harder is something both parties can agree on, even in an election year.

So, whatever problem the backers of EARN IT think they’re solving for, EARN IT doesn’t do it. That seems like it should be a big fucking deal. But, instead of responding to these points, the sponsors claim that people highlighting this “don’t care about CSAM.”

Filed Under: 2258a, csam, doj, earn it, encryption, fbi, reporting, surveillance
Companies: ncmec

Senator Graham Spreads A Bunch Of Nonsense About 'Protecting Digital Innocence' Online

from the moral-panics dept

We warned last week that Senator Lindsey Graham was holding a “but think of the children online” moral panic hearing. Indeed, it happened. You can watch the whole 2 hours, but… I wouldn’t recommend it (I did it for you, though). Most of it is the usual moral panic, technologically illiterate nonsense we’ve all come to expect from Congress. Indeed, in a bit of good timing, the Pessimist’s Archive just tweeted out a clip of a 1993 Senate hearing in which then-Senator Joe Lieberman flipped out about evil video games. Think of that clip, but stretched to two hours, with a wider array of nonsense:

It starts out with a prosecutor from South Carolina, Duffie Stone, moral panicking about basically everything. Encryption is evil. Children are being sex trafficked online. And, um, gangs are recruiting members with (gasp) music videos. Later he complains that some of those kids (gasp!) mock law enforcement in their videos. Something must be done! The second speaker, a law professor, Angela Campbell, claims that we need more laws “for the children!” She also goes further and says that the FTC should go after Google and others for not magically stopping scammy companies from existing. Then there was this guy, Christopher McKenna, from an organization (“Protect Young Eyes!”) dedicated to moral panics, telling all sorts of unbelievable anecdotes about evil predators stalking young people on Instagram and “grooming” them. Remember, the actual data on this kind of activity shows that it’s actually quite rare (not zero, and that’s not excusing it when it does happen, but the speaker makes it sound like every young girl on Instagram is likely to be at risk of sex trafficking). He also asks the government to require an MPAA/ESRB-style “rating” system for apps — apparently unaware that laws attempting to require such ratings have been struck down as unconstitutional, and that the MPAA/ESRB ratings only exist through voluntary agreements.

There’s also… um… this:

It’s the app where every kid, regardless of age, has access to the Discover News section, where they are taught how to engage in risky sexual behavior, such as hookup, group, anal, or torture sex, how to sell drugs, and how to hide internet activity from parents using “incognito mode.”

He’s describing Snapchat. I’ve used Snapchat for years and, uh, I’ve never come across any of that. Also, the complaint about incognito mode is… pretty messed up, considering that it’s a tool for protecting privacy. This is all straight from the standard moral panic playbook. Also, he claims that on Twitter “hardcore porn and prostitution was everywhere” — which is also news to me (and I use Twitter a lot). He also whines that VPNs are too easy to get — and then later whines that it’s “too hard” to protect our privacy. Um, making VPNs harder to get will harm our privacy. It’s a hodgepodge of self-contradictory nonsense.

There was also John Clark from NCMEC — the National Center for Missing and Exploited Children. NCMEC actually does good work in helping platforms screen out and spot child porn. However, Clark contributes to the scare-mongering about just how awful the internet is. He also flat out lies. At one point during the panel, Senator Ted Cruz asks Clark about FOSTA and what it’s done so far. Clark flat out lies and says that FOSTA took down Backpage. This is false. Backpage was taken down and its founders arrested before FOSTA was even signed into law.

The only semi-reasonable panelist was the last one, Stephen Balkam, from the Family Online Safety Institute. While McKenna mocks the idea that “parents have a role” by pointing out that parents can’t watch over their kids every hour of every day (duh), Balkam points out that what we should be doing is not watching over our kids all the time, but rather training them and educating them to know how to be good digital citizens online and to avoid trouble. But that kind of message was basically ignored by the Senators, because what fun is actually respecting our kids and teaching them how to be smart internet users? Instead, most of the panel focuses on crazy anecdotes and salacious claims about internet services that make them sound a hell of a lot more insane than any of those platforms actually are.

Later, Senator John Kennedy asks the guy from “Protect Young Eyes” if Apple can build a filter that will magically help parents block kids from ever seeing sexually explicit material. McKenna stumbles and admits he has no idea, leading Balkam to finally have to jump into the conversation (he’s the only panelist that no Senator had called on throughout the entire ordeal) to point out that all platforms have some forms of parental controls. But Kennedy cuts him off and says “but can it be done?” Balkam stutters a “yes,” which is not accurate — since Kennedy is asking for something impossible. But then Kennedy suggests that Congress write a law that requires companies like Apple and Google to install filters (something that’s already been ruled unconstitutional).

Kennedy’s idea is… nutty. He includes the obligatory “I don’t know how any of this is done” comment before suggesting a bunch of impossible ideas.

Could Apple, for example, design a program that a parent could opt into, and the instructions to Apple would be “design a program that will filter all information that my daughter or son may see that would be sexually exploitative”? Maybe “filter all pictures or written references to human genitalia.” Can that be done? … Isn’t that the short way home here?

[….]

So could we write legislation, or promulgate a rule, that says “here’s the thing that a reasonable parent would do to protect his or her child from seeing this stuff.” And we do that in conjunction with somebody that has the obvious expertise. And you filter everything. I don’t know how to do it. I can’t write software. Maybe it’s to prevent any pictures of human genitalia. Or prohibit any reference to sexual activity. I don’t know. The kids aren’t gonna like it, but that’s not who we’re trying to please here. Why couldn’t that be done?

Well, the Constitution is why it can’t be done, Senator. Also, a basic understanding of technology. Or the limits on filter technology. Block all mention of sexual activity? Sure, then kids will use slang. Good luck keeping up with that. Block all pictures of genitalia — then say goodbye to biology texts online. Or pages about breast cancer. This is all stuff that lots of people have studied for decades, and Kennedy is displaying his ignorance about the Constitution, the law, the internet, the technology, and just about everything else as well. Including kids.
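
To make the filtering point concrete, here’s a deliberately naive sketch (my own illustration, not anything proposed at the hearing or any real product) of the kind of keyword filter Kennedy seems to be imagining, and both of the ways it fails at once:

```python
# A deliberately naive keyword filter of the sort a "block everything sexual"
# mandate imagines. The blocklist and test strings are made up for illustration.

BLOCKLIST = {"sex", "sexual", "genitalia", "porn"}

def is_blocked(text: str) -> bool:
    # Strip basic punctuation from each word and check it against the blocklist.
    words = (w.strip(".,!?;:") for w in text.lower().split())
    return any(w in BLOCKLIST for w in words)

# Over-blocking: a health/biology page gets caught because of one word.
print(is_blocked("Diagram of human genitalia, from a biology textbook"))  # True

# Under-blocking: slang and misspellings sail right through.
print(is_blocked("wanna hook up l8r? ;)"))                                # False
```

Keeping a list like that current against ever-shifting slang is exactly the whack-a-mole problem described above, and it’s why “just filter it” has never been as simple as it sounds — before you even get to images.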

Balkam points out that there are lots of private companies already making such filters, but Kennedy keeps saying “can we write a law” and “can we require every device have these filters,” and Balkam looks panicked, noting he has no idea whether or not they could write such a law (answer: they cannot, at least not if they want it to pass Constitutional muster).

Senator Blackburn… brings up Jeffrey Epstein. Who, as far as we know… didn’t use the internet to prey on girls. But according to Blackburn, Epstein proves the problems of the internet. Because. Senator Hawley then completely makes up a claim that YouTube is deliberately pushing kids to pedophiles and refuses to do anything about it. He claims — incorrectly — that Google admitted that it knows it sends videos of kids to pedophiles (and, he claims, allows the pedophiles to contact the kids) and that it deliberately has decided not to stop this. This misrepresents… basically everything once again.

Senator Thom Tillis then grandstands that it’s all the parents’ fault — and if a kid gets a mobile phone and lies about his age, we should be… blaming the parents for “giving the kids a lethal device.” No hyperbole and grandstanding there, huh? He’s also really focused on “lethality.” He later claims that the internet content itself is “lethal.”

Towards the end, the Senators all gang up on Section 230. Senator Cruz asks his FOSTA question (leading NCMEC’s Clark to falsely state that it was necessary to take down Backpage), and then Blumenthal calls 230 “the elephant in the room” and suggests that there needs to be a “duty of care” to get companies to do anything. It seems like Hawley is already gone by this time, but no one seems to point out that any such duty of care would likely lead to much greater censorship on these platforms, in direct contrast with Hawley’s demand that the companies censor less.

Nevertheless, Senator Graham closes the hearing by saying that he thinks the companies need to “earn” their CDA 230 protections (which is part of Hawley’s nonsense bill). Graham suggests that Congress needs to come up with “best business practices” and platforms should only get 230 protections if they “meet those best business practices.”

Who knew the Republican Party was all about dictating business standards. What happened to the party of getting government out of business?

Who knows what will actually come out of this hearing, but it was mostly a bunch of ill-informed or misinformed, technologically illiterate grandstanding and moral panic nonsense. In other words, standard operating procedure for most of Congress.

Filed Under: angela campbell, christopher mckenna, duffie stone, john clark, john kennedy, josh hawley, lindsey graham, moral panic, stephen balkam, ted cruz, think of the children
Companies: facebook, google, ncmec, snapchat