internet safety – Techdirt

KOSA Won’t Make The Internet Safer For Kids. So What Will?

from the let's-think-this-through dept

I’ve been asked a few times now what to do about online safety if the Kids Online Safety Act is no good. I will take it as a given that not enough is being done to make the Internet safe, especially for children. I think there is enough evidence to show that while the Internet can be a positive for many young people, especially marginalized youth who find support online, there are also significant negatives that correlate with real-world harms and lead to suffering.

As I see it, there are three separate but related problems:

  1. Most Internet companies make money off engagement, so their incentives can be misaligned, especially when toxic content drives engagement.
  2. Trust & Safety is the linchpin of efforts to improve online safety, but it represents a significant cost to companies without a direct connection to profit.
  3. The tools used by Trust & Safety, like content moderation, have become a culture war football and many – including political leaders – are trying to work the refs.

I think #1 tends to be overstated, but X/Twitter is a natural experiment on whether this model is successful in the long run so we may soon have a better answer. I think #2 is understated, but it’s a bit hard to find government solutions here – especially those that don’t run into First Amendment concerns. And #3 is a bit of a confounding problem that taints all proposed solutions. There is a tendency to want to use “online safety” as an excuse to win culture wars, or at least tack culture war battles onto legitimate attempts to make the Internet safer. These efforts run headfirst into the First Amendment, because they are almost exclusively about regulating speech.

KOSA’s main gambit is to discourage #1 and maybe even incentivize #2 by creating a somewhat nebulous duty of care that basically says that if companies don’t have users’ best interests at heart in six described areas, then they can be sued by the FTC and state AGs. The problem is that the duty of care is largely directed at whether minors are being exposed to certain kinds of content, and this invites problem #3 in a big way. In fact, we’ve already seen politically connected anti-LGBTQ organizations like Heritage openly call for KOSA to be used against LGBTQ content, and Senator Blackburn, a KOSA co-author, connected the bill with protecting “minor children from the transgender.” This also means that this part of KOSA is likely to eventually fall to the First Amendment, as the California Age Appropriate Design Code (a bill KOSA borrows from) did.

So what can be done? I honestly don’t think we have enough information yet to really solve many online safety problems. But that doesn’t mean we have to sit around doing nothing. Here are some ideas of things that can be done today to make the Internet safer or prepare for better solutions in the future:

Ideas for Solving Problem #1

Ideas for Solving Problem #2

Ideas for Solving Problem #3

Matthew Lane is a Senior Director at InSight Public Affairs.

Filed Under: children, internet safety, kosa, parental tools, privacy

State Attorneys General Trash Internet Safety Study, But Still Can't Provide Data To Counter It

from the maybe-it's-not-such-a-big-threat-after-all dept

Last month, a wide-ranging panel of experts did a big study and found that the risks of online predators stalking kids on social networks were totally overhyped — something we’d seen in previous studies, though none as wide-ranging and comprehensive. These results shocked and upset the group of 49 state attorneys general who have been pushing hard to force social networks to implement a variety of mechanisms to “protect” against this threat that really isn’t that big. It’s not surprising that these AGs want to push this. It makes it look like they’re doing something to “protect the children,” at little cost to themselves. The public imagination, helped along by politicians and the press, has been falsely led to believe that these sites are crawling with child predators tricking children, but the truth is that such cases are extremely rare. That’s not to play down the seriousness of the few cases where it happens, but it’s hardly a major epidemic.

Still, the state AGs were none too pleased with the report’s results, and some of the more vocal social network haters have been trashing it for using outdated data. Of course, these AGs haven’t actually provided the up-to-date data that contradicts the report’s findings. So one well-respected online safety researcher, Nancy Willard, went out and found some recent data to look at. Adam Thierer summarizes her findings — but the quick version is that the recent data does, in fact, support the study’s original conclusion: there just isn’t that much predatory behavior happening on social networks. In fact, the report found that general chat rooms were much more risky than social networks. The key point:

The incidents of online sexual predation are rare. Far more children and teens are being sexually abused by family members and acquaintances. It is imperative that we remain focused on the issue of child sexual abuse — regardless of how the abusive relationship is initiated.

Focusing on social networks as being a problem is taking away resources from where the real threats are… all in an effort for some AGs to get some easy headlines.

Filed Under: attorneys general, internet safety, stats