The EFF points out that alleged fixes to KOSA don’t go far enough, explaining why it’s still a censorship bill.
Earlier, we reported on a disturbing bill out of the US known as the Kids Online Safety Act (KOSA). In a nutshell, KOSA demands that social media websites collect identifiable information on every user, including things like who you are and where you live. If those sites don’t collect that information, they could face heavy government fines. From there, they would be expected to exercise a so-called “duty of care” and prevent people from accessing content deemed “controversial”, whatever the heck that is supposed to mean.
Critics have long pointed out that because state attorneys general can designate whatever they want as “controversial”, the bill will lead to heavy internet censorship. Specifically, content surrounding LGBTQ+ issues could very easily be classified as “controversial” or not safe for children to access, and platforms would be ordered to block it. As a result, many have concluded that this is basically a massive internet censorship bill.
Lawmakers have rightfully received heavy criticism over the bill and have walked things back a bit. In response, they tweaked the bill and claimed to have fixed all of its problems. The Electronic Frontier Foundation (EFF), however, says that those fixes don’t go anywhere near far enough:
For the past two years, Congress has been trying to revise the Kids Online Safety Act (KOSA) to address criticisms from EFF, human and digital rights organizations, LGBTQ groups, and others, that the core provisions of the bill will censor the internet for everyone and harm young people. All of those changes fail to solve KOSA’s inherent censorship problem: As long as the “duty of care” remains in the bill, it will still force platforms to censor perfectly legal content. (You can read our analyses here and here.)
Despite never addressing this central problem, some members of Congress are convinced that a new change will avoid censoring the internet: KOSA’s liability is now theoretically triggered only for content that is recommended to users under 18, rather than content that they specifically search for. But that’s still censorship—and it fundamentally misunderstands how search works online.
Our concern, and the concern of others, is that this bill will be used to censor legal information and restrict the ability for minors to access it, while adding age verification requirements that will push adults off the platforms as well. Additionally, enforcement provisions in KOSA give power to state attorneys general to decide what is harmful to minors, a recipe for disaster that will exacerbate efforts already underway to restrict access to information online (and offline). The result is that platforms will likely feel pressured to remove enormous amounts of information to protect themselves from KOSA’s crushing liability—even if that information is not harmful.
The ‘Limitation’ section of the bill is intended to clarify that KOSA creates liability only for content that the platform recommends. In our reading, this is meant to refer to the content that a platform shows a user that doesn’t come from an account the user follows, is not content the user searches for, and is not content that the user deliberately visits (such as by clicking a URL). In full, the ‘Limitation’ section states that the law is not meant to prevent or preclude “any minor from deliberately and independently searching for, or specifically requesting, content,” nor should it prevent the “platform or individuals on the platform from providing resources for the prevention or mitigation of suicidal behaviors, substance use, and other harms, including evidence-informed information and clinical resources.”
In layman’s terms, minors will supposedly still have the freedom to follow accounts, search for, and request any type of content, but platforms won’t have the freedom to share some types of content to them. Again, that fundamentally misunderstands how social media works—and it’s still censorship.
Anyone who supports freedom of expression should see this as a bad bill. You can’t say you support freedom of expression on the one hand while, on the other, advocating for the censorship of legal content you don’t like. Free speech doesn’t work that way. Saying that this is all done to protect the children doesn’t change any of that; it just puts a fig leaf on the situation. Even if you are freaked out that your kid is going to see something icky online, tools already exist today to address this. It is not up to the government to police such things in the first place.
Hopefully, this bill never passes and, if it does, it gets smacked down in the courts for being unconstitutional. The last thing the world needs is a Great Firewall of America.