We founded Dreamwidth in 2008 with several guiding principles in mind, among them protecting your privacy and giving you as much control as possible over your data. Given those principles, we wish we could support California's AB 2273, the California Age-Appropriate Design Code Act, signed into law in September 2022 to put restrictions into place for children's online data privacy and protection. It's an important issue, and too many companies out there don't pay enough attention to protecting their users.
Unfortunately, the law as California wrote it is not a data privacy act: it's a backdoor censorship bill that threatens anonymity online, will require privacy-and-anonymity-conscious sites such as Dreamwidth to collect more identifying data about our users than we want or need to, and will force sites to remove or restrict access to content the state of California feels is "harmful to children", independent of their own editorial judgement. Like the Communications Decency Act of 1996, whose indecency provisions were struck down in the landmark Supreme Court case Reno v. ACLU, 521 U.S. 844 (1997), we feel the law uses vague and undefined terms to impose prior restraint on the protected speech of adults. Its age-verification requirements will force us to violate your privacy and place an undue burden upon your use of the site by requiring us to use invasive measures to verify your ages: proposals for how sites should verify the ages of their users include requiring copies of government-issued IDs (which would end anonymity online) and forcing sites to adopt unvalidated and scientifically bogus facial recognition technology to estimate user ages (which, in addition to ending anonymity online and being a breathtaking overreach, shuts out everyone who doesn't have access to a device that can capture images or video, blind people who struggle with the lack of visual cues during the facial recognition process, and everyone whose age that technology mis-estimates).
Worse, though, the law also allows the state of California to determine, through an unaccountable administrative process, what content is "harmful to children" and force sites to age-lock that content. That will force us to designate a large amount of legal, protected speech as available only to accounts that have verified their age as being over 18. Given the terrible, hot-button political environment today, and judging by laws that other states have already passed or are considering about what sort of content is "harmful to children", that provision could require us to age-lock an entry written by a Black Dreamwidth user talking about experiencing police violence, a Muslim user talking about experiencing discrimination at work because of her choice to wear a hijab, a trans user talking about seeing their doctor for gender-affirming health care, a user talking about the process of accessing abortion services -- or even an entry that I post with an offhanded reference to my wife bringing me candy that went on half-price sale after Valentine's Day. There's a long history of content by marginalized people talking about their lives being considered "harmful to children", and we have no interest in furthering that disparate impact at the government's directive. We offer you the ability to age-lock your content so you have more control over who can see it, and we object most strenuously to the idea that the government should force us to force you to use that ability when you don't want to.
There's been no shortage of terrible online content regulation bills that we strenuously oppose lately, so why are we telling you about this one? Because we were invited to provide a third-party declaration in support of the motion for a preliminary injunction to stop the law from taking effect in NetChoice v. Bonta, the legal effort to invalidate the law as unconstitutional. Our declaration demonstrates to the court the ways the law as written will impose a significant undue burden on small sites like us and on our users, and provides some examples of how the law will have a significant disparate impact on marginalized groups. We're proud to contribute in some small way to the fight against this terrible overreach of a bill. As a small site with a legal budget of, like, $3.81 and the lint I turned out of the pockets of my hoodie, we're thankful to industry advocacy group NetChoice for leading the fight (and giving us the chance to stand up with them) and to the kickass lawyers at Davis Wright Tremaine, who have been a delight to work with throughout the process of turning my tl;dr rant about why this is a terrible bill into something that we hope the court will find helpful.
You can read our declaration, which was filed today with the motion asking the court to stop the law from going into effect. The full docket for the lawsuit is available via RECAP, including the motion for injunction.
no subject
I don't doubt what you've explained of the law, but basically if it passed, that means California gets to force every site with content that's "objectionable" to make users identify themselves, whether they're in California or not? Does California get to decide what people in, say, France see?
no subject
The law also doesn't only apply to sites that have "objectionable" content: it applies to every site, period. (Well, technically they tried to restrict it only to larger sites, but they did a really bad job of it: it also applies to smaller sites that buy, sell, or transmit user data, and you can't run a site without transmitting user data to someone for services you can't provide yourself. It applies to us because we use a third-party payment processor, for instance, and possibly -- did I mention it's vague and badly written -- also applies to anyone who uses caching or content delivery services, DDoS protection services, or possibly even anyone who uses cloud-based webhosting.)
The mandated content restrictions are, technically, only applicable to people located in California -- but, again, since sites can't reliably determine who's in California, the only way to guarantee that you don't violate the law is to restrict that content everywhere. Except AB 2273 explicitly contradicts other states' (and even federal) equally terrible content moderation and privacy laws! (Most of which are stayed at least until the Supreme Court rules on the two upcoming content moderation cases it's hearing this term, but that's a whole 'nother rabbit hole to go down that would take me a week to explain, heh.) So yeah, California saying "you can't show this kind of content to kids" would mean most sites would just restrict that type of content to people verified as being 18+ worldwide, because building "age-lock this, but only in California" is almost impossible. That's one of the issues raised in the motion for an injunction, because something called the dormant Commerce Clause says that California can't interfere with a transaction between somebody in Maryland and somebody in Illinois -- which is what this law would do. (It's like the fifth unconstitutional thing about it!)
There are a lot of problems with the law, but the biggest is what's known as the "chilling effect": sites (and users!) will refrain from speech they would have otherwise engaged in out of fear of causing problems with this law. That's one of the many reasons we oppose it: we don't have the legal budget to keep a lawyer on retainer to review every code change and every bit of "possibly harmful to children" content, so unless we wanted the state of California to be able to, at any point, impose massive fines that would put us out of business in three seconds flat, we would need to play it safe: restrict a huge amount of content that we don't want to restrict, demand a huge amount of personally-identifying information that we don't want to have (and you don't want to give us!), and slow down development even further, because literally every line of code changed would prompt the need to write up a detailed, exhaustive analysis that the state could demand we turn over at any point.
...it's a really bad law, is what I'm saying. Heh.
no subject
But YIKES.
no subject
I only use my web cam for work meetings because there's a standing directive to do so under a certain number of people in the meeting, otherwise I wouldn't use it at all...
And I've seen what facial recognition tech does, and that phrase I first mentioned comes to mind.
Thank you for helping fight this one.
no subject
Facial recognition technology is not ready for prime time, at all. The algorithms are a joke, and are heavily biased towards white men.
I agree, "Fuck that noise!"
no subject
I sincerely hope that there are lawyers lining up and salivating at the prospect of antitrust lawsuits there, because wow does that smell like ulterior motives.
no subject
I think it's significant that the UK government eventually refused to implement its own age verification requirement from the Digital Economy Act, which all the market leaders were bidding their current technology for.
no subject
"(This law is basically a word-for-word copy of a UK law that was already passed; most of us small sites in the US who don't have a UK/EU business presence are planning on ignoring that one and letting the UK block us if they want, but we can't really do that with California.)"
I will follow this up. Thank you.
no subject
https://en.wikipedia.org/wiki/Online_Safety_Bill
no subject
My state is doing equally stupid things, though none so far that would affect people outside the state.