As I campaign in Fairfield County, I constantly speak to parents concerned about their kids. And for good reason: this generation of young people faces unprecedented rates of anxiety and depression, much of which can be traced back to increased time spent on social media and the internet.
Something needs to change: We need to make the internet, and especially social media, safer for kids and teens.
Tech companies are working internally to root out harmful content and practices like harassment, hate speech, and disinformation. I know something about their current efforts — I worked to prevent child exploitation on Facebook and Instagram as a member of the Meta Child Safety Team. But it’s clear that government has a critical role to play in establishing safety standards for companies to uphold.
We need consumer protections so that companies design their sites with the safety of young people in mind. That’s why, if elected, I will author legislation to provide guardrails for our youngest internet users, modeled on a bipartisan bill that recently passed unanimously in the California State Senate.
The California legislation gets tech regulation right. It requires that tech companies provide strong privacy settings by default, rather than forcing users to opt in through complicated processes. It also switches off geolocation services that track where people are when they access a website — and, importantly, it bans “nudge” techniques that encourage unsuspecting kids to provide additional personal data.
Here in Connecticut, it’s time for a similar Age-Appropriate Design Code to ensure online platforms are built to take the wellbeing of children into account.
Now, I’m not an anti-technologist. The social internet can be a powerful tool for our communities to connect and thrive. In fact, I currently work with other online safety professionals at the Integrity Institute to advise tech companies, regulators, and legislators on ways to make our internet better. But we need ethical design standards, so our young people aren’t sucked into rabbit holes of autoplayed content, often age-inappropriate and algorithmically optimized to grab their attention, while unwittingly having their personal data harvested.
As policymakers, keeping our community safe should be our foremost priority, but there is a major gap in federal law protecting kids online. The Children’s Online Privacy Protection Act of 1998 is limited to online services specifically aimed at kids — and only protects users under the age of 13. It does not address websites with a broad audience, even though the vast majority of teens use popular social media sites.
Here is where we can come in. Connecticut has a history of taking the lead in protecting its residents. We passed several first-in-the-nation gun-safety laws after the tragedy in Sandy Hook. In the wake of the disastrous Dobbs decision, we led the way with reproductive rights legislation that protects medical providers and patients traveling from other states for abortion care. Now we can again take the lead and safeguard our most vulnerable residents. By passing a “CT Kids’ Code,” our government can stand with concerned parents around the state and hold web and social media sites accountable for our communities’ safety.
This isn’t a partisan issue. Everyone — Democrats, Republicans, and independents — should be able to get behind making technology safer for kids. If elected, I promise to bring my professional background in online safety to bear on this goal and lead the fight to pass an Age-Appropriate Design Code in Connecticut.