Web Filter Refined: Teen Builds His Own, More Nuanced Tool /article/web-filter-refined-teen-builds-his-own-more-nuanced-tool/ Thu, 15 Aug 2024 16:30:00 +0000

Like most kids, Aahil Valliani has been frustrated by the filters that his school uses to block inappropriate websites. Often, he has no idea why certain sites are blocked, especially when his web browsing is tied to his schoolwork.

Many students in this situation find a way around their districts' web filters. They access the internet on their phones instead, or use proxy servers or virtual private networks to essentially access a different, unfiltered internet. Aahil, searching for a more systemic solution, teamed up with his younger brother and father to start a company called Safe Kids, raise almost $2 million in venture funding, and design a better filter.

As The Markup, which is part of CalMatters, reported in April, almost all schools filter the web to comply with the federal Children's Internet Protection Act and qualify for discounted internet access, among other things. Most schools The Markup examined used filters that sort all websites into categories and block entire categories at once. Others scan webpages for certain off-limits keywords, blocking websites on which they appear regardless of the context. In both cases, the filters are blunt tools that result in overblocking and sometimes keep kids from information about politicized topics like sex education and LGBTQ resources.




Aahil, now 17, points out that schools' overly strict controls disappear as soon as kids graduate. "That's a recipe for disaster," he said. Kids, he contends, need to learn to use the internet safely while trusted adults are nearby, so they are ready to make good decisions on their own later.

The Safe Kids filter turns web blocking into a teachable moment, explaining why sites are blocked and nudging students to stay away from them of their own accord. It uses artificial intelligence to assess the intent of a student's search, reducing the number of blocks students see while conducting legitimate academic research. One example: if a student searches for Civil War rifles for a class assignment, Safe Kids would allow it. If a student tries to shop for an AK-47, it wouldn't. Other filters would block both.

The filter also keeps student browsing data private, storing only categories of websites accessed, not URLs or search terms themselves. And it works through a Chrome browser extension, which means students can't simply get around it with a proxy server or VPN while using that browser.
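The two ideas described above, intent-aware blocking and category-only logging, can be sketched in a few lines of Python. Everything in this sketch is hypothetical: the class names, the keyword-based `classify_intent` stub (a stand-in for the AI model), and the category labels are illustrative inventions, not Safe Kids' actual code or taxonomy.

```python
from dataclasses import dataclass, field

# Hypothetical category labels; Safe Kids' real taxonomy is not public.
BLOCKED_CATEGORIES = {"weapons-commerce", "adult", "gambling"}

@dataclass
class FilterDecision:
    allowed: bool
    category: str
    explanation: str

@dataclass
class PrivacyLog:
    """Stores only category counts, never URLs or raw search terms."""
    counts: dict = field(default_factory=dict)

    def record(self, category: str) -> None:
        self.counts[category] = self.counts.get(category, 0) + 1

def classify_intent(query: str) -> str:
    """Stand-in for the AI model that infers intent from context.
    A real system would use a trained classifier; this keyword stub
    only illustrates the interface."""
    q = query.lower()
    if ("buy" in q or "shop" in q or "for sale" in q) and \
            ("rifle" in q or "ak-47" in q or "gun" in q):
        return "weapons-commerce"
    if "rifle" in q or "war" in q:
        return "research"
    return "general"

def check_query(query: str, log: PrivacyLog) -> FilterDecision:
    category = classify_intent(query)
    log.record(category)  # category only; the query string is discarded
    if category in BLOCKED_CATEGORIES:
        return FilterDecision(False, category,
                              "This looks like shopping for a weapon, not schoolwork.")
    return FilterDecision(True, category, "Looks like legitimate research.")
```

The article's Civil War example maps onto this flow: a query about Civil War rifles classifies as research and passes, a query that reads like weapons shopping is blocked with an explanation, and in both cases only a category label, never the query itself, lands in the log.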

Safe Kids got its start during the early COVID-19 lockdowns. Sitting around the dinner table with his father, a tech entrepreneur; his mother, a self-employed fashion designer; and his younger brother Zohran, a budding computer scientist, Aahil got his family to strategize how to help all the kids getting sucked into dark corners of the web and battling the mental health consequences of their internet use.

Their idea, building off the invasive and ineffective filters the brothers saw in school, essentially puts better training wheels on the internet. Aahil said his father did a bit of hand-holding in these early days, helping find board members and angel investors, as well as the data scientists who would train the machine learning model behind the filter and psychologists who could craft and test the filter's hallmark pop-ups directing students toward more appropriate browsing. The company also spent time and money getting their designs patented. Aahil has three patents under his name and Safe Kids has five.

As Aahil and his family were preparing to chase seed funding for Safe Kids, the ACLU of Northern California was demanding that the Fresno Unified School District stop using a product called Gaggle, which districts use to monitor students' internet use, block potentially harmful content, and step in if student browsing patterns indicate they may need mental health supports. The problem, according to ACLU attorneys, was that Gaggle amounted to intrusive surveillance, trampling on students' privacy and free speech rights.

The Electronic Frontier Foundation levied similar accusations against another web filter called GoGuardian after getting records from 10 school districts, including three in California, that revealed the extent of the software's blocking, tracking and flagging of student internet use during the 2022-23 school year, when Aahil was piloting Safe Kids. Jason Kelley, a lead researcher on EFF's GoGuardian investigation, looked into Safe Kids in response to an inquiry by The Markup. Accustomed to pointing out how bad filters are, he offered surprised praise for Safe Kids, commending its focus on privacy, its open source code that offers transparency about its model, and its context-specific blocking.

"This is, really, I think, an improved option for all the things that we are generally concerned about," Kelley said.

So far, Safe Kids has not been able to break into the school market. Still, Aahil hopes to one day sign a contract with a school district, and he is marketing to parents in the meantime, offering them a way to put guardrails on their kids' home internet use. While Safe Kids started out charging for its filter, Aahil said an open source, free version will be released next month.

One of the company patents is for a "pause, reflect, and redirect" method that leans on child psychology to teach kids healthy browsing habits when they try to access an inappropriate website.

"When kids go to a site the first time, we consider that a mistake," Aahil said. "We tell kids why it's not good for them and kids can make a choice."

For example, if a student tries to play games during a lesson, a pop-up would say, "This isn't schoolwork, is it?" Students can click a "take me back" button or "tell me more" link to get more information about why a given site is blocked. When students repeatedly try to access inappropriate content, their browsing is further restricted until they address the issue with an adult. If that content indicates a student might be in crisis, the user is advised to get help from an adult, and in a school setting, a staff member would get an automated alert.
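The escalation flow described above, warn first, tighten restrictions after repeated attempts, alert staff on crisis signals, can be sketched as a small stateful object. The threshold, category names, and class below are hypothetical illustrations of the patented "pause, reflect, and redirect" idea, not its actual implementation.

```python
from collections import Counter

# Hypothetical parameters; the real method's thresholds are not public.
WARN_LIMIT = 3                       # repeats tolerated before browsing tightens
CRISIS_CATEGORIES = {"self-harm"}    # categories that trigger a staff alert

class PauseReflectRedirect:
    def __init__(self):
        # Count blocked visits per (student, category) pair.
        self.attempts = Counter()

    def on_blocked_visit(self, student: str, category: str) -> dict:
        """Return the response shown to the student for one blocked visit."""
        self.attempts[(student, category)] += 1
        n = self.attempts[(student, category)]
        return {
            "message": "This isn't schoolwork, is it?",
            "options": ["take me back", "tell me more"],
            "restricted": n > WARN_LIMIT,              # escalate after repeats
            "alert_staff": category in CRISIS_CATEGORIES,
        }
```

A first blocked visit only shows the reflective pop-up; `restricted` flips only after repeated attempts, mirroring the article's point that a first visit is treated as a mistake rather than an offense.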

The teen expects to keep building the company, even as he shifts his focus to college admissions this fall. A rising senior at the selective Thomas Jefferson High School for Science and Technology in Alexandria, Virginia, one of the nation's best public high schools, Aahil plans to major in business or economics and make a career out of entrepreneurship.

Safe Kids stands out in a web filtering market where products' blunt restrictions on the web have barely become more sophisticated over the last 25 years.

Nancy Willard, director of Embrace Civility LLC, has worked on issues of youth online safety since the mid-1990s. She submitted testimony for the congressional hearings that resulted in passage of the Children's Internet Protection Act in 2000 and describes the filtering company representatives who showed up as snake oil salesmen, selling a technology that addresses a symptom, not the root of a problem.

"We need to prepare kids to manage themselves," Willard said. When traditional filters block certain websites with no explanation, kids don't learn anything, and they're often tempted to just circumvent the software.

"This approach helps increase student understanding, and hopefully there's a way also in the instructional aspects (to increase) their skills," she said about Safe Kids.

Students on Chromebooks in particular can't circumvent Safe Kids, and its design aims to keep them from wanting to. Now Aahil and his family just need to find buyers.

Kelley said he's not surprised Safe Kids hasn't broken in yet, given the "hardening" of school security and student safety efforts over the last decade. "We've gone from having cameras and some pretty standard filters to having metal detectors, and locked doors, and biometrics, and vape detectors in the bathrooms, and these much more strict filters and content moderating control software," he said, "and all this is hard to undo."

This was originally published by The Markup, which is part of CalMatters.

Lawmakers Duel With Tech Execs on Social Media Harms to Youth Mental Health /article/senate-grills-tech-ceos-on-social-media-harms/ Wed, 31 Jan 2024 23:20:00 +0000

During a hostile Senate hearing Wednesday that sometimes devolved into bickering, lawmakers from across the political spectrum accused social media companies of failing to protect young people online and pushed rules that would hold Big Tech accountable for youth suicides and child sexual exploitation.

The Senate Judiciary Committee hearing in Washington, D.C., was the latest act in a bipartisan effort to bolster federal regulations on social media platforms like Instagram and TikTok amid a growing chorus of parents and adolescent mental health experts warning the services have harmed youth well-being and, in some cases, pushed them to suicide. 

In an unprecedented moment, Meta founder and CEO Mark Zuckerberg, at the urging of Missouri Republican Sen. Josh Hawley, stood up and turned around to face the audience, apologizing to the parents in attendance who said their children were damaged, and in some cases died, because of his company's algorithms.




"I'm sorry for everything you've all gone through," said Zuckerberg, whose company owns Facebook and Instagram. "It's terrible. No one should have to go through the things that your families have suffered."

Senators argued the companies, and tech executives themselves, should be held legally responsible for instances of abuse and exploitation under tougher regulations that would limit children's access to social media platforms and restrict their exposure to harmful content.

"Your platforms really suck at policing themselves," Sen. Sheldon Whitehouse, a Rhode Island Democrat, told Zuckerberg and the CEOs of X, TikTok, Discord and Snap, who were summoned to testify. Section 230 of the Communications Decency Act, which allows social media platforms to moderate content as they see fit and generally provides immunity from liability for user-generated posts, has routinely shielded tech companies from accountability. As youth harms persist, he said, those legal protections are "a very significant part of that problem."

Whitehouse pointed to a lawsuit against X, formerly Twitter, that was filed by two men who claimed a sex trafficker manipulated them into sharing sexually explicit videos of themselves over Snapchat when they were just 13 years old. Links to the videos appeared on Twitter years later, but the company allegedly refused to take action until after it was contacted by a Department of Homeland Security agent and the posts had generated more than 160,000 views. The suit was dismissed by the Ninth Circuit, which cited Section 230.

鈥淭hat’s a pretty foul set of facts,鈥 Whitehouse said. 鈥淭here is nothing about that set of facts that tells me Section 230 performed any public service in that regard.鈥

In an opening statement, the Democratic committee chair, Sen. Dick Durbin of Illinois, offered a chilling description of the harms inflicted on young people by each of the social media platforms represented at the hearing. In addition to Zuckerberg, the executives who testified were X CEO Linda Yaccarino, TikTok CEO Shou Chew, Snap co-founder and CEO Evan Spiegel and Discord CEO Jason Citron.

"Discord has been used to groom, abduct and abuse children," Durbin said. "Meta's Instagram helped connect and promote a network of pedophiles. Snapchat's disappearing messages have been co-opted by criminals who financially extort young victims. TikTok has become a, quote, 'platform of choice' for predators to access, engage and groom children for abuse. And the prevalence of [child sexual abuse material] on X has grown as the company has gutted its trust and safety workforce."

Citron testified that Discord has "a zero tolerance policy" for content that features sexual exploitation and that it uses filters to scan and block such materials from its service.

"Just like all technology and tools, there are people who exploit and abuse our platforms for immoral and illegal purposes," Citron said. "All of us here on the panel today, and throughout the tech industry, have a solemn and urgent responsibility to ensure that everyone who uses our platforms is protected from these criminals both online and off."

Lawmakers have introduced a slate of regulatory bills that have gained bipartisan traction but have failed to become law. Among them is the Kids Online Safety Act, which would require social media companies and other online services to take "reasonable measures" to protect children from cyberbullying, sexual exploitation and materials that promote self-harm. It would also mandate strict privacy settings when teens use the online services. Other proposals include a measure that would require the platforms to report suspected drug activity to the police (some parents said their children overdosed and died after buying drugs on the platforms) and a bill that would hold them accountable for hosting child sexual abuse materials.

In their testimonies, each of the tech executives said they have taken steps to protect children who use their services, including features that restrict certain types of content, limit screen time and curtail the people they're allowed to communicate with. But they also sought to distance their services from harms in a bid to stave off regulations.

"With so much of our lives spent on mobile devices and social media, it's important to look into the effects on teen mental health and well-being," Zuckerberg said. "I take this very seriously. Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes."

Zuckerberg cited a report by the National Academies of Sciences, Engineering and Medicine, which concluded there is a lack of evidence to confirm that social media causes changes in adolescent well-being at the population level and that the services could carry both benefits and harms for young people. While social media websites can expose children to online harassment and fringe ideas, researchers noted, the services can be used by young people to foster community.

In October, 42 state attorneys general sued Meta, alleging that the social media giant knowingly and purposely designed tools to addict children to its services. U.S. Surgeon General Vivek Murthy has issued an advisory warning that social media sites pose a "profound risk of harm" to youth mental health, stating that the tools should come with warning labels. Among evidence of the harms is internal company research, disclosed by a whistleblower, which found that Instagram led to body-image issues among teenage girls and that many of its young users blamed the platform for increases in anxiety and depression.

Republican lawmakers devoted a significant amount of time during the hearing to criticizing TikTok for its ties to the Chinese government, calling out the app for collecting data about U.S. citizens, including in an effort to surveil American journalists. The Justice Department is reportedly investigating allegations that ByteDance, the Chinese company that owns TikTok, used the app to surveil several American journalists who report on the tech industry. 

In response, Chew said the company launched an initiative, dubbed "Project Texas," to prevent its Chinese employees from accessing personal data about U.S. citizens. But some employees claim the company has fallen short of that promise.

YouTube and TikTok are by far the platforms where teens spend the most hours per day, according to a 2023 Gallup survey, although Neal Mohan, the CEO of Google-owned YouTube, was not called in to testify.

Mainstream social media platforms have also been exploited for domestic online extremism. Earlier this month, for example, a teenager accused of carrying out a mass shooting at his Iowa high school reportedly maintained an active presence on Discord and, shortly before the rampage, commented in a channel dedicated to such attacks that he was "gearing up" for the mayhem. Just minutes before the shooting, the suspect appeared to capture a video inside a school bathroom and uploaded it to TikTok.

Josh Golin, the executive director of Fairplay, a nonprofit devoted to bolstering online child protections, blasted the tech executives' testimony for being little more than "evasions and deflections."

"If Congress really cares about the families who packed the hearing today holding pictures of their children lost to social media harms, they will move the Kids Online Safety Act," Golin said in a statement. "Pointed questions and sound bites won't save lives, but KOSA will."

The safety act, known as KOSA, has faced pushback from civil rights advocates on First Amendment grounds; they argue the proposal could be used to censor certain content. Sen. Marsha Blackburn, a Republican from Tennessee and KOSA co-author, said last fall the rules are important to protect "minor children from the transgender in this culture" and cited the legislation as a way to shield children from "being indoctrinated" online. The Heritage Foundation, a conservative think tank, endorsed the legislation, arguing that "keeping trans content away from children is protecting kids."

Snap鈥檚 Evan Spiegel and X鈥檚 Linda Yaccarino both agreed to support the Kids Online Safety Act.

Aliya Bhatia, a policy analyst with the nonprofit Center for Democracy and Technology, said that although lawmakers made clear their intention to act, their directives could end up doing more harm than good. She said the platforms serve as "peer-to-peer learning and community networks" where young people can access information about reproductive health and other important topics that they might not feel comfortable receiving from adults in their lives.

"It's clear that this is a really tricky issue, it's really difficult for the government and companies to decide what is harmful for young people," Bhatia said. "What one young person finds helpful online, another might find harmful."

South Carolina’s Sen. Lindsey Graham, the committee’s ranking Republican, said that social media companies can鈥檛 be trusted to keep kids safe online and that lawmakers have run out of patience.

"If you're waiting on these guys to solve the problem," he said, "we're going to die waiting."

As Advocates and Parents Rally, Youth Online Privacy Bills on Life Support /article/as-advocates-and-parents-rally-youth-online-privacy-bills-on-life-support/ Wed, 14 Sep 2022 21:07:24 +0000

Sen. Ed Markey was getting quizzed on the viability of new online privacy laws for children when he took a brief but awkward pause.

The Democrat from Massachusetts, who has long championed consumer privacy and become a key adversary of tech companies like Meta for monetizing user data, joined a Zoom call Tuesday evening to rally support for two bills he said would protect kids from being manipulated by social media algorithms. But he also brought some bad news: The legislation had "stalled" in Washington despite bipartisan support.

Advocates this week are making a push to get the bipartisan bills, the Kids Online Safety Act and the Children's Online Privacy Protection Act 2.0, across the finish line. In a letter on Monday, 145 groups including Fairplay and Common Sense Media urged lawmakers to pass the legislation in the interests of protecting youth mental health, now considered at an all-time low in this country.




But Markey seemed to lay out a path requiring Herculean effort. 

"Only the paranoid survive," Markey said, adding that the legislation would pass if its supporters, and youth activists in particular, called their lawmakers and demanded they "pull this out of the pile of issues" and give it priority. "We're going to try to get it over the finish line, but we need you to just have your energy level go higher and higher for these final couple of months and we will get it done."

The legislative push comes a year after a Facebook whistleblower disclosed research showing that the social media app Instagram had a harmful effect on youth mental well-being, especially for teenage girls. The whistleblower, Frances Haugen, urged lawmakers to regulate social media companies, including Meta, which owns Facebook and Instagram and which she accused of pursuing "astronomical profits" while knowingly putting its users at risk. The leaked research revealed the company knew Instagram made "body image issues worse for one in three teen girls," who blamed the social media platform for driving "increases in the rate of anxiety and depression" and, for some, suicidal thoughts.

The Kids Online Safety Act would make tech companies liable if they expose young people to content deemed harmful, including materials that promote self-harm, eating disorders and substance abuse. It would also require parental controls that could be used to block adult content, and would require a study of systems to verify users' age "at the device or operating system level."

The Children's Online Privacy Protection Act 2.0, which expands a law that Markey championed in 1998 to cover older teens, would ban targeted advertisements directed at children and require companies to offer an "eraser button" that allows children and teens to remove their personal data.

Former Facebook employee Frances Haugen (Getty Images)

But deep-pocketed tech companies, Sen. Richard Blumenthal said Tuesday, are standing in the way. 

"Our obstacles here are the big tech lobbyists," he said. "They have armies of lobbyists. They pay them, they pay them very well. They hire them to block this legislation."

While the legislation is designed to protect kids, some digital privacy experts say the rules could come with significant unintended consequences, and could lead to an age-verification system where all web users are made to submit documentation like a driver's license, requiring them to hand over personal information to tech companies.

On the Zoom call to bolster support for the bills was Vinaya Sivakumar, a high school senior from Ohio, who created her first social media profile when she was 12. What started out as harmless, she said, quickly took a toll on her health.

"It just snowballed into something that constantly perpetuated actions and thoughts like self-harm and eating disorders and it was really never let out of my sight," said Sivakumar, referring to a stream of content she found harmful being fed to her by algorithms. "It almost encouraged me to make decisions that I didn't necessarily feel were mine and my mental health was in the worst state ever."

Kristin Bride, a mother and digital safety advocate from Oregon, implored lawmakers to pass the legislation for kids like her 16-year-old son Carson, who died by suicide in 2020 after he was "viciously bullied" by other kids on Snapchat who used third-party apps to conceal their identities. Last year, Bride sued Snap, the company that owns the social media app Snapchat, and accused it of lacking safeguards to protect children from harassment. In response, Snap suspended two of the apps, Yolo and LMK. But a similar app, NGL, has since cropped up.

"Until social media companies are held accountable for their harmful products, they will always put profit over people," Bride said, "and kids like Carson and so many others are just collateral damage."

Despite the heightened focus in Washington around digital rights and tech companies' use of user data for targeted advertising, broader digital privacy legislation has also struggled this year. A bill that would create a national digital privacy standard and limit the personal data that tech companies can collect about users has hit roadblocks, including opposition from House Speaker Nancy Pelosi.

Earlier this month, Ireland's Data Protection Commission fined Meta for violating European Union data privacy laws. The commission has been investigating the company over an Instagram setting that automatically sets the profiles of teenagers as public by default.

Meanwhile, Meta has begun to roll out new safety features for teens, including one that automatically routes new users younger than 16 to a version with limits on content deemed inappropriate.

The children's safety legislation, which would strengthen rules that haven't been updated for decades, has received support from a broad range of groups focused on youth well-being, including the American Psychological Association and The Jed Foundation. But it has drawn criticism from digital rights advocates including the Electronic Frontier Foundation, which argued in a report that while lawmakers deserve credit "for attempting to improve online data privacy for young people," the plan would ultimately "require surveillance and censorship" of children and teens "and would greatly endanger the rights, and safety, of young people online."

"Data collection is a scourge for every internet user, regardless of age," the report notes, but the legislation could ultimately force tech companies to further track their users. Surveillance of young people, the report argues, is damaging even in the healthiest household, "and is not a solution to helping young people navigate the internet."

Disclosure: Campbell Brown oversees global media partnerships at Meta. Brown co-founded 社区黑料 and sits on its board of directors.
