Web Filter Refined: Teen Builds His Own, More Nuanced Tool
Frustrated by school web filters, one teenager created his own.
Like most kids, Aahil Valliani has been frustrated by the filters that his school uses to block inappropriate websites. Often, he has no idea why certain sites are blocked, especially when his web browsing is tied to his schoolwork.
Many students in this situation find a way around their districts' web filters. They access the internet on their phones instead, or use proxy servers or virtual private networks to essentially access a different, unfiltered internet. Aahil, searching for a more systemic solution, teamed up with his younger brother and father to start a company called Safe Kids, raise almost $2 million in venture funding, and design a better filter.
As The Markup, which is part of CalMatters, reported in April, almost all schools filter the web to comply with the federal Children's Internet Protection Act and qualify for discounted internet access, among other things. Most schools The Markup examined used filters that sort all websites into categories and block entire categories at once. Others scan webpages for certain off-limits keywords, blocking websites on which they appear regardless of the context. In both cases, the filters are blunt tools that result in overblocking and sometimes keep kids from information about politicized topics like sex education and LGBTQ resources.
Aahil, now 17, points out that schools' overly strict controls disappear as soon as kids graduate. "That's a recipe for disaster," he said. Kids, he contends, need to learn how to make good choices about how to use the internet safely when trusted adults are nearby so they are ready to make good decisions on their own later.
The Safe Kids filter turns web blocking into a teachable moment, explaining why sites are blocked and nudging students to stay away from them of their own accord. It uses artificial intelligence to assess the intent of a student鈥檚 search, reducing the number of blocks students see while conducting legitimate academic research. One example: if a student searches for Civil War rifles for a class assignment, Safe Kids would allow it. If a student tries to shop for an AK-47, it wouldn鈥檛. Other filters would block both.
The filter also keeps student browsing data private, storing only categories of websites accessed, not URLs or search terms themselves. And it works through a Chrome browser extension, which means students can't simply get around it with a proxy server or VPN while using that browser.
Safe Kids got its start during the early COVID-19 lockdowns. Sitting around the dinner table with his father, a tech entrepreneur; his mother, a self-employed fashion designer; and his younger brother Zohran, a budding computer scientist, Aahil got his family to strategize how to help all the kids getting sucked into dark corners of the web and battling the mental health consequences of their internet use.
Their idea, building on the invasive and ineffective filters the brothers saw in school, essentially puts better training wheels on the internet. Aahil said his father did a bit of hand-holding in these early days, helping find board members and angel investors, as well as the data scientists who would train the machine learning model behind the filter and the psychologists who could craft and test the filter's hallmark pop-ups directing students toward more appropriate browsing. The company also spent time and money getting their designs patented. Aahil has three patents under his name, and Safe Kids has five.
As Aahil and his family were preparing to chase seed funding for Safe Kids, the ACLU of Northern California was demanding that the Fresno Unified School District stop using a product called Gaggle, which districts use to monitor students' internet use, block potentially harmful content, and step in if student browsing patterns indicate they may need mental health support. The problem, according to ACLU attorneys, was that Gaggle amounted to intrusive surveillance, trampling on students' privacy and free speech rights.
The Electronic Frontier Foundation levied similar accusations against another web filter called GoGuardian after getting records from 10 school districts, including three in California, that revealed the extent of the software's blocking, tracking, and flagging of student internet use during the 2022-23 school year, when Aahil was piloting Safe Kids. Jason Kelley, a lead researcher on EFF's GoGuardian investigation, looked into Safe Kids in response to an inquiry by The Markup. Accustomed to pointing out how bad filters are, he offered surprised praise for Safe Kids, commending its focus on privacy, its open source code that offers transparency about its model, and its context-specific blocking.
"This is, really, I think, an improved option for all the things that we are generally concerned about," Kelley said.
So far, Safe Kids has not been able to break into the school market. Still, Aahil hopes to one day sign a contract with a school district, and he is marketing to parents in the meantime, offering them a way to put guardrails on their kids' home internet use. While Safe Kids started out charging for its filter, Aahil said an open source, free version will be released next month.
One of the company's patents is for a "pause, reflect, and redirect" method that leans on child psychology to teach kids healthy browsing habits when they try to access an inappropriate website.
"When kids go to a site the first time, we consider that a mistake," Aahil said. "We tell kids why it's not good for them and kids can make a choice."
For example, if a student tries to play games during a lesson, a pop-up would say, "This isn't schoolwork, is it?" Students can click a "take me back" button or "tell me more" link to get more information about why a given site is blocked. When students repeatedly try to access inappropriate content, their browsing is further restricted until they address the issue with an adult. If that content indicates a student might be in crisis, the user is advised to get help from an adult, and in a school setting, a staff member would get an automated alert.
The teen expects to keep building the company, even as he shifts his focus to college admissions this fall. A rising senior at the selective Thomas Jefferson High School for Science and Technology in Alexandria, Virginia, one of the nation鈥檚 best public high schools, Aahil plans to major in business or economics and make a career out of entrepreneurship.
Safe Kids stands out in a web filtering market where products' blunt restrictions on the web have barely become more sophisticated over the last 25 years.
Nancy Willard, director of Embrace Civility LLC, has worked on issues of youth online safety since the mid-1990s. She submitted testimony for the congressional hearings that resulted in passage of the Children's Internet Protection Act in 2000 and describes the filtering company representatives that showed up as snake oil salesmen, selling a technology that addresses a symptom, not the root of a problem.
"We need to prepare kids to manage themselves," Willard said. When traditional filters block certain websites with no explanation, kids don't learn anything, and they're often tempted to just circumvent the software.
"This approach helps increase student understanding, and hopefully there's a way also in the instructional aspects (to increase) their skills," she said about Safe Kids.
Students on Chromebooks in particular can't circumvent Safe Kids, and its design aims to keep them from wanting to. Now Aahil and his family just need to find buyers.
Kelley said he's not surprised Safe Kids hasn't found school buyers yet, given the "hardening" of school security and student safety efforts over the last decade. "We've gone from having cameras and some pretty standard filters to having metal detectors, and locked doors, and biometrics, and vape detectors in the bathrooms, and these much more strict filters and content moderating control software," he said, "and all this is hard to undo."