Biden Order on AI Tackles Tech-Enabled Discrimination in Schools /article/biden-order-on-ai-tackles-tech-enabled-discrimination-in-schools/ Tue, 31 Oct 2023 21:01:00 +0000 Updated Nov. 1

As artificial intelligence rapidly expands its presence in classrooms, President Biden signed an executive order Monday requiring federal education officials to create guardrails that prevent tech-driven discrimination. 

The order, which the White House called “the most sweeping actions ever taken to protect Americans from the potential risks of AI systems,” offers several directives that are specific to the education sector. The order dealing with emerging technologies like ChatGPT directs the Justice Department to coordinate with federal civil rights officials on ways to investigate discrimination perpetuated by algorithms.




Within a year, the education secretary must release guidance on the ways schools can use the technology equitably, with a particular focus on the tools’ effects on “vulnerable and underserved communities.” Meanwhile, an Education Department “AI toolkit” released within the next year will offer guidance on how to implement the tools so that they enhance trust and safety while complying with federal student privacy rules.

For civil rights advocates who have decried AI’s potentially unintended consequences, the order was a major step forward.

The order’s focus on civil rights investigations “aligns with what we’ve been advocating for over a year now,” said Elizabeth Laird, the director of equity and civic technology at the nonprofit Center for Democracy and Technology. Her group has called on the Education Department’s Office for Civil Rights to open investigations into the ways AI-enabled tools in schools could have a disparate impact on students based on their race, disability, sexual orientation and gender identity.

“It’s really important that this office, which has been focused on protecting marginalized groups of students for literally decades, is more involved in conversations about AI and can bring that knowledge and skill set to bear on this emerging technology,” Laird told The 74.

In guidance to federal agencies on Wednesday, the Office of Management and Budget spelled out the types of AI education technologies that pose civil rights and safety risks. They include tools that detect student cheating, monitor students’ online activities, project academic outcomes, make discipline recommendations or facilitate surveillance online and in person.

An Education Department spokesperson didn’t respond to a request for comment Monday on how the agency plans to respond to Biden’s order.

Schools nationwide have adopted artificial intelligence in divergent ways, from adaptive software that provides students individualized lessons to the growing use of chatbots like ChatGPT by both students and teachers. The technology has also generated heated debates over its role in exacerbating harms to at-risk youth, including educators’ use of early warning systems that mine data about students – including their race and disciplinary records – to predict their odds of dropping out of school.

“We’ve heard reported cases of using data to predict who might commit a crime, so very Minority Report,” Laird said. “The bar that schools should be meeting is that they should not be targeting students based on protected characteristics unless it meets a very narrowly defined purpose that is within the government’s interests. And if you’re going to make that argument, you certainly need to be able to show that this is not causing harm to the groups that you’re targeting.”

AI and student monitoring tools

An unprecedented degree of student surveillance has also been facilitated by AI, including online activity monitoring tools, remote proctoring software to detect cheating on tests and campus security cameras with facial recognition capabilities. 

Beyond its implications for schools, the Biden order requires certain technology companies to conduct AI safety testing before their products are released to the public and to provide their results to the government. It also orders new regulations to ensure AI won’t be used to produce nuclear weapons, recommends that AI-generated photos and videos be transparently identified as such with watermarks and calls on Congress to pass federal data privacy rules “to protect all Americans, especially kids.”

In September, the Center for Democracy and Technology released a report warning that schools’ use of AI-enabled digital monitoring tools, which track students’ behaviors online, could have a disparate impact on students – particularly LGBTQ+ youth and those with disabilities – in violation of federal civil rights laws. As teachers punish students for allegedly using ChatGPT to cheat on classroom assignments, a survey suggested that children in special education were more likely to face discipline than their general education peers. They also reported higher levels of surveillance and subsequent discipline as a result.

In response to the report, a coalition of Democratic lawmakers penned a letter urging the Education Department’s civil rights office to investigate districts that use digital surveillance and other AI tools in ways that perpetuate discrimination.

Education technology companies that use artificial intelligence could come under particular federal scrutiny as a result of the order, said consultant Amelia Vance, an expert on student privacy regulations and president of the Public Interest Privacy Center. The order notes that the federal government plans to enforce consumer protection laws and enact safeguards “against fraud, unintended bias, discrimination, infringements on privacy and other harms from AI.”

“Such protections are especially important in critical fields like healthcare, financial services, education, housing, law and transportation,” the order notes, “where mistakes by or misuse of AI could harm patients, cost consumers or small businesses or jeopardize safety or rights.”

Schools rely heavily on third-party vendors like education technology companies to provide services to students, and those companies are subject to Federal Trade Commission rules against deceptive and unfair business practices, Vance noted. The order’s focus on consumer protections, she said, “was sort of a flag for me that maybe we’re going to see not only continuing interest in regulating ed tech, but more specifically regulating ed tech related to AI.”

While the order was “pretty vague when it came to education,” Vance said it was important that it acknowledged AI’s potential benefits in education, including for personalized learning and adaptive testing.

“As much as we keep talking about AI as if it showed up in the past year, it’s been there for a while and we know that there are valuable ways that it can be used,” Vance said. “It can surface particular content, it can facilitate better connections to people when they need certain content.”

AI and facial recognition cameras

As school districts pour billions of dollars into school safety efforts in the wake of mass school shootings, security vendors have heralded the promises of AI. Yet civil rights groups have warned that facial recognition and other AI-driven technology in schools could perpetuate biases – and could miss serious safety risks.

Just last month, the gun-detection company Evolv Technology, which pitches its hardware to schools, acknowledged it was the subject of a Federal Trade Commission inquiry into its marketing practices. The agency is reportedly probing whether the company employs artificial intelligence in the ways that it claims. 

In September, New York became the first state to ban facial recognition technology in schools, a move that followed outcry when an upstate school district announced plans to roll out a surveillance camera system that tracked students’ biometric data.

A new Montana law bans facial recognition statewide with one notable exception: schools. Citing privacy concerns, the law adopted this year prohibits government agencies from using facial recognition, but includes a specific carveout for schools. One rural education system, the 250-student Sun River School District, employs a 30-camera security system from Verkada that uses facial recognition to track the identities of people on its property. As a result, the district has roughly one camera for every eight students.

In an email on Wednesday, a Verkada spokesperson said the company is reviewing Biden's order to understand its implications for the company.

Verkada offers a cautionary tale about the potential security vulnerabilities of campus surveillance systems. In 2021, the company suffered a massive data breach in which hackers claimed to have exposed the live feeds of 150,000 surveillance cameras – including those in place at Sandy Hook Elementary School in Newtown, Connecticut, the site of a mass shooting in 2012. An investigation conducted on behalf of the company found the breach was more limited, affecting some 4,500 cameras.

Hikvision has similarly made inroads in the school security market with its facial recognition surveillance cameras – including during a pandemic-era push to enforce face mask compliance. Yet the company, owned in part by the Chinese government, has also faced significant allegations of civil rights abuses and in 2019 was placed on a U.S. trade blacklist after being implicated in China’s “campaign of repression, mass arbitrary detention and high-technology surveillance” against Muslim ethnic minorities.

Though multiple U.S. school districts continue to use Hikvision cameras, a recent investigation found the company’s software still offered minority-recognition capabilities despite the company claiming for years it had ended the practice.

In an email, a Hikvision spokesperson didn’t comment on how Biden's executive order could affect its business, including in schools, but offered a letter the company shared with its customers in response to the investigation, saying an outdated reference to ethnic detection appeared on its website erroneously.

“It has been a longstanding Hikvision policy to prohibit the use of minority recognition technology,” the letter states. “As we have previously stated, that functionality was phased out and completely prohibited by the company in 2018.”

Data scientist David Riedman, who built a national database to track school shootings dating back decades, said that artificial intelligence is at “the forefront” of the school safety conversation and emerging security technologies can be built in ways that don’t violate students’ rights.

Riedman became a figure in the national conversation about school shootings as the creator of the K12 School Shooting Database but has since taken on an additional role as director of industry research and content for ZeroEyes, a surveillance software company that uses security cameras to ferret out guns. Instead of using facial recognition, the ZeroEyes algorithm was trained to identify and notify law enforcement within seconds of spotting a firearm. 

Gun detection – as opposed to facial recognition – can “evade privacy and bias concerns that plague other AI models,” the company says, and its internal research found that “only 0.06546% of false positives were humans detected as guns.”

“The simplicity” of ZeroEyes’ technology, Riedman said, puts the company in good standing as far as the Biden order is concerned.

“ZeroEyes isn’t looking for people at all,” he said. “It’s only looking for objects and the only objects it is trying to find, and it’s been trained to find, are images that look like guns. So you’re not getting student records, you’re not getting student demographics, you’re not getting anything related to people or even a school per se. You just have an algorithm that is constantly searching for images to see if there is something that looks like a firearm in them.”

However, false positives remain a concern. Just last week at a high school in Texas, a false alert from ZeroEyes prompted a campus lockdown that set off student and parent fears of an active shooting. The company said the false alarm was triggered by an image of a student outside who the system believed was armed based on shadows and the way his arm was positioned.

New Report: School Shootings Spawned ‘Digital Dystopia’ of Student Surveillance /article/new-report-school-shootings-spawned-digital-dystopia-of-student-surveillance/ Tue, 03 Oct 2023 18:48:00 +0000 Updated, Oct. 4

Reeled in by deceptive, fear-based marketing and an influx of federal cash, school leaders have purchased and pervasively deployed student surveillance tools while failing to consider their detrimental consequences for young people’s civil rights, a new ACLU report concludes.

In a youth survey accompanying the report, a majority of students expressed worries that the tools – designed to keep them safe – could actually cause harm, and a third said they “always feel” like they’re being watched.

The 61-page report, titled “Digital Dystopia,” also offers an in-depth look at the rise of schools’ reliance on surveillance technology over the last few decades, arguing the tools have failed to improve campus safety while subjecting students – particularly students of color and those who are undocumented, LGBTQ or from low-income households – to discrimination.




“The ed tech surveillance companies, after fanning the flames of fear, were making these broad statements about the efficacy of their products, about their ability to keep students safe” from threats like school shootings and suicide, despite a lack of evidence to back up their claims, report lead author and ACLU senior policy counsel Chad Marlow told The 74.

Rather than making kids safe, Marlow said, the tools could be damaging to their development and well-being. “The harm is actually significant and, by not acknowledging the harms that are caused, there’s less incentive to look at other interventions,” he said.


Three-quarters of students worry about at least one negative consequence of student surveillance, which includes the widespread proliferation of digital tools that monitor their online communications for references to sex, drugs, violence or self-harm, according to the online survey. Commissioned by the ACLU, the polling firm YouGov queried 502 teens throughout the country in October 2022. Nearly a quarter of respondents said that digital monitoring tools limit the resources they feel they can access online while a similar percentage worried the information collected about them could be shared with the police or be used against them in the future by a college or an employer. Some 27% feared the tools could be used for disciplinary purposes.

As a result, students alter their behaviors due to fears that “deviating from expectations is punishable in the world that they’re growing up in,” Marlow said. “What does that tell them about innovation or exploring new ideas?”

The survey findings mirror those of a separate report, released last month by the nonprofit Center for Democracy and Technology, which found that while a majority of parents and students still embrace digital tools that monitor students’ online behaviors, their support has dwindled over the last year.

Both reports identified detrimental effects of digital surveillance that researchers said run counter to federal civil rights laws that protect students from discrimination based on race, disability, sexual orientation or gender identity. 

In the student survey conducted by the Center for Democracy and Technology, researchers found that while districts bought digital monitoring tools to keep students safe, they are used regularly as discipline tools that routinely bring youth in contact with the police. LGBTQ+ youth and those with disabilities were significantly more likely to experience the harms of surveillance. For example, 65% of LGBTQ+ youth said they or someone they knew got into trouble due to online activity monitoring, compared to 56% of their straight and cisgender peers. Meanwhile, nearly a third of LGBTQ+ students said that they or someone they know has been “outed” by the technology.

In the absence of rigorous, independent research on the efficacy of school surveillance tools to improve campus safety, the ACLU report argues that schools are left to make purchasing decisions based on what the group called fear-based marketing tactics. Security companies hype the risks of school violence and student self-harm while overstating the utility of their products, the report says. Security industry lobbying efforts, meanwhile, have successfully steered hundreds of millions of dollars in government school safety spending toward unproven technologies. 

“It would be like going to buy a car and the only source of information is the car salesperson,” Marlow said. “That’s probably not the best way to make a car purchasing decision, but that’s what’s happening with student surveillance.”

The Security Industry Association, a trade group that represents security companies and lobbies on their behalf, didn’t immediately respond to a request for comment.

The ACLU survey results suggest, however, that students have a complicated relationship with school surveillance: While recognizing its potential harms, many also believe it serves its intended purpose. Specifically, 40% of students reported that surveillance technology makes them feel “safe” and 43% said it makes them feel “protected.” Meanwhile, just 14% said it makes them feel “anxious” and a fraction of respondents, 7%, said the tools made them feel “unsafe.”

Marlow said this support may be the result, at least in part, of successful marketing and a belief that few other options exist. 

“When you talk about keeping students safe, I think students are smart enough to realize that in too many places in this country, gun control is off the table,” he said. “Because of the dominance of money and power of the ed tech surveillance industry” in marketing and lobbying, he added, “the discussion is almost entirely centered around, ‘Do we use or do we not use student surveillance technologies?’” while alternatives like mental health screenings fail to receive similar consideration. “In that option, between a highly questionable, harmful protection or nothing at all, no one wants to pick nothing at all.”

While the report focuses largely on digital tools that monitor students’ behaviors online, it also questions the efficacy of surveillance cameras in creating physical safety for students in schools. Cameras have become nearly ubiquitous, with most public schools using them in the 2019-20 school year, according to the most recent data included in a U.S. Department of Education report released last month.

Meanwhile, just 55% of schools offered students mental health assessments, according to the most recent federal data, and 42% offered mental health treatment services. 

Despite a sharp rise in schools’ reliance on surveillance and other tools in the last two decades, the number of school shootings has grown.

There were a record 188 school shootings resulting in injuries or deaths in the 2021-22 school year, according to the federal report. That’s twice as many shootings on campus as the previous record – set just one year earlier. Placing security cameras in schools, Marlow argues, has failed to deter the very crimes they were installed to prevent. In an ACLU analysis of the 10 deadliest school shootings in the last two decades, for example, researchers found that surveillance cameras were present at eight of them, including in Parkland, Florida, and Uvalde, Texas.

Along with scrutiny from researchers and civil rights groups, schools’ use of digital monitoring tools has led to several lawsuits alleging they’re ineffective and violate students’ civil liberties.

In one class-action lawsuit, filed this year in California, the parents of two students claim the student surveillance company Securly harvested students’ data and sold the information to targeted advertising vendors without their knowledge or consent.

A separate federal negligence lawsuit, filed in 2021 in Oklahoma, accuses the student monitoring company Gaggle of being ineffective at keeping kids safe from self-harm. The lawsuit, filed by the parents of a 15-year-old boy who died by suicide, accuses the surveillance company and the state’s third-largest school district of failing to act on warning signs that could have prevented the teenager’s 2019 death.

The student submitted a “personal odyssey” essay in his freshman English class that was riddled with references to self-harm and suicide, but his teacher failed to act, the complaint alleges, giving him a grade of 100%. The district used Gaggle to identify and flag troubling student digital communications, including references to self-harm and suicide. Yet the lawsuit alleges the company “failed to notify school administration” about the student’s warning signs, including the essay titled “Running Out of Reasons” and an email exchange with a classmate in which the two contemplated a plan to “go out at the same time.”

A Gaggle spokesperson didn’t immediately respond to a request for comment. Securly spokesperson Josh Mukai called the lawsuit “baseless and uninformed.”

“Securly has never sold student data to third parties, nor have we ever used student data to target advertisements,” Mukai said in an email. “Securly’s suite of student safety solutions upholds the highest standards for student data privacy and complies with all international, federal and state privacy regulations.”

Gaggle Drops LGBTQ Keywords from Student Surveillance Tool Following Bias Concerns /article/gaggle-drops-lgbtq-keywords-from-student-surveillance-tool-following-bias-concerns/ Fri, 27 Jan 2023 12:15:00 +0000 Digital monitoring company Gaggle says it will no longer flag students who use words like “gay” and “lesbian” in school assignments and chat messages, a significant policy shift that follows accusations its software facilitated discrimination against LGBTQ teens in a quest to keep them safe.

A spokesperson for the company, which describes itself as a student safety company, cited a societal shift toward greater acceptance of LGBTQ youth – rather than criticism of its product – as the impetus for the change, part of a “continuous evaluation and updating process.”

The company, which uses artificial intelligence and human content moderators to sift through billions of student communications each year, has long defended its use of LGBTQ-specific keywords to identify students who might hurt themselves or others. In arguing the targeted monitoring is necessary to save lives, executives have pointed to the prevalence of bullying against LGBTQ youth and data indicating they’re at greater risk of suicide than their straight and cisgender classmates.




But in practice, Gaggle’s critics argued, the keywords put LGBTQ students at a heightened risk of scrutiny by school officials and, on some occasions, the police. Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity – often called outing – as a result of digital activity monitoring, according to survey results released in August by the nonprofit Center for Democracy and Technology. The survey encompassed the impacts of multiple monitoring companies that contract with school districts, such as GoGuardian, Gaggle, Securly and Bark.

Gaggle’s decision to remove several LGBTQ-specific keywords, including “queer” and “bisexual,” from its dictionary of words that trigger alerts was first reported in a Vice News documentary. It follows extensive reporting by The 74 into the company’s business practices and sometimes negative effects on students who are caught in its surveillance dragnet.

Though Gaggle’s software is generally limited to monitoring school-issued accounts, including those provided by Google and Microsoft, the company has acknowledged it can scan through photos on students’ personal cell phones if they plug them into district laptops.

The keyword shift comes at a particularly perilous moment, as Republican lawmakers in multiple states push anti-LGBTQ measures. Legislation has looked to curtail classroom instruction about sexual orientation and gender identity, ban books and classroom curricula featuring LGBTQ themes and prohibit transgender students from receiving gender-affirming health care, participating in school athletics and using restroom facilities that match their gender identities. Such a hostile political climate, along with pandemic-era disruptions, has contributed to an uptick in LGBTQ youth who have seriously considered suicide, a recent youth survey by The Trevor Project revealed.

The U.S. Education Department received 453 discrimination complaints involving students’ sexual orientation or gender identity last year, according to data provided to The 74 by its civil rights office. That’s a significant increase from previous years, including 2021, when federal officials received 249 such complaints. The Trump administration took a narrower approach to such cases, and complaints dwindled. In 2018, the Education Department received just 57 complaints related to sexual orientation or gender identity discrimination.

The increase in discrimination allegations involving sexual orientation or gender identity is part of a broader surge in civil rights complaints, according to data obtained by The New York Times. The total number of complaints for 2021-22 grew to 19,000, a historic high and more than double the previous year.

In September, The 74 revealed that Gaggle had donated $25,000 to The Trevor Project, the nonprofit that released the recent youth survey and whose advocacy is focused on suicide prevention among LGBTQ youth. The arrangement was framed on Gaggle’s website as a collaboration to “improve mental health outcomes for LGBTQ young people.”

The revelation was met with swift backlash on social media, with multiple Trevor Project supporters threatening to halt future donations. Within hours, the group announced it had returned the donation, acknowledging concerns about Gaggle “having a role in negatively impacting LGBTQ students.”

The Trevor Project didn’t respond to requests for comment on Gaggle’s decision to pull certain LGBTQ-specific keywords from its systems.

In a statement to The 74, Gaggle spokesperson Paget Hetherington said the company regularly modifies the keywords its software uses to trigger a human review of students’ digital communications. Certain LGBTQ-specific words, she said, are no longer relevant to the 24-year-old company’s efforts to protect students from abuse and were purged late last year.

“At points in time in the not-too-distant past, those words were weaponized by bullies to harass and target members of the LGBTQ+ community, so as part of an effective methodology to combat that discriminatory harassment and violence, those words were once effective tools to help identify dangerous situations,” Hetherington said. “Thankfully, over the past two decades, our society evolved and began a period of widespread acceptance, especially among the K-12 student population that Gaggle serves. With that evolution and acceptance, it has become increasingly rare to see those words used in the negative, harassing context they once were; hence, our decision to take these off our word/phrases list.”

Hetherington said Gaggle will continue to monitor students’ use of the words “faggot,” “lesbo,” and others that are “commonly used as slurs.” A previous review by The 74 found that Gaggle regularly flagged students for harmless speech, like profanity in fictional pieces submitted to a school’s literary magazine, and for entries in students’ private journals.

Anti-LGBTQ activists have escalated their attacks on queer youth, and privacy advocates warn that in the era of “Don’t Say Gay” laws and abortion bans, information gleaned from Gaggle and similar services could be weaponized against students.

Gaggle executives have minimized privacy concerns and claim the tool saved more than 1,400 lives last school year. That statistic hasn’t been independently verified, and there’s a dearth of research to suggest digital monitoring is an effective school-safety tool. A recent survey found a majority of parents and teachers believe the benefits of student monitoring outweigh privacy concerns. The Vice News documentary included the perspective of a high school student who was flagged by Gaggle for writing a paper titled “Essay on the Reasons Why I Want to Kill Myself but Can’t/Didn’t.” Adults wouldn’t have known she was struggling without Gaggle, she said.

“I do think that it’s helpful in some ways,” the student said, “but I also kind of think that it’s – I wouldn’t say an invasion of privacy – but if obviously something gets flagged and a person who it wasn’t intended for reads through that, I think that’s kind of uncomfortable.”

Student surveillance critic Evan Greer, director of the nonprofit digital rights group Fight for the Future, said the tweaks to Gaggle’s keyword dictionary are unlikely to have a significant effect on LGBTQ teens and blasted the company’s stated justification for the move as being “out of touch” with the state of anti-LGBTQ harassment in schools. Meanwhile, Greer said that LGBTQ youth frequently refer to each other using “reclaimed slurs,” reappropriating words that are generally considered derogatory and remain in Gaggle’s dictionary.

“This is just like lipstick on a pig – no offense to pigs – but I don’t see how this actually in any meaningful way mitigates the potential for this software to nonconsensually out LGBTQ students to administrators,” Greer said. “I don’t see how it prevents the software from being used to invade the privacy of students in a wide range of other circumstances.”

Gaggle and its competitors – including GoGuardian, Securly and Bark – have faced similar scrutiny in Washington. In April, Democratic Sens. Elizabeth Warren and Ed Markey argued in a report that the tools could be misused to discipline students and warned they could be used disproportionately against students of color and LGBTQ youth.


In response, Gaggle founder and CEO Jeff Patterson said the company cannot test the potential for bias in its system because the software flags student communications anonymously and the company has “no context or background on students,” including their race or sexual orientation. The company also said its monitoring services are not meant to be used as a disciplinary tool.

In the survey released last summer by the Center for Democracy and Technology, however, 78% of teachers reported that digital monitoring tools were used to discipline students. Black and Hispanic students reported being far more likely than white students to get into trouble because of online monitoring. 

In October, the White House cautioned school districts against the “continuous surveillance” of students if monitoring tools are likely to trample students’ rights. It also directed the Education Department to issue guidance to districts on the safe use of artificial intelligence. The guidance is expected to be released early this year.


As an increasing number of districts implement Gaggle for bullying prevention efforts, surveillance critic Greer said the company has failed to consider how adults can cause harm.

“There is now a very visible far-right movement attacking LGBTQ kids, and particularly trans kids and teenagers,” Greer said. “If anything, queer kids are more in the crosshairs today than they were a year ago or two years ago – and that’s why this surveillance is so dangerous.”

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741. For LGBTQ mental health support, contact The Trevor Project’s toll-free support line at 866-488-7386.

White House Cautions Schools Against ‘Continuous Surveillance’ of Students

Tue, 04 Oct 2022 (Updated Oct. 5)

The Biden administration on Tuesday urged school districts nationwide to refrain from subjecting students to “continuous surveillance” if the use of digital monitoring tools — already accused of targeting at-risk youth — is likely to trample students’ rights. 

The White House recommendation was included in an in-depth but non-binding white paper that seeks to rein in the potential harms of rapidly advancing artificial intelligence technologies, from smart speakers featuring voice assistants to campus surveillance cameras with facial recognition capabilities. 

The blueprint, which was released by the White House Office of Science and Technology Policy and extends far beyond the education sector, lays out five principles: Tools that rely on artificial intelligence should be safe and effective, avoid discrimination, ensure reasonable privacy protections, be transparent about their practices and offer the ability to opt out “in favor of a human alternative.”


Though the blueprint lacks enforcement, schools and education technology companies should expect greater federal scrutiny soon. The White House announced that the Education Department would release, by early 2023, recommendations on schools’ use of artificial intelligence that “define specifications for the safety, fairness and efficacy of AI models used within education” and introduce “guardrails that build on existing education data privacy regulations.” 

Education Secretary Miguel Cardona said officials at the department “embrace utilizing Ed Tech to enhance learning” but recognize “the need for us to change how we do business.” The future guidance, he said, will focus on student data protections, ensuring that digital tools are free of biases and incorporate transparency so parents know how their children’s information is being used.

“This has to be baked into how we do business in education, starting with the systems that we have in our districts but also teacher preparation and teacher training as well,” he said.

Amelia Vance, president and founder of Public Interest Privacy Consulting, said the document amounts to a “massive step forward for the advocacy community, the scholars who have been working on AI and have been pressuring the government and companies to do better.” 

The blueprint, which offers a harsh critique of systems that predict student success based on factors like poverty, follows in-depth reporting by The 74 on schools’ growing use of digital surveillance and the tech’s impact on student privacy and civil rights.

But local school leaders should ultimately decide whether to use digital student monitoring tools, said Noelle Ellerson Ng, associate executive director of advocacy and governance at AASA, The School Superintendents Association. Ellerson Ng opposes “unilateral federal action to prohibit” the software.

“That’s not the appropriate role of the federal government to come and say this cannot happen,” she said. “But smart guardrails that allow for good practices, that protect students’ safety and privacy, that’s a more appropriate role.”

The nonprofit Center for Democracy and Technology praised the report. The group recently released a survey highlighting the potential harms of student activity monitoring on at-risk youth, who are already disproportionately disciplined and referred to the police as a result. In a statement Tuesday, it said the blueprint makes clear “the ways in which algorithmic systems can deepen inequality.” 

“We commend the White House for considering the diverse ways in which discrimination can occur, for challenging inappropriate and irrelevant data uses and for lifting up examples of practical steps that companies and agencies can take to reduce harm,” CEO Alexandra Reeve Givens said in a media release. 

The document also highlights several areas where artificial intelligence has been beneficial, including improved agricultural efficiency and algorithms that have been used to identify diseases. But the technologies, which have grown rapidly with few regulations, have introduced significant harm, it notes, including automated tools that screen job applicants and facial recognition technology. 

After the pandemic shuttered schools nationwide in early 2020 and pushed students into makeshift remote learning, companies that sell digital activity monitoring software to schools saw an increase in business. But the tools have faced significant backlash for subjecting students to relentless digital surveillance. 

In April, Massachusetts Sens. Elizabeth Warren and Ed Markey warned in a report that the technology could carry significant risks — particularly for students of color and LGBTQ youth — and promoted a “need for federal action to protect students’ civil rights, safety and privacy.” Such concerns have become particularly acute as states implement new anti-LGBTQ laws and abortion bans and advocates warn that digital surveillance tools could expose youth to legal peril. 

Vance said that she and others focused on education and privacy “had no idea this was coming,” or that it would focus so heavily on schools. Over the last year, the White House office sought input from civil rights groups and technology companies, but Vance said that education groups had lacked a meaningful seat at the table. 

The lack of engagement was apparent, she said, in the document’s failure to highlight areas where artificial intelligence has been beneficial to students and schools. For example, the document discusses a tool used by universities to predict which students were likely to drop out. It considered students’ race as a predictive factor, leading to discrimination fears. But she noted that if implemented equitably, such tools can be used to improve student outcomes. 

“Of course there are a lot of privacy and equity and ethical landmines in this area,” Vance said. “But we also have schools who have done this right, who have done a great job in using some of these systems to assist humans in counseling students and helping more students graduate.” 

Ellerson Ng, of the superintendents association, said her group is still analyzing the blueprint’s on-the-ground implications, but that student data privacy efforts present schools with “a balancing act.”

“You want to absolutely secure the privacy rights of the child while understanding that the data that can be generated, or is generated, has a role to play, too, in helping us understand where kids are, what kids are doing, how a program is or isn’t working,” she said. “Sometimes that’s broader than just a pure academic indicator.”

Others have criticized the blueprint as little more than a collection of recommendations from civil rights groups and tech companies. Some of the most outspoken privacy proponents and digital surveillance critics, such as Albert Fox Cahn, founder and executive director of the Surveillance Technology Oversight Project, argued it falls short of a critical policy move: outright bans.

As Cahn and other activists mount campaigns against student surveillance tools, they’ve highlighted how student data can wind up in the hands of the police.

“When police and companies are rolling out new and destructive forms of AI every day, we need to push pause across the board on the most invasive technologies,” he said in a media release. “While the White House does take aim at some of the worst offenders, they do far too little to address the everyday threats of AI, particularly in police hands.”

With ‘Don’t Say Gay’ Laws & Abortion Bans, Student Surveillance Raises New Risks

Thu, 08 Sep 2022

While growing up along the Gulf Coast in Mississippi, Kenyatta Thomas relied on the internet and other teenagers to learn about sex.

Thomas and their peers watched videos during high school gym class that stressed the importance of abstinence — and the horrors that can come from sex before marriage. But for Thomas, who is bisexual and nonbinary, the lessons didn’t explain who they were as a person. 

“It was very confusing trying to navigate understanding who I am and my identity,” said Thomas, now a student at Arizona State University. It was on the internet that Thomas learned about a whole community of young people with similar experiences. Blog posts on Tumblr helped them make sense of their place in the world and what it meant to be bisexual. “I was able to find the words to understand who I am — words that I wouldn’t be able to piece together in a sentence if the internet wasn’t there.” 


But now, as states adopt anti-LGBTQ laws and abortion bans, the digital footprint that Thomas and other students leave may come back to harm them, privacy and civil rights advocates warn, and it could be their school-issued devices that end up exposing them to that legal peril.

For years, schools across the U.S. have used digital surveillance tools that collect a trove of information about youth sexuality — intimate details that are gleaned from students’ conversations with friends, diary entries and search histories. Meanwhile, student information collected by surveillance companies is regularly shared with police, according to a recent survey conducted by the nonprofit Center for Democracy and Technology. These two realities concern Elizabeth Laird, the center’s director of equity in civic technology. Following the Supreme Court’s reversal of Roe v. Wade in June, she said information about youth sexuality could be weaponized. 

“Right now — without doing anything — schools may be getting alerts about students who are searching the internet for resources related to reproductive health,” Laird said. “If you are in a state that has a law that criminalizes abortion, right now this tool could be used to enforce those laws.”

Teens across the country are already working to fill the void for themselves and their peers in the current climate. Thomas, the ASU student and an outspoken reproductive justice activist, said that while students are generally aware that school devices and accounts are monitored, the reversal of Roe has led some to take extra privacy precautions. 

Kenyatta Thomas, an Arizona State University student and activist, participates in an abortion-rights protest. (Photo courtesy Kenyatta Thomas)

“I have switched to using Signal to talk to friends and colleagues in this space,” they said, referring to the encrypted messaging app. “The fear, even though it’s been common knowledge for basically my generation’s entire life that everything you do is being surveilled, it definitely has been amplified tenfold.”

Police have long used social media and other online platforms to investigate people for violating abortion laws, including a case in which police obtained a teen’s private Facebook messages through a search warrant before charging the then-17-year-old and her mother with violating the state’s ban on abortions after 20 weeks of pregnancy. 

LGBTQ students face similar risks as lawmakers in Florida and elsewhere impose rules that prohibit classroom discussions about sexuality and gender. This year alone, lawmakers have proposed 300 anti-LGBTQ bills and about a dozen have been enacted. They include so-called “Don’t Say Gay” laws in Florida and Alabama that ban classroom discussions about gender and sexuality and require school officials to tell the parents of children who share that they may be gay or transgender. 

In a survey, a fifth of LGBTQ students told the Center for Democracy and Technology that they or another student they knew had their sexual orientation or gender identity disclosed without their consent due to online student monitoring. They were more likely than straight and cisgender students to report getting into trouble for their web browsing activity and to be contacted by the police about having committed a crime. 

LGBTQ youth are nearly twice as likely as their straight and cisgender classmates to search for health information online, according to the nonprofit LGBT Tech. But as anti-LGBTQ laws proliferate, student surveillance tools should reconsider collecting data about youth sexuality, Christopher Wood, the group’s co-founder and executive director, told The 74. 

“Right now, we are not in a landscape or an environment where that is safe for a company to be doing,” Wood said. “If there is a remote possibility that the information that they are trying to provide to help a student could potentially lead them into more harm, then they need to be looking at that very carefully and considering whether that is the appropriate direction for a company to be taking.”

Digital student monitoring tools have a negative disparate impact on LGBTQ youth, according to a recent student survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

‘Extraordinarily concerned’

For decades, federal law has required school technology to block access to images that are obscene, child pornography or deemed “harmful to minors,” and schools have used web-filtering software to prevent students from accessing sexually explicit content. But in some cases, the filtering has been shown to block pro-LGBTQ websites that aren’t explicit, including those that offer crisis counseling.  

Many student monitoring tools, which saw significant growth during the pandemic, go far beyond web filtering and employ artificial intelligence to track students across the web to identify issues like depression and violent impulses. The tools can sift through students’ social media posts, follow their digital movements in real time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs. 

They’ve also come under heightened scrutiny. In a report this year, Democratic Sens. Elizabeth Warren and Ed Markey warned that schools’ widespread adoption of the tools could trample students’ civil rights. By flagging words related to sexual orientation, the report notes, LGBTQ youth could be subjected to disproportionate disciplinary rates and be unintentionally outed to their parents. 

In a letter in July, Warren and Markey cautioned that the tools could pose new risks following the reversal of Roe and asked four leading student surveillance companies — GoGuardian, Gaggle, Securly and Bark — whether they flag students for using keywords related to reproductive health, such as “pregnant” and “abortion.”

“We are extraordinarily concerned that your software could result in punishment or criminalization of students seeking contraception, abortion or other reproductive health care,” Markey and Warren wrote. “With reproductive rights under attack nationwide, it would represent a betrayal of your company’s mission to support students if you fail to provide appropriate protections for students’ privacy related to reproductive health information.”

Student activity monitoring tools are more often used to discipline students than protect them from violence and mental health crises, according to a recent teacher survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

The scrutiny is part of a larger concern over digital privacy in the post-Roe world. In August, the Federal Trade Commission sued a data broker, accusing the company of selling location data from hundreds of millions of cell phones that could be used to track people’s movements. Such precise location data, the agency said, “may be used to track consumers to sensitive locations, including places of religious worship, places that may be used to infer an LGBTQ+ identification, domestic abuse shelters, medical facilities and welfare and homeless shelters.” 

School surveillance companies have acknowledged their tools track student references to sex but sought to downplay the risks they pose to students. Bark spokesperson Adina Kalish said the company began to immediately purge all data related to reproductive health after a leaked Supreme Court draft opinion suggested Roe’s reversal was imminent — despite maintaining a 30-day retention period for most other data. 

“By immediately and permanently deleting data which contains a student’s reproductive health data or searches for reproductive health information, such data is not in our possession and therefore not produce-able under a court order, subpoena, etc.,” Bark CEO Brian Bason wrote in a response the company shared with The 74. 

GoGuardian spokesperson Jeff Gordon said its tools “cannot be used by educators or schools to flag reproductive health-related search terms” and its web filter cannot “flag reproductive health-related searches.” Securly didn’t respond to requests for comment. Last year its web-filtering tool categorized health resources for LGBTQ teens as pornography. 

Gaggle founder and CEO Jeff Patterson wrote to the senators that his company does not “collect health data of any kind including reproductive health information,” specifying that the monitoring tool does not flag students who use the terms “pregnant, abortion, birth control, contraception or Planned Parenthood.” 

Yet tracking conversations about sex is a primary part of Gaggle’s business — generating more flags than references to suicide, violence or drug use — according to nearly 1,300 incident reports generated by the company for Minneapolis Public Schools during a six-month period in 2020. The reports, obtained by The 74, showed that 38% were prompted by content that was pornographic or sexual in nature, including references to “sexual activity involving a student.” Students were regularly flagged for using keywords like “virginity,” “rape,” and, simply, “sex.” 

Patterson, the Gaggle CEO, has acknowledged that a student’s private diary entry about being raped wasn’t off limits. In touting the tool’s capabilities, he told The 74 his company uncovered the girl’s diary entry, in which she discussed how the assault led to self-esteem issues and guilt. Nobody knew she was struggling until Gaggle notified school officials about what they’d learned from her diary, Patterson said. 

“They were able to intervene and get this girl help for things that she couldn’t have dealt with on her own,” Patterson said.

Any information that surveillance companies collect about students’ sexual behaviors could be used against them by police during investigations, privacy experts warned. And it’s unclear, Laird said, how long the police can retain any data gleaned from the tools. 

‘Don’t Say Gay’

Internet search engines are “particularly potent” tools to track the behaviors of pregnant people, according to the nonprofit Surveillance Technology Oversight Project. In 2017, for example, a woman was charged with second-degree murder of her stillborn fetus after police scoured her browser history and identified a search for an abortion pill. 

While GoGuardian and other companies offer web filtering to schools, Gaggle has sought to differentiate itself. In his letter to the senators, Patterson said the company — which sifts through files and chat messages on students’ school-issued Microsoft and Google accounts — is not a web filter and therefore “does not track students’ online searches.” Yet Patterson’s assurance to lawmakers appears misleading. The company acknowledges on its website that it partners with several web-filtering companies, including Linewize, to analyze students’ online searches. When the tools work in tandem, flags triggered by Linewize’s web filtering “can be sent straight to the Gaggle Safety Team,” which determines whether the material “should be forwarded to the school or district.” 

In an email, Gaggle spokesperson Paget Hetherington said that in “a very small number of school systems,” the company reviews alerts from web filters before they’re sent to school officials to “alleviate the large number of false positives” and ensure that “only the most critical and imminent issues are being seen by the district.” 

Gaggle has also faced scrutiny for including LGBTQ-specific keywords in its algorithm, including “gay” and “lesbian.” Patterson said the heightened surveillance of LGBTQ youth is necessary because they face a disproportionately high suicide rate, and Hetherington shared examples where the keywords were used to spot cyberbullying incidents. 

But critics have accused the company of discrimination. Wood of the nonprofit LGBT Tech said that anti-LGBT activists have used surveillance to target their opponents for generations. Prior to the seminal 1969 riots that followed a New York City police raid on the Stonewall Inn gay bar, police routinely raided LGBTQ spaces and made arrests for “inferring sexual perversion” and “serving gay people.” From the colonial era and into the 19th century, anti-sodomy laws carried the death penalty and police used the rules to investigate and incarcerate people suspected of same-sex intimate behaviors. 

Now, in the era of “Don’t Say Gay” laws, digital surveillance tools could be used to out LGBTQ students and put them in danger, Wood said. Student surveillance companies can claim their decision to include LGBTQ terminology is designed to help students, but historically such data have “been used against us in very detrimental ways.” 

Companies, he said, are unable to control how officials use that information in an era “where teachers and administrators and other students are encouraged to out other students or blame them or somehow get them in trouble for their identity.” In Texas, Republican Gov. Greg Abbott issued a directive calling on child protective services to investigate as child abuse any parents who provide gender-affirming health care to their transgender children. 

“They can’t control what’s going to happen in Florida or Texas and they can’t control what’s going to happen in an individual home,” where students could be subjected to abuse, Wood said. “Any person in their right mind would be horrified to learn that it was their technology that ended up harming a youth or driving a youth to the point of feeling so isolated that they felt the only way out was suicide.” 

When private thoughts become public

Susan, a 14-year-old from Cincinnati, knows firsthand how surveillance companies can target students for discussing their sexuality. In middle school, she was assigned to write a “time capsule” letter to her future self. 

Her teacher said that no one — not even him — would read the letter until Susan retrieved it after high school graduation. So Susan, who is now a freshman and asked to remain anonymous, used the private space to question her gender identity. 

But her teacher’s assurance wasn’t quite true, she learned. Someone had been reading the letter — and would soon hold it against her. 

In an automated May 2021 email, Gaggle notified her that the letter to her future self was “identified as inappropriate” and urged her to “refrain from storing or sharing inappropriate content.” In a “second warning,” sent to her inbox, she was told a school administrator was given “access to this violation.” After a third alert, she said, access to her school email account was restricted. She said the experience left her with “a sense of betrayal from my school.” She said she had no idea words like “gay” or “sex” could get flagged by Gaggle’s algorithm.

Susan, a student from Cincinnati, received an email alert from Gaggle notifying her that her classroom assignment, a “time capsule” letter to her future self, had been “identified as inappropriate.” (Courtesy Susan)

“It’s frustrating to know that this program finds the need to have these as keywords, and quite depressing,” she said. “There’s always going to be oppression against the community somewhere, it seems, and it’s quite disheartening.” 

School administrators reviewed the time capsule letter and determined it didn’t contain anything inappropriate, her mother Margaret said. While Susan lives in an LGBTQ-affirming household, Thomas, who grew up in Mississippi, warned that’s not the case for everyone.

“That’s not just the surveillance of your activities, that’s the surveillance of your thoughts,” Thomas said of Susan’s experience. “I know that wouldn’t have gone very well for me and I know for a lot of young people that would place them in a lot of danger.”

Such harms could be exacerbated, Margaret said, if authorities use student data to enforce Ohio’s strict abortion ban, which has already become the subject of national debate after a 10-year-old girl traveled to Indiana for an abortion. A 27-year-old man was arrested and accused of raping the child. 

Cincinnati Public Schools spokesman Mark Sherwood said in an email that “law enforcement is immediately contacted” if the district receives an alert from Gaggle suggesting that a student poses “an imminent threat of harm to self or others.” 

Given the state of abortion rules in Ohio, Susan said she’s concerned that student conversations and classroom assignments that discuss gender and sexuality could wind up in the hands of the police. She lost faith in school-issued technology after her assignment got flagged by Gaggle. 

“I just flat out don’t trust adults in positions of power or authority,” Susan said. “You don’t really know for sure what their true motives are or what they could be doing with the tools they have at their disposal.”

Survey Reveals Extent that Cops Surveil Students Online — in School and at Home

Wed, 03 Aug 2022

When Baltimore students sign into their school-issued laptops, the police log on, too. 

Since the pandemic began, Baltimore City Public Schools officials have partnered with GoGuardian, a digital surveillance tool that promises to identify youth at risk of harming themselves or others. When GoGuardian flags students, their online activities are shared automatically with school police, giving cops a conduit into kids’ private lives — including on nights and weekends.


Such partnerships between schools and police appear startlingly widespread across the country, with significant implications for youth, according to a new survey from the nonprofit Center for Democracy and Technology. Nearly all teachers — 89% — reported that digital student monitoring tools like GoGuardian are used in their schools. And nearly half — 44% — said students have been contacted by the police as a result of student monitoring. 

The pandemic has led to major growth in the number of schools that rely on activity monitoring software to uncover student references to depression and violent impulses. The tools, offered by a handful of tech companies, can sift through students’ social media posts, follow their digital movements in real time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs. 

Educators say the tools help them identify youth who are struggling and get them the mental health care they need at a time when youth depression and anxiety are spiraling. But the survey suggests an alternate reality: Instead of getting help, many students are being punished for breaking school rules. And in some cases, survey results suggest, students are being subjected to discrimination. 

The report raises serious questions about whether digital surveillance tools are the best way to identify youth in need of mental health care and whether police officers should be on the front lines in responding to such emergencies. 

“If we’re saying this is to keep students safe, but instead we’re using it punitively and we’re using it to invite law enforcement literally into kids’ homes, is this actually achieving its intended goal?” asked Elizabeth Laird, a survey author and the center’s director of equity in civic technology. “Or are we, in the name of keeping students safe, actually endangering them?”

Among teachers who use monitoring tools at their schools, 78% said the software has been used to flag students for discipline and 59% said kids wound up getting punished as a result. Yet just 45% of teachers said the software is used to identify violent threats and 47% said it is used to identify students at risk of harming themselves. 

Center for Democracy and Technology

The findings are a direct contradiction of the stated goal of student activity monitoring, Laird said. School leaders and company executives have long maintained that the tools are not a disciplinary measure but are designed to identify at-risk students before someone gets hurt.

The Supreme Court’s recent reversal of Roe v. Wade, she said, further muddles police officers’ role in student activity monitoring. As states implement anti-abortion laws, data from student activity monitoring tools could help the police identify youth seeking reproductive health care. 

“We know that law enforcement gets these alerts,” she said. “If you are in a state where they are looking to investigate these kinds of incidents, you’ve invited them into a student’s house to be able to do that.”

A tale of discrimination

In Baltimore, counselors, principals and school-based police officers receive all alerts generated by GoGuardian during school hours, according to reporting by The Real News Network, a nonprofit media outlet. Outside of school hours, including on weekends and holidays, the responsibility to monitor alerts falls on the police, the outlet reported, and on numerous occasions officers have shown up at students’ homes to conduct wellness checks. In some cases, students have been transported to the hospital for emergency mental health care. 

In a statement to The 74, district spokesperson Andre Riley said that GoGuardian helps officials “identify potential risks to the safety of individual students, groups or schools,” and that “proper accountability measures are taken” if students violate the code of conduct or break laws.

“The use of GoGuardian is not simply a prompt for a law enforcement response,” Riley added.

Leading student surveillance companies, including GoGuardian, have maintained that their interactions with police are limited. In April, Democratic Sens. Elizabeth Warren and Ed Markey warned in a report that schools’ reliance on the tools could violate students’ civil rights and exacerbate “the school-to-prison pipeline by increasing law enforcement interactions with students.” Warren and Markey focused their report on four companies: GoGuardian, Gaggle, Securly and Bark. 

In a letter, Gaggle executives said the company contacts law enforcement for wellness checks if they are unable to reach school-based emergency contacts and a child appears to be “in immediate danger.” In materials on the company’s website, school officials in Wichita Falls, Texas; Cincinnati, Ohio; and Miami, Florida, acknowledged contacting police in response to Gaggle alerts.

In some cases, school leaders ask Securly to contact the police directly and request they conduct welfare checks on students, the company told lawmakers. Executives at Bark said “there are limited options” beyond police intervention if they identify a student in crisis but cannot reach a school administrator. 

“While we have witnessed many lives saved by police in these situations, unfortunately many officers have not received training in how to handle such crises,” the company wrote in its letter. “Irrespective of training there is always a risk that a visit from law enforcement can create other negative outcomes for a student and their family.” 

In its privacy policy, GoGuardian states the company may disclose student information “if we believe in good faith that doing so is necessary or appropriate to comply with any law enforcement, legal or regulatory process.” 

Center for Democracy and Technology

Meanwhile, survey results suggest that student surveillance tools have a negative disparate impact on Black and Hispanic students, LGBTQ youth and those from low-income households. In a letter sent Wednesday to coincide with the survey’s release, a coalition of education and civil rights groups called on the U.S. Department of Education to issue guidance warning schools that their digital surveillance practices could violate federal civil rights laws. Signatories include the American Library Association, the Data Quality Campaign and the American Civil Liberties Union.

“This is becoming a conversation not just about privacy, but about discrimination,” Laird said. “Without a doubt, we see certain groups of students having outsized experiences in being directly targeted.”

In a youth survey, researchers found that student discipline as a result of activity monitoring fell disproportionately along racial lines, with 48% of Black students and 55% of Hispanic students reporting that they or someone they knew got into trouble for something that was flagged by an activity monitoring tool. Just 41% of white students reported having similar experiences. 

Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity – often called outing – as a result of activity monitoring. LGBTQ youth were also more likely than straight and cisgender students to report getting into trouble at school and being contacted by the police about having committed a crime.

Some student surveillance companies, like Gaggle, monitor references to words including “gay” and “lesbian,” a practice company founder and CEO Jeff Patterson has said was created to protect LGBTQ youth, who face a greater risk of dying by suicide. But survey results suggest the heightened surveillance comes with significant harm to youth, and Laird said that if monitoring tools are designed with certain students in mind, such as LGBTQ youth, that in itself is a form of discrimination.
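The mechanism at issue here is simple keyword-list matching. The sketch below is hypothetical – the watchlist and messages are invented for illustration and do not reproduce any vendor’s actual rules – but it shows why a flat word list cannot distinguish a crisis disclosure from ordinary self-description:

```python
# Hypothetical sketch of keyword-list flagging, the mechanism described
# above. The watchlist and messages are invented; no vendor's actual
# rules are shown here.

FLAGGED_TERMS = {"gay", "lesbian", "suicide"}  # invented watchlist

def flag_message(text: str) -> set:
    """Return any watchlist terms found in a message (case-insensitive)."""
    words = {word.strip(".,!?\"'").lower() for word in text.split()}
    return words & FLAGGED_TERMS

# A benign, self-descriptive message still trips the filter, which is
# how identity-related terms can draw extra scrutiny to LGBTQ students.
print(flag_message("I came out as gay to my friends today"))  # {'gay'}
print(flag_message("See you at band practice"))               # set()
```

Because the filter sees only words, not context, students who merely mention their own identity are swept into the same alert stream as students in crisis – which is the disparate-impact concern advocates raise.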

In its letter to the Education Department’s Office for Civil Rights Wednesday, advocates said the disparities outlined in the survey run counter to federal laws prohibiting race-, sex- and disability-based discrimination.

“Student activity monitoring is subjecting protected classes of students to increased discipline and interactions with law enforcement, invading their privacy, and creating hostile environments for students to express their true thoughts and authentic identities,” the letter states.

The Education Department’s civil rights division, they said, should condemn surveillance practices that violate students’ civil rights and launch “enforcement action against violations that result in discrimination.”

Lawmakers consider youth privacy

The report comes at a moment of increasing alarm about student privacy online. In May, the Federal Trade Commission announced plans to crack down on tech companies that sell student data for targeted advertising and that “illegally surveil children when they go online to learn.”

It also comes at a time of intense concern over students鈥 emotional and physical well-being. While the pandemic has led to a greater focus on youth mental health, the May mass school shooting in Uvalde, Texas, has sparked renewed school safety efforts. In June, President Joe Biden signed a law with modest new gun-control provisions and an influx of federal funding for student mental health care and campus security. The funds could lead to more digital student surveillance.

The results of the online survey, which was conducted in May and June, were likely colored by the Uvalde tragedy, researchers acknowledged. A majority of parents and students have a favorable view of student activity monitoring during school hours to protect kids from harming themselves or others, researchers found. But just 48% of parents and 30% of students support around-the-clock surveillance. 

“Schools are under a lot of pressure to find ways to keep students safe and, like in many aspects of our lives, they are considering the role of technology,” Laird said.

Last week, the Senate advanced legislation designed to improve children’s safety online, including new restrictions on youth-focused targeted advertising. The effort comes a year after leaked internal research showed that the social media app Instagram had a harmful effect on youth mental well-being, especially teenage girls. One bill, the Kids Online Safety Act, would require tech companies to identify and mitigate any potential harms their products may pose to children, including exposure to content that promotes self-harm, eating disorders and substance abuse.

Yet the legislation has faced criticism from privacy advocates, who argue it would mandate digital monitoring similar to that offered by student surveillance companies. Among critics is the Electronic Frontier Foundation, a nonprofit focused on digital privacy and free speech. 

“The answer to our lack of privacy isn’t more tracking,” the foundation wrote. The legislation “is a heavy-handed plan to force technology companies to spy on young people and stop them from accessing content that is ‘not in their best interest,’ as defined by the government, and interpreted by tech platforms.”

Attorney Amelia Vance, the founder and president of Public Interest Privacy Consulting, said she worries the provisions will have a negative impact on at-risk kids, including LGBTQ students. Students from marginalized groups, she said, “will now be more heavily surveilled by basically every site on the internet, and that information will be available to parents” who could discipline teens for researching LGBTQ content. She said the legislation could force tech companies to censor content to avoid potential liability, essentially making them arbiters of community standards.

“When you have conflicting values in the different jurisdictions that the companies operate in, oftentimes you end up with the most conservative interpretations, which right now is anti-LGBT,” she said.

Senate Inquiry Warns About Harms of Digital School Surveillance Tools /article/senate-inquiry-warns-about-harms-of-digital-school-surveillance-tools-calls-on-fcc-to-clarify-student-monitoring-rules/ Mon, 04 Apr 2022

Updated, April 5

Democratic Sens. Elizabeth Warren and Ed Markey are calling on the Federal Communications Commission to clarify how schools should monitor students’ online activities, warning that educators’ widespread use of digital surveillance tools could trample students’ civil rights.

They also want the U.S. Education Department to start collecting data on the tools that could highlight whether they have disproportionate – and potentially harmful – effects on certain student groups.

In October, the senators asked four education technology companies that keep tabs on the online activity of millions of students across the country – often 24 hours a day, seven days a week – to provide information on how they use artificial intelligence to glean information about them.

Based on their responses, the senators said:

  • The companies鈥 software may be misused to identify students who are violating school disciplinary rules. They cited a recent survey where 43% of teachers reported their schools employ the monitoring systems for this purpose, potentially increasing contact between police and students and worsening the school-to-prison pipeline.
  • The companies have not attempted to determine whether their products disproportionately target students of color, who already face harsher and more frequent school discipline, or other vulnerable groups, like LGBTQ youth.
  • Schools, parents and communities are not being appropriately informed of the use 鈥 and potential misuse 鈥 of the data. Three of the four companies indicated they do not directly alert students and guardians of their surveillance.

Warren and Markey concluded there is a dire “need for federal action to protect students’ civil rights, safety and privacy.”

“While the intent of these products, many of which monitor students’ online activity around the clock, may be to protect student safety, they raise significant privacy and equity concerns,” the lawmakers wrote. “Studies have highlighted unintended but harmful consequences of student activity monitoring software that fall disproportionately on vulnerable populations.”

An FCC spokesperson said they’re reviewing the report, and an Education Department spokesperson said they “look forward to corresponding with the senators” about its findings.

Lawmakers’ inquiry into the business practices of school security companies Gaggle, GoGuardian, Securly and Bark Technologies is the first congressional investigation into student surveillance tools, whose use grew dramatically during the pandemic when learning shifted online.

It follows on the heels of investigative reporting by The 74 into Gaggle, which uses artificial intelligence and a team of human content moderators to track the online behaviors of more than 5 million students. The 74 used public records to expose how Gaggle’s algorithm and its hourly-wage workers sift through billions of student communications each year in search of references to violence and self-harm, subjecting youth to constant digital surveillance with steep implications for their privacy. Gaggle, whose tools track students on their school-issued Google and Microsoft accounts, reported rapid growth during the pandemic.

Bark didn’t respond to requests for comment. Securly spokesman Josh Mukai said in a statement that the company is reviewing the senators’ March 30 report and looks forward “to continuing our dialogue with Senators Warren and Markey on the important topics they have raised.”

“Parents expect that schools will keep children safe while in the classroom, on a field trip or while riding on a bus,” GoGuardian spokesman Jeff Gordon said in a statement. “Schools also have a responsibility to keep students safe in digital spaces and on school-issued devices.”

Gaggle Founder and CEO Jeff Patterson submitted a statement after this article was published. He said the company is reviewing the lawmakers’ recommendations “to assess how we can further strengthen our work to better protect students.”

“We want to ensure our technology is effectively supporting student safety without creating unintended risks or harms,” Patterson continued. “We have taken steps over the years to ensure effective privacy protections and mitigate bias in our platform, but welcome continued dialogue that will help make sure tools like Gaggle can continue to be used to support students and educators.”

Bark Technologies CEO Brian Bason wrote in a letter to lawmakers that AI-driven technology could be used to solve the country’s “terrible history of bias in school discipline” by removing the decisions of individual teachers and administrators.

“While any system, including AI-based solutions, inherently have some bias, if implemented correctly AI-based solutions can substantially reduce the bias that students face,” Bason wrote.

As to the question of whether their surveillance exacerbates the school-to-prison pipeline, the companies’ letters acknowledge that in certain cases they contact police to conduct welfare checks on students. Securly noted in its letter that in some instances, education leaders “prefer that we contact public safety agencies directly in lieu of a district contact.”

Under the Clinton-era Children’s Internet Protection Act, passed in 2000, public schools and libraries are required to filter and monitor students’ internet use to ensure they don’t access material “harmful to minors,” such as pornography. Districts have cited the law to justify the adoption of AI-driven surveillance tools that have proliferated in recent years. Student privacy advocates argue the tools go far beyond the federal mandate and have called on the FCC to clarify the law’s scope. Meanwhile, advocates have questioned whether schools’ use of digital surveillance tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures.

In a recent survey by the nonprofit Center for Democracy and Technology, 81 percent of teachers said they used software to track students’ computer activity, including to block obscene material or monitor their screens in real time. A majority of parents said they worried about student data getting shared with the police and more than half of students said they decline to share their “true thoughts or ideas because I know what I do online is being monitored.”

Elizabeth Laird, the group’s director of equity in civic technology, said it has been calling on student surveillance companies to be more transparent about their business practices but it’s “disappointing that it took a letter from Congress to get this information.” She said she hopes the FCC and Education Department adopt lawmakers’ recommendations.

“None of these companies have researched whether their products are biased against certain groups of students,” she said in an email while questioning their justification for holding off on such an inquiry. “They cite privacy as the reason for not doing so while simultaneously monitoring students’ messages, documents and sites visited 24 hours a day, seven days a week.”

The 74’s investigation, which used data on Gaggle’s foothold in Minneapolis Public Schools, could not determine whether the tool’s algorithm disproportionately targeted Black students, who are more often subjected to student discipline than their white classmates. However, it highlighted instances in which keywords like “gay” and “lesbian” were flagged, potentially subjecting LGBTQ youth to heightened surveillance for discussing their sexual orientation.
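The disproportionality question described above reduces to a simple comparison of flag rates across student groups. This minimal sketch uses invented counts purely for illustration; it is not the methodology of any actual analysis:

```python
# Minimal sketch of a disparate-impact check on monitoring alerts:
# compare the rate at which each group's students are flagged.
# All counts below are invented for illustration.

def flag_rate(flagged: int, enrolled: int) -> float:
    """Share of a group's students flagged by the monitoring tool."""
    return flagged / enrolled

counts = {
    "group_a": (150, 1000),  # (students flagged, students enrolled)
    "group_b": (90, 1000),
}
rates = {group: flag_rate(f, n) for group, (f, n) in counts.items()}

# A ratio well above 1.0 suggests one group is flagged disproportionately.
disparity_ratio = rates["group_a"] / rates["group_b"]
print(rates)           # {'group_a': 0.15, 'group_b': 0.09}
print(disparity_ratio)
```

A check like this requires linking flags to student demographics, which is the data the companies say they do not collect – the gap Laird points to below.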

Amelia Vance, an attorney and student privacy expert, said she was intrigued that the companies pushed back on the idea that their tools are used to discipline students, since the federal monitoring requirement was meant to keep kids from consuming inappropriate content online – and students would likely face consequences for viewing violent or sexually explicit materials. She agreed the companies should research their algorithms for potential biases and would benefit from additional transparency.

However, Vance said in an email that FCC clarification “would do little at best and may provide counterproductive guidance at worst.” Many schools, she said, are likely to use the tools regardless of the federal rules.

“Schools aren’t required to monitor social media, and many have chosen to do so anyway,” said Vance, the co-founder and president of Public Interest Privacy Consulting. Some school safety advocates are actively lobbying lawmakers to expand student monitoring requirements, she said.

Asking the FCC to issue guidance “could actually be counterproductive to the goal of limiting monitoring and ensuring more privacy protections for students since it is possible that the FCC could require a higher level of monitoring.”

Read the letters from Gaggle, GoGuardian, Securly and Bark Technologies: 

Dems Warn School Surveillance Tools Could Compound ‘Risk of Harm for Students’ /article/democratic-lawmakers-demand-student-surveillance-companies-outline-business-practices-warn-the-security-tools-may-compound-risk-of-harm-for-students/ Mon, 04 Oct 2021

Updated, Oct. 5

A group of Democratic lawmakers has demanded that several education technology companies that monitor children online explain their business practices, arguing that around-the-clock digital surveillance demonstrates “a clear invasion of student privacy, particularly when students and families are unable to opt out.”

In letters sent to the companies last week, Democratic Sens. Elizabeth Warren, Ed Markey and Richard Blumenthal asked them to explain steps they’re taking to ensure the tools aren’t “unfairly targeting students and perpetuating discriminatory biases,” and comply with federal laws. The letters went to executives at Gaggle, Securly, GoGuardian and Bark Technologies, each of which uses artificial intelligence to analyze students’ online activities and identify behaviors they believe could be harmful.


“Education technology companies have developed software that are advertised to protect student safety, but may instead be surveilling students inappropriately, compounding racial disparities in school discipline and draining resources from more effective student supports,” the lawmakers wrote in the letters. Though the tools are marketed as student safety solutions – and grew rapidly as schools shifted to remote learning during the pandemic – there’s little independent evidence that they work as advertised. Some critics, including the lawmakers, argue they may do more harm than good. “The use of these tools may break down trust within schools, prevent students from accessing critical health information and discourage students from reaching out to adults for help, potentially increasing the risk of harm for students,” the senators wrote.

The letters cited a recent investigation by The 74, which outlined how Gaggle’s AI-driven surveillance tool and human content moderators subject children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night and over the summer. In Minneapolis, the company notified school security when it identified students who made references to suicide, self-harm and violence. But it also analyzed students’ classroom assignments, journal entries, chats with friends and fictional stories.

Each of the companies offers a differing level of remote student surveillance. Gaggle, for example, analyzes emails, chat messages and digital files on students’ school-issued Google and Microsoft accounts. Other services monitor students’ social media accounts and web browsing history, among other activities.

The letters were particularly critical of the tools’ capacity to track student behaviors 24/7 – including when students are at home – and their ability to monitor students on their personal devices in some cases.

Schools’ use of digital monitoring tools has become commonplace in recent years. More than 80 percent of teachers reported using the tools, according to a recent survey by the Center for Democracy and Technology. Among those who participated in the survey, nearly a third reported that they monitor student activity at all hours of the day and just a quarter said it was limited to school hours.

“Because of the lack of transparency, many students and families are unaware that nearly all of their children’s online behavior is being tracked,” according to the letters. “When students and families are aware, they are often unable to opt out because school-issued devices are given to students with the software already installed, and many students rely on these devices for remote or at-home learning.”

A Securly spokesperson said in an email the company is “reviewing the correspondence received” from the lawmakers and is in the process of responding to their requests for information. He said the company is “deeply committed to continuously evolving our technology” to help schools protect students online. A Gaggle spokesperson said the company appreciates the lawmakers’ interest in learning how the tool “serves as an early warning system to help school districts prevent tragedies such as suicide, acts of violence, child pornography and other dangerous situations.” A GoGuardian spokesman said the company cares “deeply about keeping students safe and protecting their privacy.”

Bark officials didn鈥檛 respond to requests for comment.

The Clinton-era Children’s Internet Protection Act, passed in 2000, requires schools to filter and monitor students’ internet use to ensure they aren’t accessing material that is “harmful to minors,” such as pornography. Student privacy advocates have long argued that a newer generation of AI-driven tools goes beyond the law’s scope and have urged federal officials to clarify its requirements. The law includes a disclaimer noting that it does not “require the tracking of internet use by any identifiable minor or adult user.” It “remains an open question” as to whether schools’ use of digital tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures, according to an analysis by the Future of Privacy Forum.

In their letters, senators highlighted how digital surveillance tools could perpetuate several educational inequities. For example, the tools could have a disproportionate impact on students of color and further uphold longstanding racial disparities in student discipline.

“School disciplinary measures have a long history of disproportionately targeting students of color, who face substantially more punitive discipline than their white peers for equivalent offenses,” according to the letters. “These disciplinary records, even when students are cleared, may have life-long harmful consequences for students.”

Meanwhile, the tools may have a larger impact on low-income students who rely on school technology to access the internet than those who can afford personal computers. Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, said their research “revealed a worrisome lack of transparency” around how these educational technology companies track students online and how schools rely on their tools.

“Responses to this letter will help shine a light on these tools and strategies to mitigate the risks to students, especially those who are most reliant on school-issued devices,” she said in an email.
