The Spy Tech That Followed Kids Home – The 74, America's Education News Source

Gaggle Drops LGBTQ Keywords from Student Surveillance Tool Following Bias Concerns
Fri, 27 Jan 2023

Digital monitoring company Gaggle says it will no longer flag students who use words like "gay" and "lesbian" in school assignments and chat messages, a significant policy shift that follows accusations its software facilitated discrimination against LGBTQ teens in a quest to keep them safe.

A company spokesperson cited a societal shift toward greater acceptance of LGBTQ youth, rather than criticism of its product, as the impetus for the change, describing it as part of a "continuous evaluation and updating process."

The company, which uses artificial intelligence and human content moderators to sift through billions of student communications each year, has long defended its use of LGBTQ-specific keywords to identify students who might hurt themselves or others. In arguing the targeted monitoring is necessary to save lives, executives have pointed to the prevalence of bullying against LGBTQ youth and data indicating they're more likely to consider suicide than their straight and cisgender classmates.




But in practice, Gaggle's critics argued, the keywords put LGBTQ students at a heightened risk of scrutiny by school officials and, on some occasions, the police. Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity, often called outing, as a result of digital activity monitoring, according to a survey released in August by the nonprofit Center for Democracy and Technology. The survey encompassed the impacts of multiple monitoring companies that contract with school districts, such as GoGuardian, Gaggle, Securly and Bark.

Gaggle's decision to remove several LGBTQ-specific keywords, including "queer" and "bisexual," from its dictionary of words that trigger alerts was first reported by another outlet. It follows extensive reporting by The 74 into the company's business practices and sometimes negative effects on students who are caught in its surveillance dragnet.

Though Gaggle's software is generally limited to monitoring school-issued accounts, including those provided by Google and Microsoft, the company has said it can scan through photos on students' personal cell phones if they plug them into district laptops.

The keyword shift comes at a particularly perilous moment, as Republican lawmakers in multiple states push anti-LGBTQ legislation. Bills have sought to curtail classroom instruction about sexual orientation and gender identity, ban books and classroom curricula featuring LGBTQ themes and prohibit transgender students from receiving gender-affirming health care, participating in school athletics and using restroom facilities that match their gender identities. Such a hostile political climate, along with pandemic-era disruptions, has contributed to an uptick in LGBTQ youth who have seriously considered suicide, a recent youth survey by The Trevor Project found.

The U.S. Education Department received 453 discrimination complaints involving students' sexual orientation or gender identity last year, according to data provided to The 74 by its civil rights office. That's a significant increase from previous years, including 2021, when federal officials received 249 such complaints. Under the Trump administration, complaints dwindled: in 2018, the Education Department received just 57 complaints related to sexual orientation or gender identity discrimination.

The increase in discrimination allegations involving sexual orientation or gender identity is part of a broader surge in civil rights complaints, according to data obtained by The New York Times. The total number of complaints for 2021-22 grew to 19,000, a historic high and more than double the previous year's total.

In September, The 74 revealed that Gaggle had donated $25,000 to The Trevor Project, the nonprofit that released the recent youth survey and whose advocacy is focused on suicide prevention among LGBTQ youth. The arrangement was framed on Gaggle's website as a collaboration to "improve mental health outcomes for LGBTQ young people."

The revelation was met with swift backlash on social media, with multiple Trevor Project supporters threatening to halt future donations. Within hours, the group announced it had returned the donation, acknowledging concerns about Gaggle "having a role in negatively impacting LGBTQ students."

The Trevor Project didn't respond to requests for comment on Gaggle's decision to pull certain LGBTQ-specific keywords from its systems.

In a statement to The 74, Gaggle spokesperson Paget Hetherington said the company regularly modifies the keywords its software uses to trigger a human review of students' digital communications. Certain LGBTQ-specific words, she said, are no longer relevant to the 24-year-old company's efforts to protect students from abuse and were purged late last year.

"At points in time in the not-too-distant past, those words were weaponized by bullies to harass and target members of the LGBTQ+ community, so as part of an effective methodology to combat that discriminatory harassment and violence, those words were once effective tools to help identify dangerous situations," Hetherington said. "Thankfully, over the past two decades, our society evolved and began a period of widespread acceptance, especially among the K-12 student population that Gaggle serves. With that evolution and acceptance, it has become increasingly rare to see those words used in the negative, harassing context they once were; hence, our decision to take these off our word/phrases list."

Hetherington said Gaggle will continue to monitor students' use of the words "faggot," "lesbo," and others that are "commonly used as slurs." A previous review by The 74 found that Gaggle regularly flagged students for harmless speech, like profanity in fictional articles submitted to a school's literary magazine, and for students' private journals.

Privacy advocates warn that in the era of "Don't Say Gay" laws and abortion bans, information gleaned from Gaggle and similar services could be weaponized against students.

Gaggle executives have minimized privacy concerns and claim the tool saved more than 1,400 lives last school year. That statistic hasn't been independently verified, and there's a dearth of research to suggest digital monitoring is an effective school-safety tool. Still, a recent survey found a majority of parents and teachers believe the benefits of student monitoring outweigh privacy concerns. A Vice News documentary included the perspective of a high school student who was flagged by Gaggle for writing a paper titled "Essay on the Reasons Why I Want to Kill Myself but Can't/Didn't." Adults wouldn't have known she was struggling without Gaggle, she said.

"I do think that it's helpful in some ways," the student said, "but I also kind of think that it's, I wouldn't say an invasion of privacy, but if obviously something gets flagged and a person who it wasn't intended for reads through that, I think that's kind of uncomfortable."

Student surveillance critic Evan Greer, director of the nonprofit digital rights group Fight for the Future, said the tweaks to Gaggle's keyword dictionary are unlikely to have a significant effect on LGBTQ teens and blasted the company's stated justification for the move as "out of touch" with the state of anti-LGBTQ harassment in schools. Greer also said that LGBTQ youth frequently refer to each other using "reclaimed slurs," reappropriating words that are generally considered derogatory and that remain in Gaggle's dictionary.

"This is just like lipstick on a pig, no offense to pigs, but I don't see how this actually in any meaningful way mitigates the potential for this software to nonconsensually out LGBTQ students to administrators," Greer said. "I don't see how it prevents the software from being used to invade the privacy of students in a wide range of other circumstances."

Gaggle and its competitors, including GoGuardian, Securly and Bark, have faced similar scrutiny in Washington. In April, Democratic Sens. Elizabeth Warren and Ed Markey argued in a report that the tools could be misused to discipline students and warned they could be used disproportionately against students of color and LGBTQ youth.

Jeff Patterson, Gaggle founder and CEO

In response, Gaggle founder and CEO Jeff Patterson said the company cannot test the potential for bias in its system because the software flags student communications anonymously and the company has "no context or background on students," including their race or sexual orientation. The company also said its monitoring services are not meant to be used as a disciplinary tool.

In the survey released last summer by the Center for Democracy and Technology, however, 78% of teachers reported that digital monitoring tools were used to discipline students. Black and Hispanic students reported being far more likely than white students to get into trouble because of online monitoring. 

In October, the White House cautioned school districts against the "continuous surveillance" of students if monitoring tools are likely to trample students' rights. It also directed the Education Department to issue guidance to districts on the safe use of artificial intelligence. The guidance is expected to be released early this year.

Evan Greer (Twitter/@evan_greer)

As an increasing number of districts implement Gaggle for bullying prevention efforts, surveillance critic Greer said the company has failed to consider how adults can cause harm.

"There is now a very visible far-right movement attacking LGBTQ kids, and particularly trans kids and teenagers," Greer said. "If anything, queer kids are more in the crosshairs today than they were a year ago or two years ago, and that's why this surveillance is so dangerous."

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741. For LGBTQ mental health support, contact The Trevor Project's toll-free support line at 866-488-7386.

Trevor Project Severs Ties with Surveillance Company Accused of LGBTQ Youth Bias
Fri, 30 Sep 2022

Updated 3:15 p.m. ET

Hours after the publication of this article Friday, The Trevor Project announced in a tweet it would return a $25,000 donation from the student surveillance company Gaggle, acknowledging widespread concerns about the monitoring tool's "role in negatively impacting LGBTQ students."

"Our philosophy is that having a seat at the table enables us to positively influence how companies engage with LGBTQ young people, and we initially agreed to work with Gaggle because we saw an opportunity to have a meaningful impact to better protect LGBTQ students," the nonprofit said in the statement. "We hear and understand the concerns, and we hope to work alongside schools and institutions to ensure they are appropriately supporting LGBTQ youth and their mental health."

The move came after widespread condemnation on social media, with multiple supporters threatening to pull their donations to The Trevor Project moving forward. 

In a Friday statement, Gaggle spokesperson Paget Hetherington said the company wanted The Trevor Project's "guidance on how to do what we do better." The company also removed language from its website where it previously touted the partnership.

"We're disappointed that The Trevor Project has decided to pause our collaboration," she said. "However, we are grateful for the opportunity we have had to learn and work with them and will continue with our mission of protecting all students regardless of how they identify."

Original report below:

Amid warnings from lawmakers and civil rights groups that digital surveillance tools could discriminate against at-risk students, a leading nonprofit devoted to the mental well-being of LGBTQ youth has formed a financial partnership with a tech company that subjects them to persistent online monitoring. 

The Trevor Project, a high-profile nonprofit focused on suicide prevention among LGBTQ youth, began to list Gaggle as a corporate partner on its website, disclosing that the controversial surveillance company had given it between $25,000 and $50,000 in support. Meanwhile Gaggle, which uses artificial intelligence and human content moderators to sift through billions of student chat messages and homework assignments each year in search of students who may harm themselves or others, promoted the relationship on its own website, noting the two were collaborating to "improve mental health outcomes for LGBTQ young people."

Though the precise contours of the partnership remain unclear, a Trevor Project spokesperson said it aims to have a positive influence on the way Gaggle navigates privacy concerns involving LGBTQ youth while a Gaggle representative said the company sees the relationship as a learning opportunity.

Both groups maintain that the partnership was forged in the interests of LGBTQ students, but student privacy advocates argue the relationship could undermine The Trevor Project鈥檚 work while allowing Gaggle to use the donation to counter criticism about its potential harms to LGBTQ students. The collaboration comes at a particularly perilous time for many students as a rash of states implement new anti-LGBTQ laws that could erode their privacy and expose them to legal jeopardy. 

Teeth Logsdon-Wallace, a 14-year-old student from Minneapolis with first-hand experience of Gaggle鈥檚 surveillance dragnet, said the deal could eliminate any motivation for Gaggle to change its business practices. 

"It really does feel like a 'We paid you, now say we're fine' kind of thing," said Logsdon-Wallace, who is transgender. Without any real incentives to implement reforms, he said that Gaggle's "seal of approval" from The Trevor Project could offer the privately held company reputational cover amid growing concerns that such surveillance tech is disproportionately harmful to LGBTQ youth.

"People who want to defend Gaggle can just point to their little Trevor Project thing and say, 'See, they have the support of "The Gays" so it's fine actually,' and all it does is make it easier to deflect and defend actual issues with Gaggle."

A screenshot showing that Gaggle is a corporate partner of The Trevor Project
Student surveillance company Gaggle is listed among 鈥淐orporate Partners鈥 on The Trevor Project鈥檚 website (screenshot)

The 74 has previously investigated Gaggle's monitoring practices. Gaggle's algorithm relies on keyword matching to compare students' online communications against a dictionary of thousands of words the company believes could indicate potential trouble, including references to violence, drugs and sex. Among the keywords are "gay" and "lesbian," verbiage the company maintains is necessary because LGBTQ youth are more likely than their straight and cisgender peers to consider suicide.
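The dictionary-based flagging described above amounts to context-free keyword matching. The sketch below is an illustrative reconstruction, not Gaggle's actual code: the word list, function name and whole-word matching rule are all hypothetical stand-ins.

```python
import re

# Hypothetical stand-in for a flagging dictionary; Gaggle's real list is
# proprietary and contains thousands of words.
FLAGGED_KEYWORDS = {"suicide", "overdose", "weapon"}

def flag_message(text: str) -> set[str]:
    """Return the flagged keywords found in a student's message.

    Matching is whole-word and case-insensitive, with no awareness of
    context -- so an essay, a novel or a journal entry can trigger the
    same alert as a genuine cry for help.
    """
    words = set(re.findall(r"[a-z']+", text.lower()))
    return words & FLAGGED_KEYWORDS

# Any non-empty result would be routed to a human moderator for review.
hits = flag_message("My essay discusses suicide prevention programs.")
```

Because matching ignores intent, this design explains the false positives the reporting describes, such as profanity in literary-magazine fiction being flagged alongside real crises.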

But privacy and civil rights advocates have accused the company of discrimination for subjecting LGBTQ youth to heightened surveillance, a concern that has taken on new meaning this year as states like Florida adopt laws that ban classroom discussions about sexuality and require school officials to out LGBTQ youth to their parents.

A survey by the nonprofit Center for Democracy and Technology found that while Gaggle and similar student monitoring tools are designed to keep students safe, teachers reported that the tools were more often used to discipline them. LGBTQ youth were disproportionately affected.

In a statement, a Trevor Project spokesperson said it's important that digital monitoring tools keep students safe without invading their privacy and that the collaboration was built on Gaggle's "desire to identify and address privacy and safety concerns that their product could cause for LGBTQ students."

"It's true that LGBTQ youth are among the most vulnerable to the misuse of this kind of safety monitoring: many worry that these tools could out them to teachers or parents against their will," the statement continued. "It is because of that very real concern that we have worked in a limited capacity with digital safety companies, to play an educational role and have a seat at the table so they can consider these potential risks while they design their products and develop policies."

But it remains unclear what policy changes have occurred at Gaggle as a result of the deal. Without offering any specifics, Gaggle spokesperson Paget Hetherington said in a statement that the company is "honored to be able to align with The Trevor Project to better serve LGBTQ youth" and is "always looking for ways to learn and to improve upon what we do to better support students and keep them safe."

'Faceless bureaucracy'

At its core, the partnership between Gaggle and The Trevor Project makes sense because both work to prevent youth suicides, said Amelia Vance, a student privacy expert and founder and president of a privacy-focused organization. But their approaches to solving the problem, she said, are fundamentally different.

By combing through digital materials on students' school-issued Microsoft and Google accounts, Gaggle seeks to alert educators, and in some cases the police, to online behaviors suggesting students might harm themselves or others.

"It really is about collecting details that kids may not be voluntarily sharing, information that they may be looking up to learn, to explore their identities, to otherwise help them in their day-to-day lives," Vance said. At The Trevor Project, "you have proactive outreach from youth who know that they need help or they need a community."

Katy Perry smiles in front of a Trevor Project background, holding a poster that says "Be proud of who you are."
Katy Perry poses for a photograph during a fundraising event for The Trevor Project in 2012. (Mark Davis/Getty Images for Trevor Project)

The West Hollywood-based Trevor Project, which receives funding from corporate sponsors including Macy's and AT&T, was founded in 1998. Gaggle, founded in 1999, does not publicly report its finances. The Dallas-based company says it monitors the digital communications of more than 5 million students across more than 1,500 school districts nationally.

The Trevor Project uses technology to train volunteer crisis counselors and to assess the risk levels of people who reach out to it for help. If counselors with The Trevor Project believe a student is at imminent suicide risk, they may call the police. But it's ultimately up to youth to decide which information they share with adults.

It's important for LGBTQ students to have trusted adults in whom they can confide, Vance said, rather than a system where "some faceless bureaucracy is finding out and informing your parents" about information they intended to keep private.

A recent survey by The Trevor Project offers troubling data about the realities of the youth suicide crisis. Nearly half of LGBTQ youth said they seriously considered attempting suicide in the past year, and 14% said they made a suicide attempt.

This isn't the first time The Trevor Project has faced scrutiny in recent months for its ties to companies that could have detrimental effects on LGBTQ youth. In July, a HuffPost investigation revealed that CEO and Executive Director Amit Paley previously worked as a McKinsey consultant and helped create a strategic plan to boost opioid sales amid an addiction epidemic, one that has been linked to suicide attempts among LGBTQ youth.

The group knows firsthand how data can be weaponized. Just last month, online trolls who target the transgender community launched a campaign to clog up The Trevor Project's suicide prevention hotline.

Persistent student surveillance could exacerbate the challenges that LGBTQ youth face by subjecting them to disproportionate discipline and erroneously flagging their online communications as threats, Democratic Sens. Elizabeth Warren and Ed Markey warned in an April report.

Nearly a third of LGBTQ students say they or someone they know has experienced the nonconsensual disclosure of their sexual orientation or gender identity, typically called "outing," due to student activity monitoring, according to the survey by the nonprofit Center for Democracy and Technology. They were also more likely than their straight and cisgender peers to report getting into trouble at school and being contacted by the police about having committed a crime.

A bar chart showing LGBTQ+ students are more likely to get in trouble for visiting a website or saying something inappropriate online; were more likely to be contacted by counselors or other adults at school about their mental health; and were more likely to be contacted by a police officer or other adult due to concerns about them committing a crime.
A recent survey by the nonprofit Center for Democracy and Technology found that student monitoring tools have disproportionate negative effects on LGBTQ youth. (Center for Democracy and Technology) 

In response to the survey results, a coalition of civil rights groups called on the U.S. Education Department to condemn the use of activity monitoring tools that violate students' civil liberties and to state its intent "to take enforcement action against violations that result in discrimination." The letter argues that using the tools to out LGBTQ students or to subject them to disproportionate discipline and criminal investigations could violate Title IX, the federal law prohibiting sex-based discrimination in schools.

Among the letter's signatories is the nonprofit LGBT Tech, which has warned about the harms of digital surveillance on LGBTQ people. Christopher Wood, the group's co-founder and executive director, said The Trevor Project's partnership with Gaggle could be positive if it's used to ensure that LGBTQ youth who are struggling have access to help. But once Gaggle gives student information to school administrators, the company can no longer control how those records are used, he said.

A screenshot from Gaggle's website. Gray box with text that says Gaggle is a Proud Sponsor of The Trevor Project.
Gaggle says on its website that the student surveillance company 鈥渋s proud to collaborate with The Trevor Project and improve mental health outcomes for LGBTQ young people.鈥 (Screenshot)

"If that information is provided to someone who is not accepting, who has very different views and who willfully brings their political, personal or religious views into the school system, and they are not supportive of LGBTQ youth, then what they've done is harm the student," Wood said.

Yet as schools increasingly turned to student activity monitoring software during the pandemic, The Trevor Project portrayed the tools' growth as an inevitable result of districts seeking "to avoid liability issues."

"It is our stance that since these tools are not going anywhere, we think it's important to do our part to offer our expertise around LGBTQ experiences," the spokesperson said.

A student holds up a peace sign with one hand and has the other wrapped around his dog
Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

The power of trust

In interviews, students flagged by Gaggle said their trust in adults suffered as a result. Among them is Logsdon-Wallace, the 14-year-old transgender student. Before the Minneapolis school district stopped using Gaggle this summer and state lawmakers put strict limits on digital surveillance in schools, the tool alerted district security when he used a classroom assignment to reflect on a previous suicide attempt and how music therapy helped him cope. That same assignment, which included references to his gender identity, was flagged to his parents. 

And while his parents are affirming, he has friends who live in less supportive environments.                                                                                                       

"I have friends who are queer and/or trans who are out at school but not to their parents," he said. "If they want to be open with teachers, Gaggle can create a bad or even dangerous situation for these kids if their parents were contacted about what they were saying."

In The Trevor Project's recent survey, nearly three-quarters of LGBTQ youth reported that they have endured discrimination based on their sexual orientation or gender identity, while just 37% said their homes are affirming and 55% said the same about their schools.

Given that reality, relatively few LGBTQ students reported sharing information about their sexual orientation with teachers or guidance counselors.

While Gaggle has maintained that keywords like "gay" and "lesbian" can also prevent bullying, Logsdon-Wallace said the company's approach is out of touch with how students generally interact. At school, he said, he's been called just about every "slur for a queer or a trans person that isn't from like 80 years ago." While slurs are common, terms like "lesbian" are not.

"As an actual teenager going to an actual public school, those words are not being used to bully people," he said. "They're just not."


With 'Don't Say Gay' Laws & Abortion Bans, Student Surveillance Raises New Risks
Thu, 08 Sep 2022

While growing up along the Gulf Coast in Mississippi, Kenyatta Thomas relied on the internet and other teenagers to learn about sex.

Thomas and their peers watched videos during high school gym class that stressed the importance of abstinence, and the horrors that can come from sex before marriage. But for Thomas, who is bisexual and nonbinary, the lessons didn't explain who they were as a person.

"It was very confusing trying to navigate understanding who I am and my identity," said Thomas, now a student at Arizona State University. It was on the internet that Thomas learned about a whole community of young people with similar experiences. Blog posts on Tumblr helped them make sense of their place in the world and what it meant to be bisexual. "I was able to find the words to understand who I am, words that I wouldn't be able to piece together in a sentence if the internet wasn't there."




But now, as states adopt anti-LGBTQ laws and abortion bans, the digital footprint that Thomas and other students leave may come back to harm them, privacy and civil rights advocates warn, and it could be their school-issued devices that end up exposing them to that legal peril.

For years, schools across the U.S. have used digital surveillance tools that collect a trove of information about youth sexuality, intimate details gleaned from students' conversations with friends, diary entries and search histories. Meanwhile, student information collected by surveillance companies is regularly shared with police, according to a recent survey conducted by the nonprofit Center for Democracy and Technology. These two realities concern Elizabeth Laird, the center's director of equity in civic technology. Following the Supreme Court's repeal of Roe v. Wade in June, she said, information about youth sexuality could be weaponized.

"Right now, without doing anything, schools may be getting alerts about students who are searching the internet for resources related to reproductive health," Laird said. "If you are in a state that has a law that criminalizes abortion, right now this tool could be used to enforce those laws."

Teens across the country are already working to fill the void for themselves and their peers in the current climate. Thomas, the ASU student and an outspoken reproductive justice activist, said that while students are generally aware that school devices and accounts are monitored, the repeal of Roe has led some to take extra privacy precautions.

Kenyatta Thomas, an Arizona State University student and activist, participates in an abortion-rights protest. (Photo courtesy Kenyatta Thomas)

"I have switched to using Signal to talk to friends and colleagues in this space," they said, referring to the encrypted messaging app. "The fear, even though it's been common knowledge for basically my generation's entire life that everything you do is being surveilled, it definitely has been amplified tenfold."

Police have long used social media and other online platforms to investigate people for breaking abortion rules, including a Nebraska case in which police obtained a teen's private Facebook messages through a search warrant before charging the then-17-year-old and her mother with violating the state's ban on abortions after 20 weeks of pregnancy.

LGBTQ students face similar risks as lawmakers in Florida and elsewhere impose rules that prohibit classroom discussions about sexuality and gender. This year alone, lawmakers have proposed 300 anti-LGBTQ bills, and about a dozen have become law. They include so-called "Don't Say Gay" laws in Florida and Alabama that ban classroom discussions about gender and sexuality and require school officials to tell the parents of children who share that they may be gay or transgender.

In a survey, a fifth of LGBTQ students told the Center for Democracy and Technology that they or another student they knew had their sexual orientation or gender identity disclosed without their consent due to online student monitoring. They were more likely than straight and cisgender students to report getting into trouble for their web browsing activity and to be contacted by the police about having committed a crime. 

LGBTQ youth are nearly twice as likely as their straight and cisgender classmates to search for health information online, according to the nonprofit LGBT Tech. But as anti-LGBTQ laws proliferate, student surveillance companies should reconsider collecting data about youth sexuality, Christopher Wood, the group's co-founder and executive director, told The 74.

"Right now, we are not in a landscape or an environment where that is safe for a company to be doing," Wood said. "If there is a remote possibility that the information that they are trying to provide to help a student could potentially lead them into more harm, then they need to be looking at that very carefully and considering whether that is the appropriate direction for a company to be taking."

Digital student monitoring tools have a negative disparate impact on LGBTQ youth, according to a recent student survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

'Extraordinarily concerned'

For decades, federal law has required school technology to block access to images that are obscene, child pornography or deemed "harmful to minors," and schools have used web-filtering software to prevent students from accessing sexually explicit content. But in some cases, the filtering has gone further, blocking pro-LGBTQ websites that aren't explicit, including those that offer crisis counseling.

Many student monitoring tools, which saw significant growth during the pandemic, go far beyond web filtering and employ artificial intelligence to track students across the web to identify issues like depression and violent impulses. The tools can sift through students鈥 social media posts, follow their digital movements in real time and scan files on school-issued laptops 鈥 from classroom assignments to journal entries 鈥 in search of warning signs. 

They鈥檝e also come under heightened scrutiny. In a report this year, Democratic Sens. Elizabeth Warren and Ed Markey warned that schools鈥 widespread adoption of the tools could trample students鈥 civil rights. By flagging words related to sexual orientation, the report notes, LGBTQ youth could be subjected to disproportionate disciplinary rates and be unintentionally outed to their parents. 

In a letter in July, Warren and Markey cautioned that the tools could pose new risks following the repeal of Roe and asked four leading student surveillance companies — GoGuardian, Gaggle, Securly and Bark — whether they flag students for using keywords related to reproductive health, such as "pregnant" and "abortion."

"We are extraordinarily concerned that your software could result in punishment or criminalization of students seeking contraception, abortion or other reproductive health care," Markey and Warren wrote. "With reproductive rights under attack nationwide, it would represent a betrayal of your company's mission to support students if you fail to provide appropriate protections for students' privacy related to reproductive health information."

Student activity monitoring tools are more often used to discipline students than protect them from violence and mental health crises, according to a recent teacher survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

The scrutiny is part of a larger concern over digital privacy in the post-Roe world. In August, the Federal Trade Commission sued a data broker and accused the company of selling the location data from hundreds of millions of cell phones that could be used to track people's movements. Such precise location data, the agency said, "may be used to track consumers to sensitive locations, including places of religious worship, places that may be used to infer an LGBTQ+ identification, domestic abuse shelters, medical facilities and welfare and homeless shelters." 

School surveillance companies have acknowledged their tools track student references to sex but sought to downplay the risks they pose to students. Bark spokesperson Adina Kalish said the company began to immediately purge all data related to reproductive health after a leaked Supreme Court draft opinion suggested Roe's repeal was imminent — despite maintaining a 30-day retention period for most other data. 

"By immediately and permanently deleting data which contains a student's reproductive health data or searches for reproductive health information, such data is not in our possession and therefore not produce-able under a court order, subpoena, etc.," Bark CEO Brian Bason wrote in a response to the senators, which the company shared with 社区黑料. 

GoGuardian spokesperson Jeff Gordon said its tools "cannot be used by educators or schools to flag reproductive health-related search terms" and its web filter cannot "flag reproductive health-related searches." Securly didn't respond to requests for comment. Last year its web-filtering tool categorized health resources for LGBTQ teens as pornography. 

Gaggle founder and CEO Jeff Patterson wrote to the senators that his company does not "collect health data of any kind including reproductive health information," specifying that the monitoring tool does not flag students who use the terms "pregnant, abortion, birth control, contraception or Planned Parenthood."

Yet tracking conversations about sex is a primary part of Gaggle's business — prompting more flags than references to suicide, violence or drug use, according to nearly 1,300 incident reports generated by the company for Minneapolis Public Schools during a six-month period in 2020. The reports, obtained by 社区黑料, showed that 38% were prompted by content that was pornographic or sexual in nature, including references to "sexual activity involving a student." Students were regularly flagged for using keywords like "virginity," "rape," and, simply, "sex." 

Patterson, the Gaggle CEO, has acknowledged that a student's private diary entry about being raped wasn't off limits. In touting the tool's capabilities, he told 社区黑料 his company uncovered the girl's diary entry, where she discussed how the assault led to self-esteem issues and guilt. Nobody knew she was struggling until Gaggle notified school officials about what they'd learned from her diary, Patterson said. 

"They were able to intervene and get this girl help for things that she couldn't have dealt with on her own," Patterson said.

Any information that surveillance companies collect about students' sexual behaviors could be used against them by police during investigations, privacy experts warned. And it's unclear, Laird said, how long the police can retain any data gleaned from the tools. 

'Don't Say Gay'

Internet search engines are "particularly potent" tools to track the behaviors of pregnant people, according to a report by the nonprofit Surveillance Technology Oversight Project. In 2017, for example, a woman was charged with second-degree murder of her stillborn fetus after police scoured her browser history and identified a search for an abortion pill. 

While GoGuardian and other companies offer web filtering to schools, Gaggle has sought to differentiate itself. In his letter to the senators, Patterson said the company — which sifts through files and chat messages on students' school-issued Microsoft and Google accounts — is not a web filter and therefore "does not track students' online searches." Yet Patterson's assurance to lawmakers appears misleading. The company acknowledges on its website that it partners with several web-filtering companies, including Linewize, to analyze students' online searches. By working in tandem, flags triggered by Linewize's web filtering "can be sent straight to the Gaggle Safety Team," which determines whether the material "should be forwarded to the school or district." 

In an email, Gaggle spokesperson Paget Hetherington said that in "a very small number of school systems," the company reviews alerts from web filters before they're sent to school officials to "alleviate the large number of false positives" and ensure that "only the most critical and imminent issues are being seen by the district." 

Gaggle has also faced scrutiny for including LGBTQ-specific keywords in its algorithm, including "gay" and "lesbian." Patterson said the heightened surveillance of LGBTQ youth is necessary because they face a disproportionately high suicide rate, and Hetherington shared examples where the keywords were used to spot cyberbullying incidents. 

But critics have accused the company of discrimination. Wood of the nonprofit LGBT Tech said that anti-LGBT activists have used surveillance to target their opponents for generations. Prior to the seminal 1969 riots after New York City police raided the Stonewall Inn gay bar, police routinely surveilled LGBTQ spaces and made arrests for "inferring sexual perversion" and "serving gay people." From the colonial era and into the 19th century, anti-sodomy laws carried the death penalty and police used the rules to investigate and incarcerate people suspected of same-sex intimate behaviors. 

Now, in the era of "Don't Say Gay" laws, digital surveillance tools could be used to out LGBTQ students and put them in danger, Wood said. Student surveillance companies can claim their decision to include LGBTQ terminology is designed to help students, but historically such data have "been used against us in very detrimental ways." 

Companies, he said, are unable to control how officials use that information in an era "where teachers and administrators and other students are encouraged to out other students or blame them or somehow get them in trouble for their identity." In Texas, Republican Gov. Greg Abbott issued a directive calling on child protective services to investigate as child abuse any parents who provide gender-affirming health care to their transgender children. 

"They can't control what's going to happen in Florida or Texas and they can't control what's going to happen in an individual home," where students could be subjected to abuse, Wood said. "Any person in their right mind would be horrified to learn that it was their technology that ended up harming a youth or driving a youth to the point of feeling so isolated that they felt the only way out was suicide." 

When private thoughts become public

Susan, a 14-year-old from Cincinnati, knows firsthand how surveillance companies can target students for discussing their sexuality. In middle school, she was assigned to write a "time capsule" letter to her future self. 

Her teacher said that no one — not even him — would read the letter until Susan retrieved it after high school graduation. So Susan, who is now a freshman and asked to remain anonymous, used the private space to question her gender identity. 

But her teacher's assurance wasn't quite true, she learned. Someone had been reading the letter — and would soon hold it against her. 

In an automated May 2021 email, Gaggle notified her that the letter to her future self was "identified as inappropriate" and urged her to "refrain from storing or sharing inappropriate content." In a "second warning" sent to her inbox, she was told a school administrator was given "access to this violation." After a third alert, she said, access to her school email account was restricted. She said the experience left her with "a sense of betrayal from my school." She said she had no idea words like "gay" or "sex" could get flagged by Gaggle's algorithm.

Susan, a student from Cincinnati, received an email alert from Gaggle notifying her that her classroom assignment, a "time capsule" letter to her future self, had been "identified as inappropriate." (Courtesy Susan)

"It's frustrating to know that this program finds the need to have these as keywords, and quite depressing," she said. "There's always going to be oppression against the community somewhere, it seems, and it's quite disheartening." 

School administrators reviewed the time capsule letter and determined it didn't contain anything inappropriate, her mother Margaret said. While Susan lives in an LGBTQ-affirming household, Thomas, who grew up in Mississippi, warned that's not the case for everyone.

"That's not just the surveillance of your activities, that's the surveillance of your thoughts," Thomas said of Susan's experience. "I know that wouldn't have gone very well for me and I know for a lot of young people that would place them in a lot of danger."

Such harms could be exacerbated, Margaret said, if authorities use student data to enforce Ohio's strict abortion ban, which has already become the subject of national debate after a 10-year-old girl traveled to Indiana for an abortion. A 27-year-old man was arrested and accused of raping the child. 

Cincinnati Public Schools spokesman Mark Sherwood said in an email that "law enforcement is immediately contacted" if the district receives an alert from Gaggle suggesting that a student poses "an imminent threat of harm to self or others." 

Given the state of abortion rules in Ohio, Susan said she's concerned that student conversations and classroom assignments that discuss gender and sexuality could wind up in the hands of the police. She lost faith in school-issued technology after her assignment got flagged by Gaggle. 

"I just flat out don't trust adults in positions of power or authority," Susan said. "You don't really know for sure what their true motives are or what they could be doing with the tools they have at their disposal."

FTC Targets Ed Tech Companies that 'Illegally Surveil Children'
/article/ftc-announces-plan-to-target-ed-tech-tools-that-illegally-surveil-children/
Fri, 20 May 2022

The Federal Trade Commission announced ramped-up enforcement of education technology companies that sell student data for targeted advertising and that "illegally surveil children when they go online to learn," in violation of federal student privacy rules.

"It is against the law for companies to force parents and schools to surrender their children's privacy rights in order to do schoolwork online or attend class remotely," the federal agency said in a media release Thursday. "Under the federal Children's Online Privacy Protection Act (COPPA), companies cannot deny children access to educational technologies when their parents or school refuse to sign up for commercial surveillance." 



Through a policy statement, the commission signaled its intent to "scrutinize compliance" with COPPA, the federal law that limits the data that technology companies can collect on children under 13 without parental consent. The statement, approved through a unanimous bipartisan vote by the five commissioners, reminds education technology companies that they are prohibited from using student data for commercial purposes, including for marketing and advertising, should not retain student data for a period longer than what's deemed "reasonably necessary," and must have sufficient security to ensure data remain confidential. Additionally, tech companies must not exclude students who do not disclose more personal information "than is reasonably necessary for the child to participate in that activity." 

The policy statement comes at a critical moment for education technology companies. When the pandemic shuttered schools nationally and forced children into remote learning, their place in the education landscape grew exponentially as educators relied more heavily on their services. But they've also faced scrutiny for their data collection practices, particularly in the wake of high-profile breaches. Schools recently notified students that their personal data was compromised in a breach at the company Illuminate Education. The hack exposed the personal information of students in New York City, the nation's largest school district.

The FTC statement does not introduce any new rules, yet it makes clear that education technology and student privacy are an enforcement priority. Weak enforcement of student privacy rules has been a longstanding problem, said Cody Venzke, senior counsel at the nonprofit Center for Democracy and Technology.

Suggesting that the federal government had gone too easy on ed tech companies in the past, President Joe Biden criticized student surveillance practices on Thursday and signaled his support for greater student privacy protections. 

"When children and parents access online educational products, they shouldn't be forced to accept tracking and surveillance to do so," Biden said in a statement. The FTC, he said, "will be cracking down on companies that persist in exploiting our children to make money." 

Among the services and applications that saw significant growth during the pandemic are those that monitor students' online activities on school-issued devices and technology. Company executives say their digital products are critical to identify youth who are at risk of harming themselves or others, but critics argue the surveillance violates students' privacy rights. 

社区黑料 has reported extensively on the expanding presence of such student surveillance companies, including Gaggle, which sifts through billions of student communications on school-issued Google and Microsoft accounts each year in search of references to violence and self-harm. Company executives say the tools save lives, but critics argue they could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars.

In one recent story, former content moderators on the front lines of Gaggle's student monitoring efforts raised significant questions about the company's efficacy and its effects on students' civil rights. The former moderators reported insufficient safeguards to protect students' sensitive data, a work culture that prioritized speed over quality, limited training and frequent exposure to explicit content that left some traumatized. 

In a statement, FTC Chair Lina Khan said that "commercial surveillance cannot be a condition of doing schoolwork." 

"Though widespread tracking, surveillance and expansive use of data across contexts have become increasingly common practices across the broader economy," Khan said, the policy makes clear that federal law "forbids companies from wholesale extending these practices into the context of schools and learning." 

The FTC's comments on surveillance, Venzke said in an email, suggest that the agency will scrutinize the practices of education technology vendors that collect "troves of sensitive information about students' lives, including student activity monitoring software vendors." 

"Student activity monitoring companies must ensure they are taking appropriate steps to not only secure the sensitive data they collect on students, but also to ensure that they are collecting only the absolute minimum data that they need to achieve a legitimate educational purpose — and then that they delete the data when it is no longer needed," Venzke said.

A Gaggle spokesperson didn't immediately respond to a request for comment. In a blog post on Thursday, the company noted that it takes "data security very seriously," only uses student information for educational purposes, has a strict data retention policy and has comprehensive security standards. The post said the company does not sell student data or engage in targeted advertising. 

Numerous companies have faced fines in recent years for violating the federal privacy law. In 2019, for example, YouTube paid to settle allegations it collected children's data without parental consent and used it for targeted advertising; another company settled similar allegations that same year. 

Amelia Vance

Despite the commission's harsh critique of surveillance, the enforcement of student privacy rules will likely go beyond companies that monitor students online, said attorney Amelia Vance, the co-founder and president of Public Interest Privacy Consulting. She interpreted the FTC announcement to broadly encompass "surveillance capitalism," where personal data are collected and sold for profit. However, she noted that Gaggle and other monitoring companies could have particular problems. In its announcement, the FTC said it is unreasonable for education technology companies to retain student data "for speculative future potential purposes."

"So much of the monitoring information collected and kept, especially when it comes to tracking the mental health of students, it could easily, arguably be speculative," she said. "That could cause confusion from companies about what obligations they have to either collect certain data or not collect certain data or not retain certain data even when the school has asked for it." 

The FTC announcement follows a recent investigation into student monitoring companies by Democratic Sens. Elizabeth Warren and Ed Markey, which warned of surveillance companies' potential harms and called on the Federal Communications Commission to clarify the provisions of another federal law, the Children's Internet Protection Act, which requires schools to monitor students' online activities.

In response to the FTC statement, a bipartisan group of senators cautioned that threats to online privacy have reached "a crisis point." 

"We applaud the FTC's attention to this urgent problem and its acknowledgment that a child's education should never come at the expense of their privacy," said a statement released by Markey, fellow Democratic Sen. Richard Blumenthal and Republican Sens. Bill Cassidy and Cynthia Lummis. "The FTC's policy statement is an important step in the right direction, but it is not a replacement for legislative action."

Meet the Gatekeepers of Students' Private Lives
/article/meet-the-gatekeepers-of-students-private-lives/
Mon, 02 May 2022

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.

Megan Waskiewicz used to sit at the top of the bleachers, rest her back against the wall and hide her face behind the glow of a laptop monitor. While watching one of her five children play basketball on the court below, she knew she had to be careful. 

The mother from Pittsburgh didn't want other parents in the crowd to know she was also looking at child porn.

Waskiewicz worked as a content moderator for Gaggle, a surveillance company that monitors the online behaviors of some 5 million students across the U.S. on their school-issued Google and Microsoft accounts. Through an algorithm designed to flag references to sex, drugs, and violence and a team of content moderators like Waskiewicz, the company sifts through billions of students' emails, chat messages and homework assignments each year. Their work is supposed to ferret out evidence of potential self-harm, threats or bullying, incidents that would prompt Gaggle to notify school leaders and, in the most serious cases, the police.

As a result, kids' deepest secrets — like nude selfies and suicide notes — regularly flashed onto Waskiewicz's screen. Though she felt "a little bit like a voyeur," she believed Gaggle helped protect kids. But mostly, the low pay, the fight for decent hours, inconsistent instructions and stiff performance quotas left her feeling burned out. Gaggle's moderators face pressure to review 300 incidents per hour, and Waskiewicz knew she could get fired on a moment's notice if she failed to distinguish mundane chatter from potential safety threats in a matter of seconds. She lasted about a year.

"In all honesty I was sort of half-assing it," Waskiewicz admitted in an interview with 社区黑料. "It wasn't enough money and you're really stuck there staring at the computer reading and just click, click, click, click."

Content moderators like Waskiewicz, hundreds of whom are paid just $10 an hour on month-to-month contracts, are on the front lines of a company that claims it saved the lives of 1,400 students last school year and argues that the growing mental health crisis makes its presence in students' private affairs essential. Gaggle founder and CEO Jeff Patterson has warned about "a tsunami of youth suicide headed our way" and said that schools have "a moral obligation to protect the kids on their digital playground." 

Eight former content moderators at Gaggle shared their experiences for this story. While several believed their efforts in some cases did shield kids from serious harm, they also surfaced significant questions about the company's efficacy, its employment practices and its effect on students' civil rights.

Among the moderators who worked on a contractual basis, none had prior experience in school safety, security or mental health. Instead, their employment histories included retail work and customer service, but they were drawn to Gaggle while searching for remote jobs that promised flexible hours. 

They described an impersonal and cursory hiring process that appeared automated. Former moderators reported submitting applications online and never having interviews with Gaggle managers — either in person, on the phone or over Zoom — before landing jobs.

Once hired, moderators reported insufficient safeguards to protect students' sensitive data, a work culture that prioritized speed over quality, scheduling issues that sent them scrambling to get hours and frequent exposure to explicit content that left some traumatized. Contractors lacked benefits including mental health care, and one former moderator said he quit after repeated exposure to explicit material that so disturbed him he couldn't sleep and without "any money to show for what I was putting up with."

Gaggle content moderators encompass as many as 600 contractors at any given time, and just two dozen work as employees who have access to benefits and on-the-job training that lasts several weeks. Gaggle executives have sought to downplay contractors' role with the company, arguing they use "common sense" to distinguish false flags generated by the algorithm from potential threats and do "not require substantial training." 

While the experiences reported by Gaggle's moderator team echo those at platforms like Meta-owned Facebook, Patterson said his company relies on "U.S.-based, U.S.-cultured reviewers as opposed to outsourcing that work to India or Mexico or the Philippines," as some social media companies do. He rebuffed former moderators who said they lacked sufficient time to consider the severity of a particular item.

"Some people are not fast decision-makers. They need to take more time to process things and maybe they're not right for that job," he told 社区黑料. "For some people, it's no problem at all. For others, their brains don't process that quickly."

Executives also sought to minimize the contractors' access to students' personal information; a spokeswoman said they only see "small snippets of text" and lacked access to what's known as students' "personally identifiable information." Yet former contractors described reading lengthy chat logs, seeing nude photographs and, in some cases, coming upon students' names. Several former moderators said they struggled to determine whether something should be escalated as harmful due to "gray areas," such as whether a Victoria's Secret lingerie ad would be considered acceptable or not. 

"Those people are really just the very, very first pass," Gaggle spokeswoman Paget Hetherington said. "It doesn't really need training, it's just like if there's any possible doubt with that particular word or phrase it gets passed on." 

Molly McElligott, a former content moderator and customer service representative, said management was laser focused on performance metrics, appearing more interested in business growth and profit than protecting kids. 

"I went into the experience extremely excited to help children in need," McElligott wrote in an email. Unlike the contractors, McElligott was an employee at Gaggle, where she worked for five months in 2021 before taking a position at the Manhattan District Attorney's Office in New York. "I realized that was not the primary focus of the company."

Gaggle is part of a burgeoning campus security industry that's seen significant business growth in the wake of mass school shootings as leaders scramble to prevent future attacks. Patterson, who founded the company in 1999, said its focus now is mitigating the youth mental health crisis.

Patterson said the team talks about "lives saved" and child safety incidents at every meeting, and they are open about sharing the company's financial outlook so that employees "can have confidence in the security of their jobs."

Content moderators work at a Facebook office in Austin, Texas. Unlike the social media giant, Gaggle's content moderators work remotely. (Ilana Panich-Linsman / Getty Images)

'We are just expendable'

Under the pressure of new federal scrutiny alongside three other companies that monitor students online, Gaggle has said it relies on a "highly trained content review team" to analyze student materials and flag safety threats. Yet former contractors, who make up the bulk of Gaggle's content review team, described their training as "a joke," consisting of a slideshow and an online quiz, that left them ill-equipped to complete a job with such serious consequences for students and schools.

As an employee on the company's safety team, McElligott said she underwent two weeks of training, but the disorganized instruction meant she and other moderators were "more confused than when we started."

Former content moderators have also flocked to employment websites like Indeed.com to warn job seekers about their experiences with the company, often sharing reviews that resembled the former moderators' feedback to 社区黑料.

"If you want to be not cared about, not valued and be completely stressed/traumatized on a daily basis this is totally the job for you," one reviewer wrote on Indeed. "Warning, you will see awful awful things. No they don't provide therapy or any kind of support either.

"That isn't even the worst part," the reviewer continued. "The worst part is that the company does not care that you hold them on your backs. Without safety reps they wouldn't be able to function, but we are just expendable." 

As the first layer of Gaggle's human review team, contractors analyze materials flagged by the algorithm and decide whether to escalate students' communications for additional consideration. Designated employees on Gaggle's Safety Team are in charge of calling or emailing school officials to notify them of troubling material identified in students' files, Patterson said.

Gaggle's staunchest critics have questioned the tool's efficacy and describe it as a student privacy nightmare. In March, Democratic Sens. Elizabeth Warren and Ed Markey called on Gaggle and similar companies to protect students' civil rights and privacy. In a report, the senators said the tools could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars.

The information shared by the former Gaggle moderators with 社区黑料 "struck me as the worst-case scenario," said attorney Amelia Vance, the co-founder and president of Public Interest Privacy Consulting. Content moderators' limited training and vetting, as well as their lack of backgrounds in youth mental health, she said, "is not acceptable."

In its response to lawmakers, Gaggle described a two-tiered review procedure but didn't disclose that low-wage contractors were the first line of defense. CEO Patterson told 社区黑料 they "didn't have nearly enough time" to respond to lawmakers' questions about their business practices and didn't want to divulge proprietary information. Gaggle uses a third party to conduct criminal background checks on contractors, Patterson said, but he acknowledged they aren't interviewed before getting placed on the job.

"There's a lot of contractors. We can't do a physical interview of everyone and I don't know if that's appropriate," he said. "It might actually introduce another set of biases in terms of who we hire or who we don't hire."

'Other eyes were seeing it'

In a previous investigation, 社区黑料 analyzed a cache of public records to expose how Gaggle's algorithm and content moderators subject students to relentless digital surveillance long after classes end for the day, extending schools' authority far beyond their traditional powers to regulate speech and behavior, including at home. Gaggle's algorithm relies largely on keyword matching and gives content moderators a broad snapshot of students' online activities, including diary entries, classroom assignments and casual conversations between students and their friends. 

After the pandemic shuttered schools and shuffled students into remote learning, Gaggle oversaw a surge in students' online materials and in the number of school districts interested in its services. The company grew as educators scrambled to keep a watchful eye on students whose chatter with peers moved from school hallways to instant messaging platforms like Google Hangouts. One year into the pandemic, Gaggle reported a surge in references to suicide and self-harm, accounting for more than 40% of all flagged incidents. 

Waskiewicz, who began working for Gaggle in January 2020, said that remote learning spurred an immediate shift in students' online behaviors. Under lockdown, students without computers at home began using school devices for personal conversations. Sifting through the everyday exchanges between students and their friends, Waskiewicz said, became a time suck and left her questioning her own principles. 

"I felt kind of bad because the kids didn't have the ability to have stuff of their own and I wondered if they realized that it was public," she said. "I just wonder if they realized that other eyes were seeing it other than them and their little friends."

Student activity monitoring software like Gaggle has become ubiquitous in U.S. schools, and 81% of teachers work in schools that use tools to track students鈥 computer activity, according to a recent survey by the nonprofit Center for Democracy and Technology. A majority of teachers said the benefits of using such tools, which can block obscene material and monitor students鈥 screens in real time, outweigh potential risks.

Likewise, students generally recognize that their online activities on school-issued devices are being observed, the survey found, and alter their behaviors as a result. More than half of student respondents said they don鈥檛 share their true thoughts or ideas online as a result of school surveillance and 80% said they were more careful about what they search online. 

A majority of parents reported that the benefits of keeping tabs on their children's activity exceeded the risks. Yet they may not have a full grasp of how programs like Gaggle work, including the heavy reliance on untrained contractors and weak privacy controls revealed by 社区黑料's reporting, said Elizabeth Laird, the group's director of equity in civic technology. 

"I don't know that the way this information is being handled actually would meet parents' expectations," Laird said. 

Another former contractor, who reached out to 社区黑料 to share his experiences with the company anonymously, became a Gaggle moderator at the height of the pandemic. As COVID-19 cases grew, he said he felt unsafe continuing his previous job as a caregiver for people with disabilities so he applied to Gaggle because it offered remote work. 

About a week after he submitted an application, Gaggle gave him a key to kids' private lives — including, most alarming to him, their nude selfies. Exposure to such content was traumatizing, the former moderator said, and while the job took a toll on his mental well-being, it didn't come with health insurance. 

"I went to a mental hospital in high school due to some hereditary mental health issues and seeing some of these kids going through similar things really broke my heart," said the former contractor, who shared his experiences on the condition of anonymity, saying he feared possible retaliation by the company. "It broke my heart that they had to go through these revelations about themselves in a context where they can't even go to school and get out of the house a little bit. They have to do everything from home — and they're being constantly monitored." 

In this screenshot, Gaggle explains its terms and conditions for contract content moderators. The screenshot, which was provided to 社区黑料 by a former contractor who asked to remain anonymous, has been redacted.

Gaggle employees are offered benefits, including health insurance, and can attend group therapy sessions twice per month, Hetherington said. Patterson acknowledged the job can take a toll on staff moderators, but sought to downplay its effects on contractors and said they're warned about exposure to disturbing content during the application process. He said using contractors allows Gaggle to offer the service at a price school districts can afford. 

"Quite honestly, we're dealing with school districts with very limited budgets," Patterson said. "There have to be some tradeoffs." 

The anonymous contractor said he wasn't as concerned about his own well-being as he was about the welfare of the students under the company's watch. The company, he said, lacked adequate safeguards to keep students' sensitive information from leaking outside the digital environment that Gaggle built for moderators to review such materials. Contract moderators work remotely with limited supervision or oversight, and he became especially concerned about how the company handled students' nude images, which are reported to school districts. Nudity and sexual content accounted for about 17% of emergency phone calls and email alerts to school officials last school year. 

Contractors, he said, could easily save the images for themselves or share them on the dark web. 

Patterson acknowledged the possibility but said he wasn't aware of any data breaches. 

"We do things in the interface to try to disable the ability to save those things," Patterson said, but "you know, human beings who want to get around things can."

'Made me feel like the day was worth it'

Vara Heyman was looking for a career change. After working jobs in retail and customer service, she made the pivot to content moderation and a contract position with Gaggle was her first foot in the door. She was left feeling baffled by the impersonal hiring process, especially given the high stakes for students. 

Waskiewicz had a similar experience. In fact, she said the only time she ever interacted with a Gaggle supervisor was when she was instructed to provide her bank account information for direct deposit. The interaction left her questioning whether the company that contracts with more than 1,500 school districts was legitimate or a scam. 

"It was a little weird when they were asking for the banking information, like 'Wait a minute, is this real or what?'" Waskiewicz said. "I Googled them and I think they're pretty big."

Heyman said that sense of disconnect continued after being hired, with communications between contractors and their supervisors limited to a Slack channel. 

Despite the challenges, several former moderators believe their efforts kept kids safe from harm. McElligott, the former Gaggle safety team employee, recalled an occasion when she found a student's suicide note. 

"Knowing I was able to help with that made me feel like the day was worth it," she said. "Hearing from the school employees that we were able to alert about self-harm or suicidal tendencies from a student they would never expect to be suffering was also very rewarding. It meant that extra attention should or could be given to the student in a time of need." 

Susan Enfield, the superintendent of Highline Public Schools in suburban Seattle, said her district's contract with Gaggle has saved lives. Earlier this year, for example, the company detected a student's suicide note early in the morning, allowing school officials to spring into action. The district uses Gaggle to keep kids safe, she said, but acknowledged it can be a disciplinary tool if students violate the district's code of conduct. 

"No tool is perfect, every organization has room to improve, I'm sure you could find plenty of my former employees here in Highline that would give you an earful about working here as well," said Enfield, one of 23 current or former superintendents from across the country whom Gaggle cited as references in its letter to Congress. 

"There's always going to be pros and cons to any organization, any service," Enfield told 社区黑料, "but our experience has been overwhelmingly positive."

True safety threats were infrequent, former moderators said, and most of the content was mundane, in part because the company's artificial intelligence lacked sophistication. They said the algorithm routinely flagged students' papers on the novels To Kill a Mockingbird and The Catcher in the Rye. They also reported being inundated with spam emailed to students, acting as human spam filters for a task that's long been automated in other contexts. 

Conor Scott, who worked as a contract moderator while in college, said that "99% of the time" Gaggle's algorithm flagged pedestrian materials, including pictures of sunsets and students' essays about World War II. Valid safety concerns, including references to violence and self-harm, were rare, Scott said. But he still believed the service had value and felt he was doing "the right thing."

McElligott said that managers' personal opinions added another layer of complexity. Though moderators were "held to strict rules of right and wrong decisions," she said they were ultimately "being judged against our managers' opinions of what is concerning and what is not." 

"I was told once that I was being overdramatic when it came to a potential inappropriate relationship between a child and adult," she said. "There was also an item that made me think of potential trafficking or child sexual abuse, as there were clear sexual plans to meet up — and when I alerted it, I was told it was not as serious as I thought." 

Patterson acknowledged that gray areas exist and that human discretion is a factor in deciding which materials are ultimately elevated to school leaders. But such materials, he said, are not the most urgent safety issues. He said the algorithm errs on the side of caution and flags harmless content because district leaders are "so concerned about students." 

The former moderator who spoke anonymously said he grew alarmed by the sheer volume of mundane student materials captured by Gaggle's surveillance dragnet, and the pressure to work quickly didn't leave enough time to evaluate long chat logs between students having "heartfelt and sensitive" conversations. On the other hand, run-of-the-mill chatter offered him a little wiggle room. 

"When I would see stuff like that I was like, 'Oh, thank God, I can just get this out of the way and heighten how many items per hour I'm getting,'" he said. "It's like, 'I hope I get more of those because then I can maybe spend a little more time actually paying attention to the ones that need it.'" 

Ultimately, he said he was unprepared for such extensive access to students' private lives. Because Gaggle's algorithm flags keywords like "gay" and "lesbian," for example, it alerted him to students exploring their sexuality online. Hetherington, the Gaggle spokeswoman, said such keywords are included in its dictionary to "ensure that these vulnerable students are not being harassed or suffering additional hardships," but critics have accused the company of subjecting LGBTQ students to disproportionate surveillance. 

"I thought it would just be stopping school shootings or reducing cyberbullying but no, I read the chat logs of kids coming out to their friends," the former moderator said. "I felt tremendous power was being put in my hands" to distinguish students' benign conversations from real danger, "and I was given that power immediately for $10 an hour." 

Minneapolis student Teeth Logsdon-Wallace, who posed for this photo with his dog Gilly, used a classroom assignment to discuss a previous suicide attempt and explained how his mental health had since improved. He became upset after Gaggle flagged his assignment. (Photo courtesy Alexis Logsdon)

A privacy issue

For years, student privacy advocates and civil rights groups have warned about the potential harms of Gaggle and similar surveillance companies. Fourteen-year-old Teeth Logsdon-Wallace, a Minneapolis high school student, fell under Gaggle's watchful eye during the pandemic. Last September, he used a class assignment to write about a previous suicide attempt and explained how music helped him cope after being hospitalized. Gaggle flagged the assignment to a school counselor, a move the teen called a privacy violation. 

He said it's "just really freaky" that moderators can review students' sensitive materials in public places like basketball games, but he ultimately felt bad for the contractors on Gaggle's content review team. 

"Not only is it violating the privacy rights of students, which is bad for our mental health, it's traumatizing these moderators, which is bad for their mental health," he said. Relying on low-wage workers with high turnover, limited training and no background in mental health, he said, can have consequences for students. 

"Bad labor conditions don't just affect the workers," he said. "It affects the people they say they are helping." 

Gaggle cannot prohibit contractors from reviewing students' private communications in public settings, Heather Durkac, the senior vice president of operations, said in a statement. 

"However, the contractors know the nature of the content they will be reviewing," Durkac said. "It is their responsibility and part of their presumed good and reasonable work ethic to not be conducting these content reviews in a public place." 

Gaggle's former contractors also weighed students' privacy rights. Heyman said she "went back and forth" on those implications for several days before applying for the job. She ultimately decided that Gaggle's monitoring was acceptable since it is limited to school-issued technology. 

"If you don't want your stuff looked at, you can use Hotmail, you can use Gmail, you can use Yahoo, you can use whatever else is out there," she said. "As long as they're being told and their parents are being told that their stuff is going to be monitored, I feel like that is OK." 

Logsdon-Wallace and his mother said they didn't know Gaggle existed until his classroom assignment got flagged to a school counselor. 

Meanwhile, the anonymous contractor said that chat conversations between students that got picked up by Gaggle's algorithm helped him understand the effects that surveillance can have on young people. 

"Sometimes a kid would use a curse word and another kid would be like, 'Dude, shut up, you know they're watching these things,'" he said. "These kids know that they're being looked in on," even if they don't realize their observer is a contractor working from the couch in his living room. "And to be the one that is doing that — that is basically fulfilling what these kids are paranoid about — it just felt awful." 

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.

Disclosure: Campbell Brown is the head of news partnerships at Facebook. Brown co-founded 社区黑料 and sits on its board of directors.

Senate Inquiry Warns About Harms of Digital School Surveillance Tools
/article/senate-inquiry-warns-about-harms-of-digital-school-surveillance-tools-calls-on-fcc-to-clarify-student-monitoring-rules/
Mon, 04 Apr 2022 21:37:00 +0000

Updated, April 5

Democratic Sens. Elizabeth Warren and Ed Markey are calling on the Federal Communications Commission to clarify how schools should monitor students' online activities, warning that educators' widespread use of digital surveillance tools could trample students' civil rights.

They also want the U.S. Education Department to start collecting data on the tools that could highlight whether they have disproportionate — and potentially harmful — effects on certain student groups. 

In October, the senators asked four education technology companies that keep tabs on the online activity of millions of students across the country — often 24 hours a day, seven days a week — to provide information on how they use artificial intelligence to glean students' information. 

Based on their responses, the senators said:

  • The companies' software may be misused to identify students who are violating school disciplinary rules. They cited a recent survey in which 43% of teachers reported their schools employ the monitoring systems for this purpose, potentially increasing contact between police and students and worsening the school-to-prison pipeline.
  • The companies have not attempted to determine whether their products disproportionately target students of color, who already face harsher and more frequent school discipline, or other vulnerable groups, like LGBTQ youth.
  • Schools, parents and communities are not being appropriately informed of the use — and potential misuse — of the data. Three of the four companies indicated they do not directly alert students and guardians of their surveillance.

Warren and Markey concluded there is a dire "need for federal action to protect students' civil rights, safety and privacy."

"While the intent of these products, many of which monitor students' online activity around the clock, may be to protect student safety, they raise significant privacy and equity concerns," the lawmakers wrote. "Studies have highlighted unintended but harmful consequences of student activity monitoring software that fall disproportionately on vulnerable populations."

An FCC spokesperson said the agency is reviewing the report, and an Education Department spokesperson said officials "look forward to corresponding with the senators" about its findings.

Lawmakers' inquiry into the business practices of school security companies Gaggle, GoGuardian, Securly and Bark Technologies is the first congressional investigation into student surveillance tools, whose use grew dramatically during the pandemic when learning shifted online.

It follows on the heels of investigative reporting by 社区黑料 into Gaggle, which uses artificial intelligence and a team of human content moderators to track the online behaviors of more than 5 million students. 社区黑料 used public records to expose how Gaggle's algorithm and its hourly-wage workers sift through billions of student communications each year in search of references to violence and self-harm, subjecting youth to constant digital surveillance with steep implications for their privacy. Gaggle, whose tools track students on their school-issued Google and Microsoft accounts, reported rapid growth during the pandemic.

Bark didn't respond to requests for comment. Securly spokesman Josh Mukai said in a statement that the company is reviewing the senators' March 30 report and looks forward "to continuing our dialogue with Senators Warren and Markey on the important topics they have raised."

"Parents expect that schools will keep children safe while in the classroom, on a field trip or while riding on a bus," GoGuardian spokesman Jeff Gordon said in a statement. "Schools also have a responsibility to keep students safe in digital spaces and on school-issued devices." 

Gaggle Founder and CEO Jeff Patterson submitted a statement after this article was published. He said the company is reviewing the lawmakers' recommendations "to assess how we can further strengthen our work to better protect students."

"We want to ensure our technology is effectively supporting student safety without creating unintended risks or harms," Patterson continued. "We have taken steps over the years to ensure effective privacy protections and mitigate bias in our platform, but welcome continued dialogue that will help make sure tools like Gaggle can continue to be used to support students and educators."

Bark Technologies CEO Brian Bason wrote in a letter to lawmakers that AI-driven technology could be used to solve the country's "terrible history of bias in school discipline" by removing the decisions of individual teachers and administrators.

"While any system, including AI-based solutions, inherently have some bias, if implemented correctly AI-based solutions can substantially reduce the bias that students face," Bason wrote.

As to the question of whether their surveillance exacerbates the school-to-prison pipeline, the companies' letters acknowledge that in certain cases they contact police to conduct welfare checks on students. Securly noted in its letter that in some instances, education leaders "prefer that we contact public safety agencies directly in lieu of a district contact."

Under the Clinton-era Children's Internet Protection Act, passed in 2000, public schools and libraries are required to filter and monitor students' internet use to ensure they don't access material "harmful to minors," such as pornography. Districts have cited the law to justify the adoption of AI-driven surveillance tools that have proliferated in recent years. Student privacy advocates argue the tools go far beyond the federal mandate and have called on the FCC to clarify the law's scope. Meanwhile, advocates have questioned whether schools' use of digital surveillance tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures.

In a recent survey by the nonprofit Center for Democracy and Technology, 81 percent of teachers said they used software to track students' computer activity, including to block obscene material or monitor their screens in real time. A majority of parents said they worried about student data getting shared with the police, and more than half of students said they decline to share their "true thoughts or ideas because I know what I do online is being monitored." 

Elizabeth Laird, the group's director of equity in civic technology, said it has been calling on student surveillance companies to be more transparent about their business practices, but it's "disappointing that it took a letter from Congress to get this information." She said she hopes the FCC and Education Department adopt lawmakers' recommendations.

"None of these companies have researched whether their products are biased against certain groups of students," she said in an email, questioning their justification for holding off on such an inquiry. "They cite privacy as the reason for not doing so while simultaneously monitoring students' messages, documents and sites visited 24 hours a day, seven days a week." 

社区黑料's investigation, which used data on Gaggle's foothold in Minneapolis Public Schools, was unable to determine whether the tool's algorithm disproportionately targeted Black students, who are more often subjected to student discipline than their white classmates. However, it highlighted instances in which keywords like "gay" and "lesbian" were flagged, potentially subjecting LGBTQ youth to heightened surveillance for discussing their sexual orientation. 

Amelia Vance, an attorney and student privacy expert, said she was intrigued that the companies pushed back on the idea that their tools are used to discipline students, since the federal monitoring requirement was meant to keep kids from consuming inappropriate content online, and students would likely face consequences for viewing violent or sexually explicit materials. She agreed the companies should research their algorithms for potential biases and would benefit from additional transparency. 

However, Vance said in an email that FCC clarification "would do little at best and may provide counterproductive guidance at worst." Many schools, she said, are likely to use the tools regardless of the federal rules. 

"Schools aren't required to monitor social media, and many have chosen to do so anyway," said Vance, the co-founder and president of Public Interest Privacy Consulting. Some school safety advocates are actively lobbying lawmakers to expand student monitoring requirements, she said. 

Asking the FCC to issue guidance "could actually be counterproductive to the goal of limiting monitoring and ensuring more privacy protections for students since it is possible that the FCC could require a higher level of monitoring."

Read the letters from Gaggle, GoGuardian, Securly and Bark Technologies: 

Gaggle Surveils Millions of Kids in the Name of Safety. Targeted Families Argue it's 'Not That Smart'
/article/gaggle-surveillance-minnesapolis-families-not-smart-ai-monitoring/
Tue, 12 Oct 2021 11:15:00 +0000

In the midst of a pandemic and a national uprising, Teeth Logsdon-Wallace was kept awake at night last summer by the constant sounds of helicopters and sirens. 

For the 13-year-old from Minneapolis who lives close to where George Floyd was murdered in May 2020, the pandemic-induced isolation and social unrest amplified his transgender dysphoria, emotional distress that occurs when someone's gender identity differs from their sex assigned at birth. His billowing depression landed him in the hospital after an attempt to die by suicide. During that dark stretch, he spent his days in an outpatient psychiatric facility, where therapists embraced music therapy. There, he listened to a punk song on loop that promised things would get better. 

Eventually they did. 




Logsdon-Wallace, a transgender eighth-grader who chose the name Teeth, has since "graduated" from weekly therapy sessions and has found a better headspace, but that didn't stop school officials from springing into action after he wrote about his mental health. In a school assignment last month, he reflected on his suicide attempt and how the punk rock anthem by the band Ramshackle Glory helped him cope — intimate details that wound up in the hands of district security. 

In a classroom assignment last month, Minneapolis student Teeth Logsdon-Wallace explained how the Ramshackle Glory song "Your Heart is a Muscle the Size of Your Fist" helped him cope after an attempt to die by suicide. In the assignment, which was flagged by the student surveillance company Gaggle, Logsdon-Wallace wrote that the song was "a reminder to keep on loving, keep on fighting and hold on for your life." (Photo courtesy Teeth Logsdon-Wallace)

The classroom assignment was one of thousands of Minneapolis student communications that got flagged by Gaggle, a digital surveillance company that saw rapid growth after the pandemic forced schools into remote learning. In an earlier investigation, 社区黑料 analyzed nearly 1,300 public records from Minneapolis Public Schools to expose how Gaggle subjects students to relentless digital surveillance 24 hours a day, seven days a week, raising significant privacy concerns for more than 5 million young people across the country who are monitored by the company's digital algorithm and human content moderators. 

But technology experts and families with first-hand experience with Gaggle's surveillance dragnet have raised a separate issue: The service is not only invasive, it may also be ineffective. 

While the system flagged Logsdon-Wallace for referencing the word "suicide," context was never part of the equation, he said. Two days later, in mid-September, a school counselor called his mom to let her know what officials had learned. The meaning of the classroom assignment — that his mental health had improved — was seemingly lost in the transaction between Gaggle and the school district. He felt betrayed. 

"I was trying to be vulnerable with this teacher and be like, 'Hey, here's a thing that's important to me because you asked,'" Logsdon-Wallace said. "Now, when I've made it clear that I'm a lot better, the school is contacting my counselor and is freaking out."

Jeff Patterson, Gaggle's founder and CEO, said in a statement his company does not "make a judgement on that level of the context," and while some districts have requested to be notified about references to previous suicide attempts, it's ultimately up to administrators to "decide the proper response, if any."  

'A crisis on our hands'

Minneapolis Public Schools first contracted with Gaggle in the spring of 2020, as the pandemic forced students nationwide into remote learning. Through AI and the content moderator team, Gaggle tracks students' online behavior every day by analyzing materials on their school-issued Google and Microsoft accounts. The tool scans students' emails, chat messages and other documents, including class assignments and personal files, in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. The remote moderators evaluate flagged materials and notify school officials about content they find troubling. 

In Minneapolis, Gaggle flagged students for keywords related to pornography, suicide and violence, according to six months of incident reports obtained by 社区黑料 through a public records request. The private company also captured their journal entries, fictional stories and classroom assignments. 

Gaggle executives maintain that the system saves lives, pointing to students it flagged during the 2020-21 school year. Those figures have not been independently verified. Minneapolis school officials make similar assertions. Though the pandemic's effects on suicide rates remain fuzzy, suicide has been a leading cause of death among teenagers for years. Patterson, who has watched his business grow during COVID-19, said Gaggle could be part of the solution. Though not part of its contract with Minneapolis schools, the company recently launched a service that connects students flagged by the monitoring tool with teletherapists. 

"Before the pandemic, we had a crisis on our hands," he said. "I believe there's a tsunami of youth suicide headed our way that we are not prepared for." 

Schools nationwide have increasingly relied on technological tools that purport to keep kids safe, yet there's little evidence to back up their claims.

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Like many parents, Logsdon-Wallace's mother Alexis Logsdon didn't know Gaggle existed until she got the call from his school counselor. Luckily, the counselor recognized that Logsdon-Wallace was discussing events from the past and offered a measured response. His mother was still left baffled. 

"That was an example of somebody describing really good coping mechanisms, you know, 'I have music that is one of my soothing activities that helps me through a really hard mental health time,'" she said. "But that doesn't matter because, obviously, this software is not that smart — it's just like 'Woop, we saw the word.'" 

'Random and capricious'

Many students have accepted digital surveillance as an inevitable reality at school, according to a new survey by the Center for Democracy and Technology in Washington, D.C. But some youth are fighting back, including Lucy Dockter, a 16-year-old junior from Westport, Connecticut. On multiple occasions over the last several years, Gaggle has flagged her communications — an experience she described as "really scary."

"If it works, it could be extremely beneficial. But if it's random, it's completely useless."
Lucy Dockter, 16, Westport, Connecticut student mistakenly flagged by Gaggle

On one occasion, Gaggle sent her an email notification of "Inappropriate Use" while she was walking to her first high school biology midterm, and her heart began to race as she worried about what she had done wrong. Dockter is an editor of her high school's literary journal and, according to her, Gaggle had ultimately flagged profanity in students' fictional article submissions. 

"The link at the bottom of this email is for something that was identified as inappropriate," Gaggle warned in its email while pointing to one of the fictional articles. "Please refrain from storing or sharing inappropriate content in your files." 

Gaggle emailed a warning to Connecticut student Lucy Dockter for profanity in a literary journal article. (Photo courtesy Lucy Dockter)

But Gaggle doesn't catch everything. Even as she got flagged when students shared documents with her, the articles' authors weren't receiving similar alerts, she said. Nor did Gaggle's AI pick up on the article she wrote about the discrepancy, in which she included a four-letter swear word to make a point. In the article, which Dockter wrote with Google Docs, she argued that Gaggle's monitoring system is "random and capricious," and could be dangerous if school officials rely on its findings to protect students. 

Her experiences left the Connecticut teen questioning whether such tracking is even helpful. 

"With such a seemingly random service that doesn't seem to — in the end — have an impact on improving student health or actually taking action to prevent suicide and threats," she said in an interview. "If it works, it could be extremely beneficial. But if it's random, it's completely useless."

Lucy Dockter

Some schools have asked Gaggle to email students about the use of profanity, but Patterson said the system has an error that he blamed on the tech giant Google, which at times "does not properly indicate the author of a document and assigns a random collaborator."

"We are hoping Google will improve this functionality so we can better protect students," Patterson said. 

Back in Minneapolis, attorney Cate Long said she became upset when she learned that Gaggle was monitoring her daughter on her personal laptop, which 10-year-old Emmeleia used for remote learning. She grew angrier when she learned the district didn't notify her that Gaggle had identified a threat.

This spring, a classmate used Google Hangouts, the chat feature, to send Emmeleia a death threat, warning she'd shoot her "puny little brain with my grandpa's rifle."

Minneapolis mother Cate Long said a student used Google Hangouts to send a death threat to her 10-year-old daughter Emmeleia. Officials never informed her about whether Gaggle had flagged the threat. (Photo courtesy Cate Long)

When Long learned about the chat, she notified her daughter's teacher but was never informed about whether Gaggle had picked up on the disturbing message as well. Missing warning signs could be detrimental to both students and school leaders; districts can face litigation if they fail to act on credible threats.

"I didn't hear a word from Gaggle about it," she said. "If I hadn't brought it to the teacher's attention, I don't think that anything would have been done." 

The incident, which occurred in April, fell outside the six-month period for which The 74 obtained records. A Gaggle spokesperson said the company picked up on the threat and notified district officials an hour and a half later but it "does not have any insight into the steps the district took to address this particular matter." 

Julie Schultz Brown, the Minneapolis district spokeswoman, said that officials "would never discuss with a community member any communication flagged by Gaggle." 

"That unrelated but concerned parent would not have been provided that information nor should she have been," she wrote in an email. "That is private." 

Cate Long poses with her 10-year-old daughter Emmeleia. (Photo courtesy Cate Long)

'The big scary algorithm'

When identifying potential trouble, Gaggle's algorithm relies on keyword matching that compares student communications against a dictionary of thousands of words the company believes could indicate potential issues. The company scans student emails before they're delivered to their intended recipients, said Patterson, the CEO. Files within Google Drive, including Docs and Sheets, are scanned as students write in them, he said. In one instance, the technology led to the arrest of a 35-year-old Michigan man who tried to send pornography to an 11-year-old girl in New York. Gaggle prevented the file from ever reaching its intended recipient.  

Though the company allows school districts to alter the keyword dictionary to reflect local contexts, less than 5 percent of districts customize the filter, Patterson said. 
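Gaggle's matching logic is proprietary, but a dictionary-based keyword scan of the kind described above can be sketched roughly as follows. Every name, category and keyword here is invented for illustration; this is not Gaggle's actual implementation.

```python
# Illustrative sketch of dictionary-based keyword flagging, as described
# in the article. The dictionary contents and categories are hypothetical.
import re

KEYWORD_DICTIONARY = {
    "self_harm": ["want to die", "end it all", "cutting", "feeling depressed"],
    "violence": ["bomb", "glock", "going to fight", "beat her"],
}

def flag_document(text: str) -> list[tuple[str, str]]:
    """Return (category, keyword) pairs for every dictionary hit."""
    hits = []
    lowered = text.lower()
    for category, keywords in KEYWORD_DICTIONARY.items():
        for keyword in keywords:
            # Word-boundary match so "glock" doesn't fire inside a longer word
            if re.search(r"\b" + re.escape(keyword) + r"\b", lowered):
                hits.append((category, keyword))
    return hits

# A harmless sentence still fires, illustrating the false-positive problem
print(flag_document("In English class we read a poem about cutting paper."))
# → [('self_harm', 'cutting')]
```

As the critics quoted below note, matching on isolated keywords with no sense of context is exactly what produces false positives like the one in the example.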

That's where potential problems could begin, said Sara Jordan, an expert on artificial intelligence and senior researcher at the Future of Privacy Forum in Washington. For example, language that students use to express suicidal ideation could vary between Manhattan and rural Appalachia, she said.

"We're using the big scary algorithm term here when I don't think it applies. This is not Netflix's recommendation engine. This is not Spotify."
Sara Jordan, AI expert and senior researcher, Future of Privacy Forum

Sara Jordan

On the other hand, she noted that false positives are highly likely, especially when the system flags common swear words and fails to understand context. 

"You're going to get 25,000 emails saying that a student dropped an F-bomb in a chat," she said. "What's the utility of that? That seems pretty low." 

She said that Gaggle's utility could be impaired because it doesn't adjust to students' behaviors over time, comparing it to Netflix, which recommends television shows based on users' ever-evolving viewing patterns. "Something that doesn't learn isn't going to be accurate," she said. For example, she said the program could be more useful if it learned to ignore the profane but harmless literary journal entries submitted to Dockter, the Connecticut student. Gaggle's marketing materials appear to overhype the tool's sophistication to schools, she said. 

"We're using the big scary algorithm term here when I don't think it applies," she said. "This is not Netflix's recommendation engine. This is not Spotify. This is not American Airlines serving you specific forms of flights based on your previous searches and your location." 

"Artificial intelligence without human intelligence ain't that smart."
Jeff Patterson, Gaggle founder and CEO

Patterson said Gaggle's proprietary algorithm is updated regularly "to adjust to student behaviors over time and improve accuracy and speed." The tool monitors "thousands of keywords, including misspellings, slang words, evolving trends and terminologies, all informed by insights gleaned over two decades of doing this work." 

Ultimately, the algorithm to identify keywords is used to "narrow down the haystack as much as possible," Patterson said, and Gaggle content moderators review materials to gauge their risk levels. 

"Artificial intelligence without human intelligence ain't that smart," he said. 

In Minneapolis, officials denied that Gaggle infringes on students' privacy and noted that the tool only operates within school-issued accounts. The district's internet use policy states that students should "expect only limited privacy," and that the misuse of school equipment could result in discipline and "civil or criminal liability." District leaders have also cited compliance with the Clinton-era Children's Internet Protection Act, which became law in 2000 and requires schools to monitor "the online activities of minors." 

Patterson suggested that teachers aren't paying close enough attention to keep students safe on their own and "sometimes they forget that they're mandated reporters." On the company's website, Patterson says he launched the company in 1999 to provide teachers with "an easy way to watch over their gaggle of students." Legally, teachers are mandated to report suspected abuse and neglect, but Patterson broadens their sphere of responsibility and his company's role in meeting it. As technology becomes a key facet of American education, Patterson said that schools "have a moral obligation to protect the kids on their digital playground." 

But Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, argued the federal law was never intended to mandate student "tracking" through artificial intelligence. In fact, the statute includes a disclaimer stating it shouldn't be "construed to require the tracking of internet use by any identifiable minor or adult user." Her group has urged the government to clarify the Children's Internet Protection Act's requirements and distinguish monitoring from tracking individual student behaviors. 

Sen. Elizabeth Warren, a Democrat from Massachusetts, agrees. In recent letters to Gaggle and other education technology companies, Warren and other Democratic lawmakers said they're concerned the tools "may extend beyond" the law's intent "to surveil student activity or reinforce biases." Around-the-clock surveillance, they wrote, demonstrates "a clear invasion of student privacy, particularly when students and families are unable to opt out." 

"Escalations and mischaracterizations of crises may have long-lasting and harmful effects on students' mental health due to stigmatization and differential treatment following even a false report," the senators wrote. "Flagging students as 'high-risk' may put them at risk of biased treatment from physicians and educators in the future. In other extreme cases, these tools can become analogous to predictive policing, which are notoriously biased against communities of color."

A new kind of policing

Shortly after the school district piloted Gaggle for distance learning, education leaders were met with an awkward dilemma. George Floyd's murder at the hands of a Minneapolis police officer prompted Minneapolis Public Schools to sever its ties with the police department for school-based officers and replace them with district security officers who lack the authority to make arrests. Gaggle flags district security when it identifies student communications the company believes could be harmful. 

Some critics have compared the surveillance tool to a new form of policing that, beyond broad efficacy concerns, could have a disparate impact on students of color, similar to traditional policing. Algorithmic tools of this kind have been shown to suffer biases. 

Matt Shaver, who taught at a Minneapolis elementary school during the pandemic but no longer works for the district, said he was concerned that bias could be baked into Gaggle's algorithm. Absent adequate context or nuance, he worried the tool could lead to misunderstandings. 

Data obtained by The 74 offer a limited window into Gaggle's potential effects on different student populations. Though the district withheld many details in the nearly 1,300 incident reports, just over 100 identified the campuses where the involved students attended school. An analysis of those reports failed to identify racial discrepancies. Specifically, Gaggle was about as likely to issue incident reports in schools where children of color were the majority as it was at campuses where most children were white. It remains possible that students of color in predominantly white schools may have been disproportionately flagged by Gaggle or faced disproportionate punishment once identified. Broadly speaking, Black students are far more likely to be suspended or arrested at school than their white classmates, according to federal education data. 

Gaggle and Minneapolis district leaders acknowledged that students' digital communications are forwarded to police in rare circumstances. The Minneapolis district's internet use policy explains that educators could contact the police if students use technology to break the law, and a document given to teachers about the district's Gaggle contract further highlights the possibility of law enforcement involvement. 

Jason Matlock, the Minneapolis district's director of emergency management, safety and security, said that law enforcement is not a "regular partner" when responding to incidents flagged by Gaggle. It doesn't deploy Gaggle to get kids into trouble, he said, but to get them help. He said the district has interacted with law enforcement about student materials flagged by Gaggle on several occasions, but only in cases related to child pornography. Such cases, he said, often involve students sharing explicit photographs of themselves. During a six-month period from March to September 2020, Gaggle flagged Minneapolis students more than 120 times for incidents related to child pornography, according to records obtained by The 74.

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

"Even if a kid has put out an image of themselves, no one is trying to track them down to charge them or to do anything negative to them," Matlock said, though it's unclear if any students have faced legal consequences. "It's the question as to why they're doing it," and to raise the issue with their parents.

Gaggle's keywords could also have a disproportionate impact on LGBTQ children. In three-dozen incident reports, Gaggle flagged keywords related to sexual orientation including "gay" and "lesbian." On at least one occasion, school officials outed an LGBTQ student to their parents, according to the records.

Logsdon-Wallace, the 13-year-old student, called the incident "disgusting and horribly messed up." 

"They have gay flagged to stop people from looking at porn, but one, that is going to be mostly targeting people who are looking for gay porn and two, it's going to be false-positive because they are acting as if the word gay is inherently sexual," he said. "When people are just talking about being gay, anything they're writing would be flagged." 

The service could also have a heavier presence in the lives of low-income families, he added, who may end up being more surveilled than their affluent peers. Logsdon-Wallace said he knows students who rely on school devices for personal uses because they lack technology of their own. Among the 1,300 Minneapolis incidents contained in The 74's data, only about a quarter were reported to district officials on school days between 8 a.m. and 4 p.m.

"That's definitely really messed up, especially when the school is like 'Oh no, no, no, please keep these Chromebooks over the summer,'" an invitation that gave students "the go-ahead to use them" for personal reasons, he said.

"Especially when it's during a pandemic when you can't really go anywhere and the only way to talk to your friends is through the internet."

An Inside Look at Spy Tech Used on Students During Remote Classes – and Beyond
/article/gaggle-spy-tech-minneapolis-students-remote-learning/ Tue, 14 Sep 2021 10:30:00 +0000

A week after the pandemic forced Minneapolis students to attend classes online, the city school district's top security chief got an urgent email, its subject line in all caps, alerting him to potential trouble. Just 12 seconds later, he got a second ping. And two minutes after that, a third.

In each instance, the emails warning Jason Matlock of "QUESTIONABLE CONTENT" pointed to a single culprit: Kids were watching cartoon porn.

Over the next six months, Matlock got nearly 1,300 similar emails from Gaggle, a surveillance company that monitors students' school-issued Google and Microsoft accounts. Through artificial intelligence and a team of content moderators, Gaggle tracks the online behaviors of millions of students across the U.S. every day. The sheer volume of reports was overwhelming at first, Matlock acknowledged, and many incidents were utterly harmless. About 100 were related to animated pornography and, on one occasion, a member of Gaggle's remote surveillance team flagged a fictional story that referenced "underwear."

Hundreds of others, however, suggested imminent danger.

In emails and chat messages, students discussed violent impulses, eating disorders, abuse at home, bouts of depression and, as one student put it, "ending my life." At a moment of heightened social isolation and elevated concern over students' mental health, references to self-harm stood out, accounting for nearly a third of incident reports over a six-month period. In a document titled "My Educational Autobiography," students at Roosevelt High School on the south side of Minneapolis discussed bullying, drug overdoses and suicide. "Kill me," one student wrote in a document titled "goodbye."

Nearly a year after The 74 submitted public records requests to understand the Minneapolis district's use of Gaggle during the pandemic, a trove of documents offers an unprecedented look into how one school system deploys a controversial security tool that grew rapidly during COVID-19, but carries significant civil rights and privacy implications.

The data, gleaned from those 1,300 incident reports in the first six months of the crisis, highlight how Gaggle's team of content moderators subject children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night and over the summer. In fact, only about a quarter of incidents were reported to district officials on school days between 8 a.m. and 4 p.m., bringing into sharp relief how the service extends schools' authority far beyond their traditional powers to regulate student speech and behavior, including at home.
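That time-of-day finding comes from classifying incident timestamps against school hours. A minimal sketch of how such a tally could be computed; the timestamps below are invented for illustration, since the full incident reports are not public:

```python
# Sketch: tallying what share of flagged incidents fell within school hours
# (weekdays, 8 a.m.-4 p.m.). Example timestamps are hypothetical.
from datetime import datetime

def during_school_hours(ts: datetime) -> bool:
    # Monday=0 .. Friday=4; hour in [8, 16)
    return ts.weekday() < 5 and 8 <= ts.hour < 16

incidents = [
    datetime(2020, 4, 6, 10, 30),   # Monday morning -> school hours
    datetime(2020, 4, 6, 22, 15),   # Monday night
    datetime(2020, 4, 11, 14, 0),   # Saturday afternoon
    datetime(2020, 7, 4, 9, 45),    # summer Saturday
]
in_hours = sum(during_school_hours(ts) for ts in incidents)
print(f"{in_hours}/{len(incidents)} incidents during school hours")
# → 1/4 incidents during school hours
```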

Now, as COVID-era restrictions subside and Minneapolis students return to in-person learning this fall, a tool that was pitched as a remote learning necessity isn't going away anytime soon. Minneapolis officials reacted swiftly when the pandemic engulfed the nation and forced students to learn from the confines of their bedrooms, paying more than $355,000, including nearly $64,000 in federal emergency relief money, to partner with Gaggle until 2023. Faced with a public health emergency, the district circumvented normal procurement rules, a reality that prevented concerned parents from raising objections until after it was too late.

A mental health dilemma

With each alert, Matlock and other district officials were given a vivid look into students' most intimate thoughts and online behaviors, raising significant privacy concerns. It's unclear, however, if any of them made kids safer. Independent research on the efficacy of Gaggle and similar services is scarce.

When students' mental health comes into play, a complicated equation emerges. In recent years, schools have ramped up efforts to identify and provide interventions to children at risk of harming themselves or others. Gaggle executives see their tool as key to identifying youth who are lamenting hardships or discussing violent plans. On average, Gaggle notifies school officials within 17 minutes after zeroing in on student content related to suicide and self-harm, according to the company, and officials claim they saved more than 1,400 lives during the 2020-21 school year.

Jeff Patterson

"As a parent you have no idea what's going on in your kid's head, but if you don't know you can't help them," said Jeff Patterson, Gaggle's founder and CEO. "And I would always want to err on trying to identify kids who need help."

Critics, however, have questioned Gaggle's effectiveness and worry that rummaging through students' personal files and conversations, and in some cases outing students for exhibiting signs of mental health issues including depression, could backfire.

Using surveillance to identify children in distress could exacerbate feelings of stigma and shame and could ultimately make students less likely to ask for help, said Jennifer Mathis, the director of policy and legal advocacy at the Bazelon Center for Mental Health Law in Washington, D.C.

"Most kids in that situation are not going to share anything anymore and are going to suffer for that," she said. "It suggests that anything you write or say or do in school, or out of school, may be found and held against you and used in ways that you had not envisioned."

Minneapolis parent Holly Kragthorpe-Shirley had a similar concern and questioned whether kids "actually have a safe space to raise some of their issues in a safe way" if they're stifled by surveillance.

In Minneapolis, for instance, Gaggle flagged the keywords "feel depressed" in a document titled "SEL Journal," a reference to social-emotional learning. In another instance, Gaggle flagged "suicidal" in a document titled "mental health problems workbook."

District officials acknowledged that Gaggle had captured student assignments and other personal files, an issue that civil rights groups have long been warning about. The documents obtained by The 74 put hard evidence behind those concerns, said Amelia Vance, a director at The Future of Privacy Forum, a Washington-based think tank.

Amelia Vance

"The hypotheticals we've been talking about for a few years have come to fruition," she said. "It is highly likely to undercut the trust of students not only in their school generally but in their teacher, in their counselor, in the mental health problems workbook." 

Patterson shook off any privacy reservations, including those related to monitoring sensitive materials like journal entries, which he characterized as "cries for help."

"Sometimes when we intervene we might cause some challenges, but more often than not the kids want to be helped," he said. Though Gaggle only monitors student files tied to school accounts, he cited a middle school girl's private journal in a success story. He said the girl wrote in a digital journal that she suffered with self-esteem issues and guilt after getting raped.

"No one in her life knew about this incident and because she journaled about it," Gaggle was able to notify school officials about what they'd learned, he said. "They were able to intervene and get this girl help for things that she couldn't have dealt with on her own."

'Needles in haystacks'

Tools like Gaggle have become ubiquitous in classrooms across the country, according to forthcoming research by the D.C.-based Center for Democracy and Technology. In a recent survey, 81 percent of teachers reported having such software in place in their schools. Though most students said they're comfortable being monitored, 58 percent said they don't share their "true thoughts or ideas" as a result and 80 percent said they're more careful about what they search online.

Such data suggest that youth are being primed to accept surveillance as an inevitable reality, said Elizabeth Laird, the center's director of equity in civic technology. In return, she said, they're giving up the ability to explore new ideas and learn from mistakes.

Gaggle, in business since 1999 and recently relocated to Dallas, monitors the digital files of more than 5 million students across the country each year, and the pandemic has been very good for its bottom line. Since the onset of the crisis, the number of students surveilled by the privately held company, which does not report its yearly revenue, has grown. Through artificial intelligence, Gaggle scans students' emails, chat messages and other materials uploaded to students' Google or Microsoft accounts in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. Moderators evaluate flagged material and notify school officials about content they find troubling, a bar that Matlock acknowledged is quite low as "the system is always going to err on the side of caution" and requires district administrators to evaluate materials' context.

"We're looking for needles in haystacks to basically save kids."
Jeff Patterson, founder and CEO of Gaggle, which analyzed more than 10 billion online student communications in the 2020-21 school year

In Minneapolis, Gaggle officials discovered a majority of offenses in files within students' Google Drive, including in word documents and spreadsheets. More than half of incidents originated on the Drive, while 22 percent originated in emails and 23 percent came from Google Hangouts, the chat feature.

School officials are alerted to only a tiny fraction of student communications caught up in Gaggle's dragnet. Last school year, Gaggle collected more than 10 billion items nationally but just 360,000 incidents resulted in notifications to district officials, according to the company. Nationally, 41 percent of incidents during the 2020-21 school year related to suicide and self-harm, according to Gaggle, and a quarter centered on violence.

"We are looking for needles in haystacks to basically save kids," Patterson said.
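Taken at face value, the company's own figures make the haystack point concrete: the fraction of scanned items that ever reaches a district is tiny. A back-of-the-envelope calculation using only the numbers reported above:

```python
# Back-of-the-envelope math using Gaggle's reported 2020-21 national figures.
items_scanned = 10_000_000_000  # "more than 10 billion items"
notifications = 360_000         # incidents reported to districts

rate = notifications / items_scanned
print(f"escalation rate: {rate:.4%}")  # → escalation rate: 0.0036%

# Reported national breakdown of those incidents
suicide_self_harm = round(0.41 * notifications)  # 147,600 incidents
violence = round(0.25 * notifications)           # 90,000 incidents
```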

'A really slippery slope'

It was Google Hangouts that had Matt Shaver on edge. When the pandemic hit, classrooms were replaced by video conferences, and casual student interactions in hallways and cafeterias were relegated to Hangouts. For Shaver, who taught at a Minneapolis elementary school during the pandemic, students' Hangouts use became overwhelming.

Students were so busy chatting with each other, he said, that many had lost focus on classroom instruction. So he proposed a blunt solution to district technology officials: Shut it down.

"The thing I wanted was 'Take the temptation away, take the opportunity away for them to use that,'" said Shaver, who has since left teaching and is now policy director at the education reform group EdAllies. "And I actually got pushback from IT saying 'No we're not going to do that, this is a good social aspect that we're trying to replicate.'"

But unlike those hallway interactions, nobody was watching. Matlock, the district's security head, said he was initially in the market for a new anonymous reporting tool, which allows students to flag their friends for behaviors they find troubling. He turned to Gaggle, which operates the anonymous reporting system SpeakUp for Safety, and saw the company's AI-powered digital surveillance tool, which goes well beyond SpeakUp's powers to ferret out potentially alarming student behavior, as a way to "enhance the supports for students online."

"We wanted to get something in place quickly, as we were moving quickly with the lockdown," he said, adding that going through traditional procurement hoops could take months. "Gaggle had a strong national presence and a reputation."

The district signed an initial six-month, $99,603 contract with Gaggle just a week after the virus shuttered schools in Minneapolis. Board of Education Chair Kim Ellison signed a second, three-year contract at an annual rate of $255,750 in September 2020.

The move came with steep consequences. Though SpeakUp was used just three times during the six-month window included in The 74's data, Gaggle's surveillance tool flagged students nearly 1,300 times.

During that time, which coincided with the switch to remote learning, the largest share of incidents, 38 percent, were pornographic or sexual in nature, including references to "sexual activity involving a student," professional videos and explicit, student-produced selfies, which trigger alerts to the National Center for Missing and Exploited Children.

"I'm trying to imagine finding out about this as a high schooler, that every single word I've written on a Google Hangout or whatever is being monitored. We live in a country with laws around unreasonable search and seizure, and surveillance is just a really slippery slope."
Matt Shaver, former Minneapolis Public Schools teacher

An additional 30 percent were related to suicide and self-harm, including incidents that were triggered by keywords including "cutting," "feeling depressed," "want to die," and "end it all." Another 18 percent were related to violence, including threats, physical altercations, references to weapons and suspected child abuse. Such incidents were triggered by keywords including "bomb," "Glock," "going to fight," and "beat her." About a fifth of incidents were triggered by profanity.

Concerns over Gaggle's reach during the pandemic weren't limited to Minneapolis. In December 2020, a group of civil rights organizations including the American Civil Liberties Union of Northern California argued that by using Gaggle, the Fresno Unified School District had violated the California Electronic Communications Privacy Act, which requires officials to obtain search warrants before accessing electronic information. Such monitoring, the groups contend, infringes on students' free-speech and privacy rights with little ability to opt out.

Shaver, whose students used Google Hangouts to the point of it becoming a distraction, was alarmed to learn that those communications were being analyzed by artificial intelligence and pored over by a remote team of people he didn't even know.

"I'm trying to imagine finding out about this as a high schooler, that every single word I've written on a Google Hangout or whatever is being monitored," he said. "There is, of course, some lesson in this, obviously like, 'Be careful of what you put online.' But we live in a country with laws around unreasonable search and seizure, and surveillance is just a really slippery slope."

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

The potential to save lives

To Matlock, Gaggle is a lifesaver, literally. When the tool flagged a Minneapolis student's suicide note in the middle of the night, Matlock said he rushed to intervene. In a late-night phone call, the security chief said he warned the unnamed parents, who knew their child was struggling but didn't fully recognize how bad things had become. Because of Gaggle, school officials were able to get the student help. To Matlock, the possibility that he saved a student's life offers a feeling he "can't even measure in words."

"If it saved one kid, if it supported one caregiver, if it supported one family, I'll take it," he said. "That's the bottom line."

Despite heightened concern over youth mental health issues during the pandemic, its effect on youth suicide rates remains fuzzy. Preliminary data from the Minnesota health department show a decline: Between 2019 and 2020, suicides among people 24 years old and younger decreased by more than 20 percent statewide. Nationally, the share of emergency room visits for suspected suicide attempts among adolescents has surged during the pandemic, according to the Centers for Disease Control and Prevention, but preliminary counts for people of all ages show a 5.6 percent decline in self-inflicted fatalities in 2020 compared to 2019.

Meanwhile, Gaggle reported that it identified a significant increase in threats related to suicide, self-harm and violence nationwide between March 2020 and March 2021. During that period, Gaggle observed a 31 percent increase in flagged content overall, including a 35 percent increase in materials related to suicide and self-harm. Gaggle officials said the data highlight a mental health crisis among youth during the pandemic. But other factors could be at play. Among them is students' increased reliance on school-issued devices during remote learning, creating additional opportunities for Gaggle to tag youth behavior. Meanwhile, the number of students monitored by Gaggle nationally grew markedly during the pandemic.

But that hasn't stopped Gaggle from touting those numbers as it markets a new service: Gaggle Therapy. In school districts that sign up for the service, students who are flagged by Gaggle's digital monitoring tool are matched with counselors for weekly teletherapy sessions. Therapists available through the service are independent contractors for Gaggle, and districts can either pay Gaggle for "blanket coverage," which makes all students eligible, or a "retainer" fee, which allows them to "use the service as you need it." Under the second scenario, Gaggle would have a financial incentive to identify more students in need of teletherapy.

In Minneapolis, Matlock said that school-based social workers and counselors lead intervention efforts when students are identified for materials related to self-harm. "The initial moment may be a shock" when students are confronted by school staff about their online behaviors, he said, but providing them with help "is much better in the long run."

A presentation sent to Minneapolis teachers explains how the district responds after Gaggle flags a "possible student situation" that officials say presents an imminent threat. (Photo obtained by The 74)

As the district rolled out the service, many parents and students were out of the loop. Among them was Nathaniel Genene, a recent graduate who served as the Minneapolis school board's student representative at the time. He said that classmates contacted him after initial news of the Gaggle contract was released.

"I had a couple of friends texting me like 'Nathaniel, is this true?'" he said. "It was kind of interesting because I had no idea it was even a thing."

Yet as students gained a greater awareness that their communications were being monitored, Matlock said they began to test Gaggle's parameters using potential keywords "and then say 'Hi' to us while they put it in there."

As students became conditioned to Gaggle, "the shock is probably a little bit less," said Rochelle Cox, an associate superintendent at the Minneapolis school district. Now, she said, students have an outlet to get help without having to explicitly ask. Instead, they can express their concerns online with an understanding that school officials are listening. As a result, school-based mental health professionals are able to provide the care students need, she said.

Mathis, with the Bazelon Center for Mental Health Law, called that argument "ridiculous." Officials should make sure that students know about available mental health services and ensure that they feel comfortable reaching out for help, she said.

"That's very different than deciding that we're going to catch people by having them write into the ether and that's how we're going to find the students who need help," she said. "We can be a lot more direct in communicating than that, and we should be a lot more direct and a lot more positive."

In fact, subjecting students to surveillance could push them further into isolation and condition them to lie when officials reach out to inquire about their digital communications, argued Vance of the Future of Privacy Forum.

"Effective interventions are rarely going to be built on that, you know, 'I saw what you were typing into a Google search last night' or 'writing a journal entry for your English class,'" Vance said. "That doesn't feel like it builds a trusting relationship. It feels creepy."
