A boy wrote about his suicide attempt. He didn’t realize his school’s Gaggle software was watching

In the midst of a pandemic and a national uprising, Teeth Logsdon-Wallace was kept awake at night last summer by the constant sounds of helicopters and sirens.

For the 13-year-old from Minneapolis, who lives close to where George Floyd was murdered in May 2020, the pandemic-induced isolation and social unrest amplified the emotional distress he was experiencing as a result of gender dysphoria. His billowing depression landed him in the hospital after he tried to kill himself. During that dark stretch, he spent his days in an outpatient psychiatric facility, where he listened to a punk song on loop that promised things would soon “get better”. Eventually they did.

Logsdon-Wallace, a transgender eighth-grader, has since “graduated” from weekly therapy sessions and is doing better, but that didn’t stop school officials from springing into action after he wrote about his mental health. In a school assignment last month, he reflected on his suicide attempt and how the anthem by the band Ramshackle Glory helped him cope – intimate details that wound up in the hands of district security.

In a classroom assignment, Teeth Logsdon-Wallace explained how a Ramshackle Glory song helped him cope after he tried to kill himself. The assignment was flagged by the student surveillance company Gaggle. Photograph: courtesy of Teeth Logsdon-Wallace

The classroom assignment was one of thousands of Minneapolis student communications that got flagged by Gaggle, a digital surveillance company that saw rapid growth after the pandemic forced schools into remote learning. In an earlier investigation, the non-profit website The 74 analyzed nearly 1,300 public records from Minneapolis Public Schools to expose how Gaggle subjects students to relentless, round-the-clock digital surveillance, raising significant privacy concerns for more than 5 million young people across the country who are monitored by the company’s algorithm and human content moderators.

But technology experts and families with first-hand experience with Gaggle’s surveillance dragnet have raised another issue: the service is not only invasive; it may also be ineffective.

In mid-September, a school counselor called Logsdon-Wallace’s mother to let her know the system flagged him for using the word “suicide”. The meaning of the classroom assignment – that his mental health had improved – was seemingly lost in the transaction between Gaggle and the school district. He felt betrayed.

“I was trying to be vulnerable with this teacher and be like, ‘Hey, here’s a thing that’s important to me because you asked,’” Logsdon-Wallace said. “Now, when I’ve made it clear that I’m a lot better, the school is contacting my counselor and is freaking out.”

Jeff Patterson, Gaggle’s founder and CEO, said in a statement his company does not “make a judgement on that level of the context”, and it’s ultimately up to school administrators to “decide the proper response, if any”.

Minneapolis Public Schools first contracted with Gaggle in the spring of 2020 as the pandemic forced students nationwide into remote learning. Through AI and the content moderator team, Gaggle tracks students’ online behavior every day by analyzing materials on their school-issued Google and Microsoft accounts. The tool scans students’ emails, chat messages and other documents, including class assignments and personal files, in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. The remote moderators evaluate flagged materials and notify school officials about content they find troubling.

In Minneapolis, Gaggle flagged students for keywords related to pornography, suicide and violence, according to six months of incident reports obtained by The 74 through a public records request. The private company also captured their journal entries, fictional stories and classroom assignments.

Gaggle executives maintain that the system saves lives – more than 1,400 youths during the 2020-21 school year alone – though those figures have not been independently verified. Minneapolis school officials make similar assertions. While the pandemic’s effect on suicide rates remains unclear, suicide has been a leading cause of death among teenagers for years. Patterson, whose business has grown by more than 20% during Covid-19, said Gaggle could be part of the solution.

Schools nationwide have increasingly relied on technological tools that purport to keep kids safe, yet there’s a dearth of independent research to back up their claims that these tools are effective.

Logsdon-Wallace’s mother, Alexis Logsdon, didn’t know Gaggle existed until she got the call from his school counselor.

“That was an example of somebody describing really good coping mechanisms, you know, ‘I have music that is one of my soothing activities that helps me through a really hard mental health time,’” she said. “But that doesn’t matter because, obviously, this software is not that smart – it’s just like ‘Woop, we saw the word.’”

‘Random and capricious’

Many students have accepted digital surveillance as an inevitable reality at school, according to a new survey by the Center for Democracy and Technology in Washington DC. But some youth are registering their objections, including Lucy Dockter, a 16-year-old junior from Westport, Connecticut. On multiple occasions over the last several years, Gaggle has flagged her communications – an experience she described as “really scary”.

Gaggle sent her an email notification of “Inappropriate Use” while she was walking to her first high school biology midterm; her heart began to race as she worried about what she had done wrong. Dockter is an editor of her high school’s literary journal, and she says Gaggle flagged profanity in fiction submissions that students sent her.

“The link at the bottom of this email is for something that was identified as inappropriate,” Gaggle warned in its email, while pointing to one of the student’s stories. “Please refrain from storing or sharing inappropriate content in your files.”

But Gaggle doesn’t catch everything. The authors of the submissions didn’t receive similar alerts, she said. Nor did Gaggle’s AI pick up on the discrepancy when she wrote about it in a student newspaper article, in which she included a four-letter swear word to make a point. In the article, which Dockter wrote in Google Docs, she argued that Gaggle’s monitoring system is “random and capricious” and could be dangerous if school officials act on its findings.

Responding to the fact that the original authors weren’t notified of profanities in their submissions, Gaggle’s CEO blamed Google, which he said does not always “properly indicate the author of a document”.

Gaggle emailed a warning to Connecticut student Lucy Dockter for profanity in a literary journal article. Photograph: courtesy of Lucy Dockter

Gaggle’s algorithm relies on keyword matching that compares student communications against a dictionary of thousands of words the company believes could indicate potential problems. The company scans student emails before they’re delivered to their intended recipients, said Patterson, the CEO. Files within Google Drive, including Docs and Sheets, are scanned as students write in them, he said. In one instance, the technology led to the arrest of a 35-year-old Michigan man who tried to send pornography to an 11-year-old girl in New York, according to the company. Gaggle prevented the file from ever reaching its intended recipient.

Though the company allows school districts to alter the keyword dictionary to reflect local contexts, less than 5% of districts customize the filter, Patterson said.


That’s where potential problems could begin, said Sara Jordan, an expert on artificial intelligence and senior researcher at the Future of Privacy Forum in Washington. For example, language that students use to express suicidal ideation could vary between Manhattan and rural Appalachia, she said.

On the other hand, she noted that false positives are very common, especially when the system flags everyday swear words and fails to understand context.

“You’re going to get 25,000 emails saying that a student dropped an F-bomb in a chat,” she said. “What’s the utility of that? That seems pretty low.”
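The failure mode Jordan describes – flagging a word with no regard for the sentence around it – is what context-free keyword matching looks like. The sketch below is illustrative only: Gaggle’s actual system is proprietary, and the word list, function name and logic here are assumptions for demonstration. It shows why a recovery narrative like Logsdon-Wallace’s assignment trips the same wire as a message in crisis.

```python
import re

# Hypothetical flag dictionary -- a stand-in for the thousands of
# keywords a real monitoring product might track.
FLAGGED_KEYWORDS = {"suicide", "kill", "gay"}

def flag_document(text: str) -> set[str]:
    """Return every flagged keyword present in the text.

    The match is purely lexical: the function never considers
    whether the surrounding sentence describes a crisis,
    a recovery, or a work of fiction.
    """
    words = set(re.findall(r"[a-z']+", text.lower()))
    return words & FLAGGED_KEYWORDS

# A recovery narrative and a crisis message produce identical flags:
recovery = "Listening to that song helped me after my suicide attempt."
crisis = "I have been thinking about suicide."
print(flag_document(recovery))  # {'suicide'}
print(flag_document(crisis))    # {'suicide'}
```

Because both documents reduce to the same bag of words, the filter cannot distinguish them; any triage has to happen downstream, with human moderators or school staff supplying the context the algorithm discarded.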

Patterson said Gaggle’s proprietary algorithm is updated regularly “to adjust to student behaviors over time and improve accuracy and speed”. The tool monitors “thousands of keywords, including misspellings, slang words, evolving trends and terminologies, all informed by insights gleaned over two decades of doing this work”. Gaggle content moderators then review materials to gauge their risk levels.

In Minneapolis, officials denied that Gaggle infringes on students’ privacy and noted that the tool only operates within school-issued accounts. The district’s internet use policy states that students should “expect only limited privacy”, and that the misuse of school equipment could result in discipline and “civil or criminal liability”. District leaders have also cited compliance with the Clinton-era Children’s Internet Protection Act, which became law in 2000 and requires schools to monitor “the online activities of minors”.

But Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, argued the federal law was never intended to mandate student “tracking” through artificial intelligence. In fact, the statute includes a disclaimer stating it shouldn’t be “construed to require the tracking of internet use by any identifiable minor or adult user”. In a recent letter to federal lawmakers, her group urged the government to clarify the Children’s Internet Protection Act’s requirements and distinguish monitoring from tracking individual student behaviors.

Senator Elizabeth Warren, a Democrat from Massachusetts, agrees. In recent letters to Gaggle and other education technology companies, Warren and other Democratic lawmakers said they were concerned the tools “may extend beyond” the law’s intent “to surveil student activity or reinforce biases”.

Some critics have compared the surveillance tool to a new form of policing that, beyond broad efficacy concerns, could have a disparate impact on students of color. Algorithms have long been found to reinforce biases.

Data obtained by The 74 offers a limited window into Gaggle’s potential effects on different student populations. Though the district withheld many details in the nearly 1,300 incident reports, just over 100 identified the campuses where the involved students attended school. An analysis of those reports showed Gaggle was about as likely to issue incident reports in schools where children of color were the majority as it was at campuses where most children were white. It remains possible that students of color in predominantly white schools may have been disproportionately flagged by Gaggle or faced disproportionate punishment once identified. Broadly speaking, Black students are far more likely to be suspended or arrested at school than their white classmates, according to federal education data.

Gaggle and Minneapolis district leaders acknowledged that students’ digital communications are forwarded to police in rare circumstances. Jason Matlock, the Minneapolis district’s director of emergency management, safety and security, said that the district had interacted with law enforcement about student materials flagged by Gaggle on several occasions, often involving students sharing explicit photographs of themselves. Such images could trigger police involvement if officials classify them as child pornography. During a six-month period from March to September 2020, Gaggle flagged Minneapolis students more than 120 times for incidents related to what officials deem child pornography, according to records obtained by The 74. It is unclear whether any students faced legal consequences as a result.

Gaggle’s keywords could also have a disproportionate impact on LGBTQ children. In three dozen incident reports, Gaggle flagged keywords related to sexual orientation including “gay” and “lesbian”. On at least one occasion, school officials outed an LGBTQ student to their parents, according to a Minneapolis high school student newspaper article.

“They have ‘gay’ flagged to stop people from looking at porn, but one, that is going to be mostly targeting people who are looking for gay porn and two, it’s going to be false-positive because they are acting as if the word gay is inherently sexual,” said Logsdon-Wallace, the 13-year-old student. “When people are just talking about being gay, anything they’re writing would be flagged.”

The service could also end up disproportionately surveilling low-income families, he added. Logsdon-Wallace said he knows students who rely on school devices for personal uses because they lack technology of their own. Among the 1,300 Minneapolis incidents contained in The 74’s data, only about a quarter were reported to district officials on school days between 8am and 4pm.

“That’s definitely really messed up, especially when the school is like, ‘Oh no, no, no, please keep these Chromebooks over the summer,’” an invitation that gave students “the go-ahead to use them” for personal reasons, he said.

“Especially when it’s during a pandemic, when you can’t really go anywhere and the only way to talk to your friends is through the internet.”
