by Lois Beckett in Oakland edited by O Society October 25, 2019
For Adam Jasinski, a technology director for a school district outside of St Louis, Missouri, monitoring student emails used to be a time-consuming job. Jasinski used to do keyword searches of the official school email accounts for the district’s 2,600 students, looking for words like “suicide” or “marijuana”. Then he would have to read through every message that included one of the words. The process would occasionally catch some concerning behavior, but “it was cumbersome”, Jasinski recalled.
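The manual process Jasinski describes, searching messages for a fixed keyword list and then reading every hit by hand, amounts to a simple text filter. A minimal sketch in Python (the keyword list, message format and function names here are illustrative assumptions, not Bark's or any district's actual system):

```python
# Illustrative sketch of keyword-based message flagging.
# FLAG_WORDS and the plain-string message format are assumptions
# made for this example, not an actual vendor implementation.
FLAG_WORDS = {"suicide", "marijuana"}

def flag_messages(messages):
    """Return the messages containing any flagged keyword (case-insensitive)."""
    flagged = []
    for msg in messages:
        words = set(msg.lower().split())
        if words & FLAG_WORDS:  # any overlap with the watch list
            flagged.append(msg)
    return flagged
```

Every flagged message still has to be read by a person, which is why a district of 2,600 students made this "cumbersome": the filter has no sense of context, so a health-class essay trips it as readily as a cry for help.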
Last year Jasinski heard about a new option: following the school shooting in Parkland, Florida, the technology company Bark was offering schools free, automated, 24-hour-a-day surveillance of what students were writing in their school emails, shared documents and chat messages, and sending alerts to school officials any time the monitoring technology flagged concerning phrases.
The automated alerts were a game-changer, said Jason Buck, the principal of the Missouri district’s middle school. One Friday evening last fall, Buck was watching television at home when Bark alerted him one of his students had just written an email to another student talking about self-harm. The principal immediately called the first student’s mother: “Is the student with you?” he asked. “Are they safe?”
Before his school used Bark, the principal said, school officials would not know about cyberbullying or a student talking about hurting themselves unless one of their friends decided to tell an adult about it. Now, he said, “Bark has taken that piece out of it. The other student doesn’t have to feel like they’re betraying or tattling or anything like that.”
Although students at his school are aware they’re being monitored, they were surprised at first at how quickly school administrators could follow up on what they had typed, Buck said. “It’s not, ‘Hey, I sent this email two days ago,’ [it’s] ‘You just sent this email three minutes ago, let’s talk.’”
Bark and similar tech companies are now monitoring the emails and documents of millions of American students, across thousands of school districts, looking for signs of suicidal thoughts, bullying or plans for a school shooting.
The new school surveillance technology doesn’t turn off when the school day is over: anything students type in official school email accounts, chats or documents is monitored 24 hours a day, whether students are in their classrooms or their bedrooms.
Tech companies are also working with schools to monitor students’ web searches and internet usage, and, in some cases, to track what they are writing on public social media accounts.
Parents and students are still largely unaware of the scope and intensity of school surveillance, privacy experts say, even as the market for these technologies has grown rapidly, fueled by fears of school shootings, particularly in the wake of the Parkland shooting in February 2018, which left 17 people dead.
Digital surveillance is just one part of a booming, nearly $3bn-a-year school security industry in the United States, where Republican lawmakers have blocked any substantial gun control legislation for a quarter century.
“Schools feel massive pressure to demonstrate that they’re doing something to keep kids safe. This is something they can spend money on, roll out and tell parents, this is what we’re doing,” said Chad Marlow, a privacy expert at the American Civil Liberties Union.
Unlike gun control, Marlow said, “Surveillance is politically palatable, and so they’re pursuing surveillance as a way you can demonstrate action, even though there’s no evidence it will positively impact the problem.”
There is still no independent evaluation of whether this kind of surveillance technology actually works to reduce violence and self-harm. Privacy experts say pervasive monitoring may hurt children, and may be particularly dangerous for students with disabilities and students of color.
Despite the lack of research evidence, tech companies are marketing school monitoring technologies with bold claims of hundreds of lives saved, mostly through prevention of youth suicide attempts.
Gaggle, a leading provider of school email and shared document monitoring, says its technology is currently used to monitor 4.5 million students across 1,400 school districts. The company claims that in the last academic year alone its technology “helped districts save the lives of more than 700 students who were planning or actually attempting suicide”.
Bark says it works with at least 1,400 school districts across the country, and claims its technology has helped prevent “16 credible school shootings” and detected “twenty thousand severe self-harm situations”.
Securly, another leading provider, says its products are used to protect 10 million students across 10,000 individual schools. In the past year, Securly said it helped school officials intervene in 400 situations that presented an “imminent threat”.
The companies’ statistics on lives saved are based on their own anecdotal data, and have not been independently evaluated.
“I heard from a lot of districts in the weeks after Parkland, they were getting nonstop email solicitations from all sorts of brand new, or fairly new companies specializing in social media saying, ‘We can fix your problems,’ and a lot of them were adopting it,” Amelia Vance, the director of education privacy at the Future of Privacy Forum, said.
“Some people think technology is magic, artificial intelligence will save us,” Vance said. “A lot of the questions and a lot of the privacy concerns haven’t [been] thought of, let alone addressed.”
How it works
In Florence, South Carolina, school officials intervened after a middle school student started writing about suicide while working on an in-class English assignment. The phrases she typed in a Google document triggered an alert from Gaggle, the surveillance company working with the school district. “Within minutes”, the student was pulled out of class for a conversation with school officials, according to Dr Richard O’Malley, the district superintendent.
In Cincinnati, Ohio, the school district’s chief information officer had to call the police in the middle of the night to conduct a wellness check on a student who had been flagged by Gaggle for writing about self-harm. The situation was serious enough the student was hospitalized to receive mental health services, the chief information officer, Sarah Trimble-Oliver, said.
In rural Weld county, Colorado, a school official got an alert from GoGuardian, a company that monitors students’ internet searches, that a student was doing a Google search for “how to kill myself” late one evening. The official worked with a social worker to call law enforcement to conduct an in-person safety check at the student’s home, said Dr Teresa Hernandez, the district’s chief intervention and safety officer. When the student’s mother answered the door, she was confused, and said her child had been upstairs sleeping since 9pm. “We had the search history to show, actually, no, that’s not what was going on,” Hernandez said.
Federal law requires American public schools to block access to harmful websites, and to “monitor” students’ online activities. What exactly this “monitoring” means is not clearly defined: the Children’s Internet Protection Act was passed nearly 20 years ago, driven in part by fears that American children might look at porn on federally funded school computers.
As technology advances and schools integrate laptops and digital technology into every part of the school day, school districts largely define for themselves how to responsibly monitor students on school-provided devices – and how aggressive they think monitoring should be.
Schools have faced lawsuits by parents of students who have committed suicide and by parents of children who have been cyberbullied, said Vance, the student privacy expert.
“Schools are almost in a damned if you do, damned if you don’t situation. If they choose to be more privacy protective they could be sued, but on the other hand, they could be sued for over-surveilling,” she said.
Bark’s decision, following the Parkland shooting, to give away free email, chat and shared document monitoring to any school district that wanted it was partly altruistic, an effort to respond to a horrifying crisis, said Titania Jordan, the company’s “chief parenting officer”. It was also partly strategic: the company hoped that providing a free service for school districts would make Bark a trusted brand with parents, helping sales of its for-profit parent surveillance products, which it markets for $9 a month.
Other companies, some of which offer schools human analysts who help review the automated alerts, charge districts thousands or tens of thousands of dollars.
The amount American public school districts spend on email and document monitoring services appears to have increased sharply from 2013, the year after a mass shooting at Sandy Hook elementary school, to 2018, from nearly $4m to more than $8m, according to an analysis of purchasing contracts between just two major monitoring companies, Gaggle and Securly, and roughly 250 school districts. These numbers appear to be an undercount of the full size of the market, according to the Brennan Center for Justice, the progressive advocacy group that compiled and analyzed the purchasing records.
As of 2018, at least 60 American school districts had also spent more than $1m on separate monitoring technology to track what their students were saying on public social media accounts, an amount that spiked sharply in the wake of the 2018 Parkland school shooting.
Photograph: Charles Whitman, the sniper in the clock tower at the University of Texas, 1966
Values at stake
Some proponents of school monitoring say the technology is part of educating today’s students in how to be good “digital citizens”, and that monitoring in school helps train students for constant surveillance after they graduate.
“Take an adult in the workforce. You can’t type anything you want in your work email: it’s being looked at,” Bill McCullough, a Gaggle spokesperson, said. “We’re preparing kids to become successful adults.”
Experience with school monitoring is a “training ground” that might mean that students “won’t lose their job for sharing inappropriate content”, said Trimble-Oliver, the chief information officer for Cincinnati’s public school district, which uses Gaggle.
Students “need to know that organizations can and probably are monitoring their content”, she said.
Privacy experts called these arguments “concerning”, and noted there are legal limits to how companies can monitor employees’ work emails.
“The idea everything students are searching for or everything they’re writing down is going to be monitored by their school can really inhibit growth and self-discovery,” Natasha Duarte, a policy analyst at the Center for Democracy and Technology, said.
For black students, and students with disabilities, who already face a disproportionate amount of harsh disciplinary measures, the introduction of new kinds of surveillance may be especially harmful, privacy experts said.
Both machine-learning algorithms and human analysts are at risk of misunderstanding what students write – particularly if the human analysts are older, or from different cultural backgrounds than the students they are monitoring, experts said. If digital surveillance companies scanning students’ emails and chats misinterpret their jokes or sarcasm as real threats, that “could expose students to law enforcement in a way they have not been in the past”, said Elizabeth Laird, the senior fellow for student privacy at the Center for Democracy and Technology.
The consequences of involving law enforcement in responding to what students type on their school computers are a particular concern in a country where more than 40% of schools have police officers inside school buildings serving as dedicated “school resource officers”, Vance said.
In some cases, surveillance companies monitoring students may themselves directly contact local law enforcement officials to take action if they’re concerned a threat is serious, and if school officials have given them permission to do so. The data surveillance companies are collecting on students may also be shared with law enforcement.
Securly, one of the leading educational surveillance companies, makes it possible for the human analysts who evaluate potentially troubling student messages to look back at an individual student’s browsing and search history, allowing them to connect the dots between what students are reading, writing, searching for and, in some cases, posting on social media.
Securly will share information with law enforcement “if there’s a warrant or subpoena”, Mike Jolley, the company’s director of K-12 Safety, said. Data about individual students are deleted when the students graduate, or at a school’s request, Jolley said.
In the United Kingdom, school surveillance technology has already been tested for use in counter-terrorism efforts. Impero, a British education software company, piloted its monitoring technology as a counter-terrorism tool, flagging children for using phrases like “Jihadi bride”, “War on Islam”, or “You only die once”, the Guardian reported in 2015.
The company billed this as a “de-radicalisation effort” that would help teachers and other officials identify “vulnerable children” or “children that may be at risk in the future”.
“It’s certainly fair to ask to what extent we feel comfortable with technologies first developed for use in war being used against our children,” Marlow, the ACLU expert, said.
It’s not clear what kind of “chilling effect” the monitoring might have on students’ self-exploration, their group conversations and their academic freedom, Marlow said. If students know their schools are monitoring their computer usage, will LGBTQ students in conservative school districts feel comfortable researching their sexuality? What about young Trump supporters in liberal school districts who want to do some political research?
“Schools don’t post on a bulletin board outside the principal’s office, ‘Here are the words we’re searching for,’” Marlow said. “It forces students to be careful. They might not write about things or talk about things that are not, in fact, being monitored.”
Shifts in culture
School officials say that their primary motivation for using surveillance technology is the chance to save a student’s life. But schools are monitoring students’ digital documents in real time for a wide range of content they see as problematic, from swear words to nude images and pornography to cyberbullying to evidence of drug and alcohol use.
In Weld county, Colorado, a student emailed a teacher that she heard two boys were going to smoke weed in a bathroom, Hernandez, the student services and safety director, said. Gaggle immediately alerted school officials: “Within four minutes of her sending this email, the troops had deployed,” she said.
Gaggle also automatically sends students a scolding email any time they use a profanity.
A few school districts have chosen not to send students Gaggle’s warnings about swear words, some because they’re concerned that if students are reminded that they’re being monitored, “the children will then resort to other tools to communicate, and they’ll miss the life-threatening issues they could have intervened in,” McCullough, the Gaggle spokesperson, said.
McCullough said these fears were misplaced, and that the company had seen little evidence that the students being surveilled on school devices had switched to other forms of communication.
“Kids who have used us in their districts for years and years still use these tools to communicate their innermost thoughts, because they’re hoping that their cries for help are answered, and they’re not comfortable communicating the way adults communicate, face-to-face,” McCullough said.
O’Malley, the South Carolina superintendent whose district uses Gaggle, had a different impression of how kids in his district had reacted to the new surveillance.
“Once the kids know they’re being Gaggled, they’re being watched 24-7, they tend to be more proactive in watching what they do,” O’Malley said.
He said he had heard students in the district using Gaggle, the surveillance company’s name, as a verb: “We can’t do that. We’re being Gaggled.”
Drawing the line
While parents are often “grateful” for the information that comes from an intervention, Buck, the Missouri principal, said, students can be “a little bit upset sometimes. They feel like there’s a little bit of that privacy issue. But over the course of time they see we’re really trying to help, especially when we’re talking about the issue of self-harm.”
Some school surveillance companies defended their products as more sensitive to students’ privacy than their competitors – or the students’ own parents.
“Some parents want technology that will give them an exact record of every single text, every single email,” Jordan, Bark’s chief parenting officer, said. But Bark does not offer that, she said: “We only alert parents and schools when there is a real issue that they need to know about.”
For Gaggle, McCullough said, the bright line was offering monitoring of only students’ official school emails and school documents.
“We shouldn’t be looking at their private email. We shouldn’t be looking at their private social media posts. But in the school, with school-issued tools, we should protect them,” McCullough said.
Securly, in contrast, offers a free app for parents in the districts that use its technology that allows them to see exactly what websites their children have visited, what Google searches they have made, and what videos they are watching on YouTube, Jolley, Securly’s safety director, said.
At the moment, the company has no privacy protections for LGBTQ students who might need to search for information without their parents knowing about it, although Jolley said that was a concern the company was actively discussing.
While school surveillance companies tout products informed by “machine learning”, and “artificial intelligence”, school officials say that the technology is still far from perfect, and that they have received plenty of false alerts, like getting red flags when students tell each other sarcastically to “kill yourself”, talk about the band Suicide Boys, or have to write a school assignment on the classic American novel To Kill a Mockingbird.
An investigation into school surveillance by Education Week examined nearly 3,000 alerts Gaggle sent to one school district in Michigan. Most of these were “minor violations”, including a dozen students who stored or sent files that contained the word “gay”, and one who used the word “bastard” in a school assignment about the Odyssey.
After he received multiple alerts from the same group of middle school boys who were sending edgy memes to each other, sometimes as late as 3am, Buck, the Missouri principal, called the boys into the office and encouraged them to find another way to share their jokes – perhaps a group text message – that would not constantly alert their principal.