School Safety Tech is EdTech

Civics of Tech Announcements

  1. Monthly Tech Talk on Tuesday, 11/12/24 (Note the updated date!). Join our monthly tech talks to discuss current events, articles, books, podcasts, or whatever we choose related to technology and education. There is no agenda or schedule. Our next Tech Talk will be on Tuesday, November 12th, 2024 at 8-9pm EST/7-8pm CST/6-7pm MST/5-6pm PST. Learn more on our Events page and register to participate.

  2. Book Club, Tuesday, 12/17/24 @ 8 EST - We’re reading Access Is Capture: How Edtech Reproduces Racial Inequality by Roderic N. Crooks. Charles Logan is hosting, so register now to reserve your spot!

  3. Spring Book Clubs - The next two book clubs are now on the schedule and you can register for them on the events page. On February 18th, Allie Thrall will be leading a discussion of The Propagandists' Playbook. And on April 10th Dan Krutka will be leading a discussion of Building the Innovation School, written by our very own CoT Board Member Phil Nichols.

  4. Call for Chapters: Civics of Tech contributor, Dr. Cathryn van Kessel, asked us to share the following call for chapters: “Rewiring for Artificial Intelligence: Philosophies, Contemporary Issues, and Educational Futurities.” This edited collection seeks to explore how we can critically rewire our approaches to AI, ensuring that it not only integrates “seamlessly” into and across different educational contexts but also raises “unseemly” philosophical and ethical questions. This volume will focus, in part, on the rapidly evolving field of Artificial Intelligence and its implications for higher education, teacher education, and/or K-12 public schooling systems. Submit proposals here by November 29.

By Erin Anderson

I just left the Association for Educational Communications & Technology (AECT) International Convention, where George Veletsianos gave a thought-provoking keynote on the need to reimagine how we think about educational technology. It made me think of a type of technology rarely discussed at education and technology conferences, perhaps because it is not often considered educational technology. Prominent conferences such as AECT, ISTE, and SITE are devoted to exploring, developing, interrogating, and contextualizing educational technology. Yet while they all have divisions and SIGs, such as Learning and Instruction, Design and Development, Emerging Technologies, and Educational Policy, none has a division or SIG specifically dedicated to exploring, developing, and evaluating one of the largest and costliest categories of technology used in schools: school safety technology. These technologies usually fall under the umbrella of law enforcement and criminal justice: AI-powered security cameras, passive weapon scanners, facial recognition software, and smart vaping sensors like the HALO smart sensor, which claims to detect both vape smoke AND bullying. They can also arrive when school districts partner with former U.S. Special Forces to scrape students' social media. However, these technologies are increasingly finding their way into the classroom. Little rigorous research has been conducted on their efficacy, yet they significantly impact students’ learning. After the Apalachee school shooting, where one such technology, the CENTEGIX security badge, contributed to the rapid law enforcement response, there are growing calls for these technologies to be mandated at the federal level.

Yet these technologies must be rigorously vetted to ensure they don’t negatively impact student learning and wellbeing. Students cannot learn if their schools are inadvertently put on lockdown due to a faulty school safety badge, as has happened at Green Bay West High School, Plainfield East High School, Lodi High, Don Estridge Middle, North Valleys High School, Lanphier High School, Jacksonville High, and Ridgeview Middle School. These accidental lockdowns disrupt learning, send out hundreds of alerts that terrify parents, and can cause extreme anxiety in children. During one accidental lockdown, a terrified child texted his mother, “We are going to die. I love you.”

AI-enhanced school safety technology is exploding as states pass versions of Alyssa’s Law, which first passed in Florida after the Marjory Stoneman Douglas High School shooting. While thousands of students walked out of their classrooms to push legislators to enact better gun control, the parents of Alyssa Alhadeff, who died in that shooting, took a different tack. They made it their mission to make schools safer by adding safety features to schools, including metal detectors, bullet-proof glass, and enhanced fences. This evolved into partnerships with major technology providers, like Raptor Technologies, to create school safety technologies such as alert apps and security badges. These apps and badges are downloaded to teachers' devices or, in the case of Centegix, worn around teachers' necks, and can be triggered in the event of a school shooting. In fact, law enforcement credits the rapid response during the Apalachee school shooting to the Centegix badge. GBI Director Chris Hosey said, “this system activated today prevented this from being a much larger tragedy.”

Legislators love Alyssa’s Law because it gives them something to turn to in the wake of school shootings. When people demand better gun control to stop school shootings, legislators can point to their efforts to pass Alyssa’s Law, which requires school districts to install silent panic alarms. These products can then be bundled, so districts purchase the silent panic alarms and get the anti-vaping/anti-bullying sensor at a discounted rate. As technology companies rush to push their products into school districts trying to comply with the new legislation, many privacy and ethics concerns go unaddressed. For example, with the HALO smart sensor, which claims to detect vaping and bullying, how does a sensor operationalize something as subjective as bullying? When I posed this question to one principal, I was told that district leaders could program the sensor to listen for preidentified words, like an always-listening Alexa, which raises the question: which words signal that bullying is happening? There have already been problems with computer surveillance tech such as GoGuardian and Gaggle accidentally outing LGBTQ students. Some of these technologies, like Centegix, are partnering with large data-mining operations such as AXON Enterprises, which VICE already called out for being on a mission to “surveil America with AI.” The company behind the Centegix school badge, developed to alert authorities to a school shooter, has admitted that the majority of its uses were to alert school admin to disruptive students or medical emergencies. One teacher I spoke with told me she loved Centegix because she could secretly push the button and call the admin to her room to remove disruptive students. While this might make the teacher feel good, it does little to address the deeper systemic issues involved in developing a positive and inclusive classroom culture. And I’m not sure the teacher’s students feel any safer when their teacher manages the class by pushing a button rather than by building relationships to create a supportive classroom environment.

Lori Alhadeff, Alyssa’s mother and the driving force behind Alyssa’s Law, said she prefers mandating school safety technologies over gun control. While having immediate access to law enforcement might make many teachers and parents feel safer, it fails to answer the question of why schools are not safe in the first place. I’m not advocating for the end of Alyssa’s Law or the cessation of these technologies, but I think the education and technology academic community needs to pull the research on these technologies away from law enforcement and into our field. One way this can happen is for major education and technology academic conferences to develop divisions or SIGs that promote research on the efficacy of these technologies. Schools need help vetting these technologies to ensure they abide by child data privacy laws. Our field has the know-how to do this. The Civics of Technology community has the criticality needed to vet these technologies. Researchers like Audrey Watters, Marie Heath, and Aman Yadav have already connected school safety technology with edtech. It’s time for us all to start thinking about school safety technology as educational technology.


*This work was informed by my rich discussions with Dr. Roxana Marachi.
