Nine University of Miami students were called into a Zoom meeting with the dean to discuss their participation in a protest over the school's coronavirus policies.
In late September, doctoral student Mars Fernandez-Burgos and eight other students received an email from the assistant to the dean of students asking them to attend a Zoom meeting to discuss the “incident that happened on September 4,” reports WIRED.
On September 4, Fernandez-Burgos took part in a campus protest regarding coronavirus protections put in place at the school and sick pay for contract employees such as cafeteria workers and janitors. After the protest, Fernandez-Burgos said administrators ignored requests for a meeting, so she was unsure why the dean was reaching out several weeks later.
The dean, Ryan Holmes, said the meeting was to discuss “policies around use of university space” and that it “should not last long, is not designed to dictate content, and is not adversarial in nature.”
At the meeting, Fernandez-Burgos said, Holmes told the group that he had asked them to meet because they hadn't booked campus space for the protest. While the content of the protest wasn't an issue, he said, officials were concerned about liability and security.
Fernandez-Burgos and the other students have accused the University of Miami Police Department (UMPD) of using an undisclosed facial recognition system at the protest to identify those invited to the meeting. According to WIRED, officials deny using the tech but documents suggest campus police have had access to facial recognition databases.
In 2019, the Orlando Sentinel reported that UMPD uses the Face Analysis Comparison & Examination System (FACES), a 33-million-photo database drawing on driver's license images and law enforcement photos. As recently as October 15, UMPD Chief David Rivero's résumé included references to cameras equipped with "motion detection, facial recognition, object detection and much more," according to Fight for the Future, a digital rights advocacy group.
“He is literally describing facial recognition and saying it’s not facial recognition,” said Evan Greer, deputy director of Fight for the Future.
Fernandez-Burgos and Esteban Wood, another student who participated in the protest, said Holmes told them that campus police used software to analyze camera footage from the protest to identify the students. Although none of the students were disciplined, Wood claims they were flagged because they criticized the school.
In late October, Greer and the protesting students penned an open letter to the school asking it to ban facial recognition technology. The letter was signed by more than 20 privacy and civil liberties organizations.
According to Fight for the Future, more colleges have adopted facial recognition technology for security, remote exam proctoring, and public health measures aimed at fighting the coronavirus. In July, the Security Industry Association (SIA) announced its strong opposition to the recently introduced Facial Recognition and Biometric Technology Moratorium Act, which would impose a blanket ban on most federal use of nearly all biometric and related image analytics technologies. The following month, SIA released new policy principles to guide the development and deployment of facial recognition technology.
However, not all have been quick to support the evolving technology. Over 60 schools, including MIT, Harvard, Brown, Columbia, and Florida State University, said they are not currently using facial recognition and don’t plan to do so. In February, UCLA dropped its plan to use the technology on campus.
Back in September, in less than a week, more than 1,000 parents signed an open letter calling for a ban on the technology in K-12 schools, citing concerns about children's safety. The letter specifically highlights the unknown psychological dangers of constant surveillance, the potential physical danger if someone gains access to the data collected by facial recognition systems, and how the technology could worsen discrimination against students of color, girls, and gender-nonconforming kids.
In May 2019, San Francisco banned its government from using the technology. In June 2020, Boston did the same. That same month, Amazon halted law enforcement's use of its facial recognition technology for one year. The company, however, said it will continue to let organizations such as Thorn, the International Center for Missing and Exploited Children, and Marinus Analytics use its technology to help rescue human trafficking victims and reunite missing children with their families.