In addition to wearing masks and social distancing, students on campus are expected to wear a coin-sized "BioButton" attached to the chest with medical adhesive. The device would continuously measure their temperature, breathing rate, and heart rate, and tell them whether they had been in close contact with another button wearer who tested positive for Covid-19. Coupled with a series of daily screening questions, the button would let them know whether they were cleared for class.
Dixon, a senior resident advisor, said the late-July email was the first he or his friends at the university north of Detroit had heard of the BioButton. "Nobody I spoke to liked the idea of having to wear something to be on campus," he said. "They wondered how secure the information was and who would have access to it."
One friend worried about what would happen if he attended a Black Lives Matter protest that turned violent. Would he be tracked down and disciplined? Would sleeping on the other side of a thin dorm wall from an infected student land someone in quarantine unnecessarily?
Dixon started a petition on Change.org urging Oakland to give students the option to opt out. Angry reactions to the BioButton requirement came from students and parents alike. The college had invaded their privacy, they wrote. They would rather drop out than wear the button. The college had gone communist.
"I went to bed with 100 signatures, and when I woke up it had exploded, and a man on a far-right talk show wanted to give me an award," says Dixon.
Oakland isn't the only institution seeing this kind of pushback. The pandemic has led many colleges to swiftly put in place surveillance tools that could help limit the spread of the virus or mitigate its impact on learning as students move from classrooms to their bedrooms. Some students, who have to flash Covid-free badges to enter classrooms or turn their laptops to let online test proctors scan their bedrooms, are fed up with feeling watched. And some are suspicious of how the information gathered will be used, whether it could leak, and whether there is a process to destroy it once the pandemic is over.
This caution is not limited to students. Colleges striving to keep students healthy and on track academically have put in place a mass-surveillance structure that doesn't just go away, and that can have lasting effects on the student experience. "Tracking technology tends to linger after its original purpose has faded," says Sarah E. Igo, a professor of history at Vanderbilt University who studies surveillance and privacy. "It should be clear that these are temporary, exceptional measures. We have to be just as careful about how we wind them down as about how we set them up."
Oakland officials regret that information about the BioButton was shared before they could explain what it did and didn't do. Only the wearers would have access to their specific data, and the close-contact alerts would be based on Bluetooth proximity rather than GPS location tracking. In other words, the device does not track a student's specific location; it only detects whether it is within Bluetooth range (roughly 15 feet) of another BioButton. Given the backlash, the university agreed to "encourage" rather than mandate its use.
David A. Stone, a professor of philosophy and the chief research officer at Oakland, led the team that vetted and selected the BioButton. In his view, providing health information is a relatively small price to pay to stop the spread of a virus that has ravaged the nation.
"Given the hundreds of thousands of people who have died in this pandemic, is it too much to ask to share your heart rate or temperature?" he asked. He said wearable technology is the least invasive way to catch symptoms early and give students tools to identify whether they may have early signs of Covid-19 or possible exposure to it.
Officials elsewhere worry that the solutions, heavily marketed in the first few months of the pandemic, could cause more problems than they solve. The University of Maryland at College Park considered, but decided against, using technology that senses a person's temperature or location. One company offered an internet-connected thermometer that could help the campus predict where the virus was spreading. But some faculty members feared the company would sell the personal information it collected.
"Heaven forbid the thermometer knows that you have a fever, and suddenly you're receiving direct mail for NyQuil or Clorox wipes," says Neil Jay Sehgal, an assistant professor of health policy and management at Maryland.
There's a difference between posting information yourself, often a carefully curated version of the life you want to convey, and being required by a proctoring service to scan your bedroom for cheat sheets or open books before taking a test, says Chris Gilliard, an English professor at Macomb Community College in Warren, Mich., who studies privacy and inequality.
"For a long time we believed in the myth that students don't care about these issues. Now it's impossible to ignore how they push back," he says.
At some colleges, including the City University of New York and the University of Illinois at Urbana-Champaign, students have circulated petitions demanding that online proctoring systems be removed from their classrooms.
After around 1,000 Urbana-Champaign students protested against the systems, the university announced last month that Proctorio software would no longer be used after the summer 2021 term. That doesn't mean anti-cheating software is off the table. A campus spokesman said the short-term license signed with Proctorio last March, as a Covid-related emergency measure, would not be renewed, but that other options for remote proctoring would be explored.
Some colleges have argued that remote learning left them no other way to ensure the integrity of exams. Critics, however, say this is an excuse.
"Much of the technology being deployed is something that schools have done, or wanted to do, in the past but couldn't get approved," says Gilliard. "The pandemic has been a convenient excuse to push these technologies through."
And they now have a special incentive, he says. "Surveillance is really about power and control, and universities are looking for certainty in very uncertain times. There was no surefire way to get students back on campus." But instead of keeping campuses closed and taking the political heat, Gilliard says, "institutions have been looking for a technological solution where there is none."
Menlo College, in Atherton, Calif., does not claim that its newest technology tool is a panacea. But it hopes to help students with a smartphone app that watches for signs of anxiety and depression.
With fewer than 900 students, the private Silicon Valley college prides itself on providing personalized attention, but Covid-19 left students distracted and isolated. So Menlo worked with a start-up, Ellipsis Health, to encourage students to try an app that uses machine learning to flag people whose speech matches the voice patterns of people with depression. Students first record themselves for two to three minutes. Each time they log into the app, they are asked a series of questions. Depending on how they score for anxiety and depression, they may be prompted to relax with a meditation tape or to call a crisis hotline.
College officials emphasize that a machine, not a person, is listening, and that the student is the only one who receives the individual feedback.
Ellipsis and the college worked with student leaders to shape an approach that would raise as few privacy flags as possible. "They were very receptive to what students wanted and were comfortable with," says Lina Lakoczky-Torres, an entrepreneur who serves as the wellness advocate for the college's student government. "It feels like it's our baby as much as theirs."
Students didn't want mental-health counselors listening in, she said, and they wanted to add their own questions to assess their mental health, such as how stressed they were by posts and "likes" on social media. "The technology is very scary, but this comes from a place of wanting to help," says Lakoczky-Torres.
Students embraced the technology, she said, because they had a role in developing it and felt in control of the data being collected. When that's not the case, and students suspect their personal lives are being scrutinized by companies that care more about profits than their well-being, they are likely to rebel.
Proctoring software, which faculty members can customize, typically scans students' rooms, locks their computer browsers, and monitors eye and head movements through their webcams while they take tests.
Critics complain that using such software signals to students that faculty members don't trust them. Some students also say that the possibility of being flagged for "suspicious" activity adds to the stress of taking a test and sometimes triggers panic attacks.
"I've been flagged a few times for moving or for taking a second to look away while thinking," says Olivia Eskritt, a sophomore at St. Clair College in Windsor, Ontario, whose class was using proctoring software.
Before starting a test, students had to pick up their laptops and pan them around their rooms to show they hadn't posted cheat sheets on the walls, she says. They also had to record themselves so the system could tell whether someone was feeding them answers. "My mom walked into the room while I was in the middle of a test, and I said, 'Oh no, you're going to get me in trouble!'" Eskritt worried, meanwhile, that her father's booming ex-military voice would trip the anti-cheating software while he was on a Zoom work call nearby.
Critics say Black and brown students face even more obstacles, one of the complaints from students protesting at the University of Illinois at Urbana-Champaign. Studies have shown that facial-recognition software sometimes struggles to identify the faces of dark-skinned students.
Alivardi Khan, who recently graduated from Brooklyn Law School, found out the hard way.
The @ExamSoft software cannot "recognize" me due to "poor lighting" even though I am sitting in a well-lit room. I'm starting to think it has nothing to do with lighting. Pretty sure we all predicted that their facial recognition software wouldn't work for people of color. @DiplomaPriv4All
– Alivardi Khan (@uhreeb) September 8, 2020
Khan says he spent much of the week leading up to the New York State bar exam trying to get ExamSoft, the proctoring system, to recognize him. "I tried sitting in front of a window when the sun was shining in, then went into a bright bathroom with light bouncing off white tiles," he says. After getting help from a customer-service representative, the system finally recognized him.
Although Brooklyn Law School gave him a room in which to take the bar exam, Khan brought a lamp just in case. Forced to sit still for so long, he kept triggering the room's automatic lights to shut off. "I had to flap my arms to make them come back on," he says, creating another potential cheating flag. "We had a 15-minute break between sections, and I used it to call ExamSoft customer service." All in all, quite a stressful experience, he says.
Britt Nichols, ExamSoft's chief revenue officer, says poor lighting can cause problems with recognizing people's faces, but that there's no evidence the problem is worse for people with darker skin.
"Once in a very blue moon, it doesn't recognize your face," he says. "Some people assume that something nefarious is involved," he added, when the problem could be a poor internet connection.
Students with disabilities have also complained that something like a facial tic or other unexpected movement could get them flagged. Some have reported that the browser-lockdown feature can limit the use of tools that convert text to speech.
The proctoring services say instructors have the option to accommodate special needs, for example by turning off the camera or giving students a short break during an exam. Realistically, though, faculty members struggling with the technological demands of online courses might find it difficult to make such accommodations.
Some faculty members have made it clear that they do not intend to use anti-cheating software.
Derek A. Houston, an associate professor of educational leadership at Southern Illinois University at Edwardsville, said he was alarmed to learn that the state's college association had issued a request for proposals, worth $44 million over five years, to fund two online-proctoring programs. Houston wanted to signal to his students, his colleagues, and higher education generally that he believed online proctoring sets the wrong tone.
His message on Twitter: "You don't have to worry about this kind of unnecessary surveillance. We will build mutual trust and expectations in the classroom. My goal is collective growth, and surveillance is the opposite of that."
Students and faculty members are not the only ones resisting. A group of Democratic senators wrote in December to three online-proctoring companies, wanting to know how they protect student privacy and ensure that students, including those with disabilities or darker skin, are not falsely accused of cheating.
In response to such concerns, the proctoring companies have argued that getting rid of their tools would lead to widespread cheating.
In an interview, Proctorio founder and CEO Mike Olsen says that a lot of criticism of proctoring software is based on misconceptions.
"We don't kick someone out of an exam if they speak or stand up" to go to the bathroom, he says. The system simply flags the interruption, which a faculty member can review later. A student with a shaky internet connection can be disconnected for up to two minutes and still return to the exam; anyone offline longer, however, poses too high a risk of cheating. That raises equity issues of its own, since disadvantaged students with spotty Wi-Fi are more likely to experience prolonged outages.
Fairness challenges will arise even without his software, Olsen says. Some students get upset when professors tell them they're using the honor system, because they know some classmates will pull answers from online tutoring services like the subscription-based Chegg, which not everyone can afford.
He advises instructors to explain to students why they need to use certain features, such as cameras, that may be uncomfortable. "Perhaps accreditation requires a certain level of test security; communicate that. Students just want to know why."
In a 2018 op-ed for The Washington Post, Mitchell E. Daniels Jr., president of Purdue University, noted that the university's technology infrastructure, designed to support student success, campus services, and research, produces "a huge amount of fascinating information" as a byproduct.
"Forget that ominous old line, 'We know where you live,'" he wrote. "These days it's, 'We know where you are.'"
The dilemma Daniels posed then is one many are still puzzling over: "Many of us will have to pause and wonder whether our good intentions will carry us across borders where privacy and individual autonomy should still prevail."
This question comes up often in discussions of location tracking and facial-recognition tools. In September, some Brown University students were alarmed to receive erroneous emails from the administration accusing them of living in Providence when they had said they would attend remotely. The students were accused of violating the student code of conduct, which requires campus residents to follow strict Covid-19 testing requirements, and were threatened with disciplinary action.
Factors used to locate students included "evidence that they had accessed the private university's electronic services or secure networks in the Providence area; indications that they had directly accessed buildings on our campus; and/or reports from other community members," a Brown spokesperson, Brian E. Clark, wrote in an email to The Chronicle. When further details emerged the next day showing the students were in fact not in the area, the university withdrew the charges and apologized to them.
The pandemic isn't the first crisis to spark a flurry of security technology. After a series of school shootings, there was "a rush and urgency to use new technology to prevent mass violence," said Elizabeth Laird, director of equity in civic technology at the Center for Democracy & Technology. She sees a similar response to the Covid pandemic, with tools that once would have been viewed as too intrusive now tolerated, if not welcomed. But what happens, she asks, when the urgent need for them is over?
"In times of crisis, you are most likely to sacrifice your civil rights," she said. "But the problem is that once you've sacrificed them, it's difficult to get them back."