Human rights concerns stem from border closures, contact tracing, says U of T law professor
As nations look to contain the spread of COVID-19, measures such as technological contact tracing and constraints on immigration have been presented as ways to fight the virus.
For example, a tool proposed by Apple and Google would allow someone diagnosed with COVID-19 to report the diagnosis through a public health app; phones that had recently been near the infected person’s phone, as measured by Bluetooth proximity, would then be alerted. On April 22, the White House proclaimed that “the entry into the United States of aliens as immigrants is hereby suspended,” citing the pandemic.
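At a conceptual level, the Apple and Google proposal relies on phones broadcasting rotating random identifiers over Bluetooth and checking for matches on the device itself, rather than uploading anyone’s movements to a central server. The sketch below is a simplified illustration of that idea only; the class and method names are hypothetical and it is not the actual exposure-notification API.

```python
# Simplified illustration of decentralized exposure notification
# (hypothetical names; not the actual Apple/Google API).
import secrets


class Phone:
    def __init__(self):
        self.own_ids = []       # identifiers this phone has broadcast
        self.heard_ids = set()  # identifiers overheard from nearby phones

    def broadcast(self) -> str:
        """Generate and remember a new random identifier to beacon over Bluetooth."""
        rolling_id = secrets.token_hex(16)
        self.own_ids.append(rolling_id)
        return rolling_id

    def hear(self, rolling_id: str) -> None:
        """Record an identifier received from a phone in Bluetooth range."""
        self.heard_ids.add(rolling_id)

    def check_exposure(self, published_ids: list[str]) -> bool:
        """Compare published identifiers of diagnosed users against the local log."""
        return any(i in self.heard_ids for i in published_ids)


# Two phones come into range: each hears the other's beacon.
alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())
alice.hear(bob.broadcast())

# Alice is later diagnosed and consents to publishing her identifiers.
published = alice.own_ids

# Bob's phone finds a match locally; no central server learns who met whom.
print(bob.check_exposure(published))  # True
```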
Lawyer Petra Molnar, acting director of the International Human Rights Program at the University of Toronto, says her experiences in immigration law raise questions about the use of technology to monitor people’s movements.
“As countries and states think about ways to help manage the pandemic, data has become more valuable to people. But data is not neutral,” said Molnar in a livestreamed YouTube lecture on The Ethics of COVID Surveillance. Later, she added: “Unless all of us are healthy, including marginalized communities, no one is.”
Even in a simple, tech-free system — where decisions are made by judges or adjudicators — outcomes can be unpredictable, said Molnar. “As someone who practised refugee law and represented [people] in court, it was hard to get decision makers to understand why people crossed borders. Cases turn on really minute details that I would never remember if I were put to task,” she said. “The client couldn’t remember if the car [they were] getting shot at from was white or grey. . . . It makes me think about how opaque this kind of decision-making is.”
Molnar noted that surveillance technology is now generally being sold as a solution to complex problems, including in the legal space. For instance, she said, algorithms have made their way into immigration detention at the U.S.-Mexico border in an effort to justify Donald Trump’s “hardline policies.” She also gave examples of lie detectors used by immigration officers, as well as tools that claim to detect whether someone is LGBTQI2S using facial recognition.
“It is a very discretionary phenomenon . . . . whether they should be reunited with their spouse, whether they can adopt a child. Now we are seeing new technology make these decisions, even rely on biased data,” she said.
For example, she said, lie detectors may not account for different cultural contexts, such as someone who is uncomfortable making eye contact with a person of the opposite gender. These problems could extend to technology proposed to track the spread of COVID-19. Recipients of social benefits, such as refugees, are already enrolled in biometric scanning programs, raising questions about informed consent when it comes to data collection, she said.
“With this increasing push to make communities more knowable and trackable, we will be leaning on these problematic assumptions . . . . [If someone is] telling the truth about their symptoms, how will this technology discern that? We don’t have information about how that’s being done,” she said.
“It’s been quite remarkable how many private sector solutions have been brought up to address this pandemic.”
Molnar said that from a human rights perspective, much of this amounts to “responsibility laundering,” where neither the public body nor the private entity takes responsibility for the decisions made by technology. Responsibility can be deflected under the shield of intellectual property laws, or simply by each party claiming it doesn’t understand how the other designed or used the technology.
Another issue, she said, is that adopting another “sexy app” doesn’t address underlying issues that exacerbate pandemics, such as unequal resource distribution.
“I’m not saying technology can’t be used for good. There is a lot of interesting work being done,” she said. “There is very little governance over these tools . . . . migrants become a testing ground [for tools] that will be rolled out in other ways.”